Influence of peripheral and stereoscopic vision on driving performance in a power wheelchair simulator
Abdulaziz Alshaer


Influence of peripheral and stereoscopic vision on driving performance in a power wheelchair simulator

Abdulaziz Alshaer

A thesis submitted for the degree of
Master of Applied Science in Software and Knowledge Engineering

University of Otago, Dunedin, New Zealand
December 2012

Abstract

Exercising in a virtual environment as an adjunct training method can enhance safety, cost-effectiveness, and training quality, and can serve as an assessment tool. In the realm of power wheelchair (PWC) simulation, a number of systems have been developed over the past few years. Unfortunately, these simulators 1) are rather simple, in particular lacking correct physics simulation, 2) do not support peripheral vision, 3) are not suitable as a standard assessment tool, and 4) are not available commercially, except for one system. This study investigates factors influencing users' driving performance in a PWC simulator. It addresses the central research question: Do peripheral vision and/or stereoscopic viewing have an influence on PWC driving performance? This research directly investigates these issues by treating standard display viewing, peripheral vision, and stereoscopic 3D viewing as independent variables to see which ones influence driving performance in a PWC simulator, which was developed for this purpose. This study also compares the users' sense of presence as a side effect and considers their preference for each of the three conditions. In a randomized within-subject design, 24 participants performed the same navigation tasks across three conditions, namely monoscopic narrow field of view (narrow-FOV), monoscopic wide field of view (wide-FOV), and stereoscopic narrow field of view (stereo-FOV). The number of path collisions, the number of wall collisions, and the time spent were recorded to measure individual performance, whereas the sense of presence was measured using a standard questionnaire. In terms of driving performance, the results indicate that although the number of path and wall collisions was fewest in the wide-FOV condition, no significant differences were found between the means. However, time spent and overall driving performance in the wide-FOV condition were significantly better compared with the other conditions.
The assessment of the users' sense of presence shows that the wide-FOV and stereo-FOV conditions were rated significantly higher by the participants. Further, the wide-FOV condition was preferred by 83% of the subjects when they had to choose between the conditions. The wide-FOV condition thus seems to be a promising interface for complementing the training or assessment of PWC users.

بسم الله الرحمن الرحيم

"الحمد لله الذي له ما في السماوات وما في الأرض وله الحمد في الآخرة وهو الحكيم الخبير" (سورة سبأ: 1)

In the name of Allah, the Beneficent, the Merciful

"All the praises and thanks be to Allah, to Whom belongs all that is in the heavens and all that is in the earth. His is all the praises and thanks in the Hereafter, and He is the All-Wise, the All-Aware." (Quran 34:1, Mohsin Khan translation)

Acknowledgment

First and foremost, I thank Allah (God) for giving me the strength and ability to complete this study, and for giving me the power to believe in my passion and pursue my dreams. I could never have done this without the faith I have in you. I am sincerely grateful to Assoc. Prof. Holger Regenbrecht for his support, wisdom, guidance, and advice throughout this study. Without his assistance it would have been impossible to write this thesis. Thanks also go to Simon Hoermann for his valuable assistance with different parts of the thesis. I would like to thank all of the people who participated in the heuristic evaluation and appreciate their time, and my family at home for their encouragement and motivation during the journey of completing my study. Special thanks go to my mother for her support and love during my study. Last, but not least, I would like to thank my wife Zainab for her love, understanding, and support while she too was busy with her studies.

Contents

Abstract
Acknowledgment
List of Tables
List of Figures

1 Introduction
   1.1 Research Questions
   1.2 Motivation
   1.3 Importance of the Study
   1.4 Abbreviation and terms used
   1.5 Thesis Outline

2 Literature Review
   2.1 Virtual Environment
      2.1.1 Presence and Immersion in VE
      2.1.2 Display Types in VE
      2.1.3 Interaction Techniques in VE
   Power Wheelchair
      Indoor vs. Outdoors
      Method of Propulsion
   History and Development of PWC Simulators
      Standard Interface
      Motion Platform
      Commercialization
   Summary of the Related Literature
      Summary and Discussion
   Research Question and Hypotheses
      Principal Research Question
      Hypotheses

3 Heuristic Evaluation
   Introduction
   Evaluation Goals
   Assumptions
   Overview of WheelSim
   Methodology
   Findings
   Discussion
   Limitations
   Conclusion

4 Conceptual Design and Implementation
   Overview
   System Requirements
   First Phase: Building the Framework
   Second Phase: Development and Implementation
      Tools
      Hardware Components
      Software and Simulator Development

5 Methodology and Design
   Research Variables and Design
      Independent and Dependent Variables
      Potentially Confounding Variables
   The Experiment
      Study Sample
      Task Environment
      Experiment Design
      Study Instruments
      Procedure and Data Collection
   Limitations
   Assumptions

6 Results and Data Analysis
   Overview
   Participants
   Results
      Users' Driving Performance
      Users' Sense of Presence
      Comparison
      Simulator Sickness
   Discussion

7 Conclusion and Future Work
   Conclusion
   Future Work

References
Appendix A: Heuristic Evaluation Documents
Appendix B: Experiment Documents
Appendix C: Experiment Data

List of Tables

Table 2.1: Summary of the related works
Table 3.1: VE heuristics
Table 3.2: Usability severity codes
Table 3.3: Heuristic evaluation results
Table 4.1: Hardware specifications
Table 4.2: Navigation methods
Table 5.1: IPQ questions and factors
Table 5.2: The calculations in the SSQ
Table 6.1: Participants' information collected via the demographic questionnaire
Table 6.2: Tests of normality for path collisions data
Table 6.3: Descriptive statistics of path collisions for the three conditions
Table 6.4: Tests of normality for wall collisions data
Table 6.5: Descriptive statistics of wall collisions for the three conditions
Table 6.6: Tests of normality for wall collisions data
Table 6.7: Time spent Paired-Samples Test
Table 6.8: Tests of normality for wall collisions data
Table 6.9: Users' driving performance Paired-Samples Test
Table 6.10: Tests of normality for sense of presence factors
Table 6.11: Users' involvement Paired-Samples Test
Table 6.12: Users' spatial presence Paired-Samples Test
Table 6.13: Realism Paired-Samples Test
Table 6.14: General presence Paired-Samples Test
Table 6.15: Sense of presence factors, summary of the findings
Table 6.16: Test of normality for users' sense of presence
Table 6.17: Overall sense of presence Paired-Samples Test
Table 6.18: Simulator sickness scores
Table 6.19: Summary of the study results
Table 6.20: Summary of the assessment of research hypotheses

List of Figures

Figure 2.1: Display types in VE, pictures were taken from [6]
Figure 2.2: Full immersion category. On the left, CAVE environment (6 sides), HMD in the middle, and boom display on the right, operated by hand on stand. FOV is available in all directions [26]
Figure 2.3: Partial immersion category. On the left, desktop monitor, panoramic displays in the middle, and CAVE environment (3-5 sides) on the right. FOV omitted due to the screen size [26]
Figure 2.4: Human FOV, designed by the researcher
Figure 2.5: DFOV vs. GFOV, created by the author using Google SketchUp; the head was downloaded from the Google free 3D warehouse
Figure 2.6: Power wheelchair controller, available on the market [29]
Figure 2.7: PWC propulsion systems
Figure 2.8: Cooper et al. [23] simulator. Left picture shows boundary collision detection with large dot. Right picture shows the PWC and its trajectories
Figure 2.9: VEMS simulator [36], indoor environment (hallway left and kitchen right)
Figure 2.10: miwe simulator, subjects drive in a first-person view [5]
Figure 2.11: In the right picture, the trigger spot turns green when approached by the driver [11]
Figure 2.12: A screenshot of the CAVE environment video
Figure 2.13: Four examples of user trajectories where C shows the ideal path [39]
Figure 2.14: VAHM simulation platform [40]
Figure 2.15: First version of their driving simulator [7]
Figure 2.16: Head posture evaluation while driving a PWC [7]
Figure 2.17: Second version of [7] PWC simulator (hemispherical display system)
Figure 3.1: The curve shows the proportion of the issues found by evaluators [47]; almost 75% of the issues were found with 5 evaluators
Figure 3.2: WheelSim simulator, screenshot
Figure 3.3: A screenshot of WheelSim's four levels
Figure 3.4: A screenshot of WheelSim's statistical report
Figure 3.5: One of the evaluators analysing WheelSim
Figure 4.1: First attempt with three desktop monitors
Figure 4.2: Laminated plastic screens
Figure 4.3: Framework design
Figure 4.4: Dell projector 2300MP
Figure 4.5: Matrox adapter
Figure 4.6: Alienware laptop
Figure 4.7: Epson projector EH-TW
Figure 4.8: Active shutter 3D glasses
Figure 4.9: Joystick used
Figure 4.10: PWCsim components architecture
Figure 4.11: Left picture is the first version of the PWC simulator. Right picture is the environment architecture after adjusting the house dimensions
Figure 4.12: Path planning
Figure 4.13: The main user interface of Unity
Figure 4.14: Left picture represents the first scene (main screen). Right picture represents the second scene (testing screen)
Figure 4.15: Left picture shows the Narrow-FOV and Stereo-FOV conditions with 90° GFOV. Right picture shows the Wide-FOV condition (three cameras stitched together, 225°, 75° each)
Figure 4.16: Matrox PowerDesk bezel management
Figure 5.1: Order of display type conditions
Figure 5.2: directions
Figure 5.3: Experiment environment
Figure 5.4: A screenshot of three files generated by PWCsim. Top file shows an individual's trajectory. Mid file contains all data related to path collisions (participant number, time of occurrence, and X and Z coordinates of the collision). Bottom file contains all subjects' performance data for a specific version (in this case Stereo-FOV)
Figure 5.5: Wide-FOV menu screen
Figure 5.6: Participant doing the Wide-FOV condition
Figure 6.1: A boxplot of participants' level of joystick experience
Figure 6.2: Boxplot of number of path collisions for the three conditions
Figure 6.3: Left graphs represent collisions made by users in each condition. Right graphs represent three users' trajectories, randomly chosen from each condition
Figure 6.4: Boxplot of number of wall collisions for the three conditions
Figure 6.5: Boxplot of time spent for the three conditions
Figure 6.6: Boxplot of overall users' driving performance for the three conditions
Figure 6.7: Boxplot of the sense of presence factors for the three conditions
Figure 6.8: Boxplot of the overall sense of presence for the three conditions
Figure 6.9: Subjects' responses to the first four questions of the comparative questionnaire
Figure 6.10: Subjects' responses to questions five (comfort) and six (involvement) of the comparative questionnaire
Figure 6.11: Subjects' responses to question 7 (What system do you prefer?)
Figure 6.12: Number of participants who experienced simulator sickness in each condition

Chapter 1

1 Introduction

Since the passing of the Americans with Disabilities Act of 1990, which required all public structures to be accessible to handicapped persons [1], and which was followed by the enactment of similar laws in most countries, engineers and researchers have been motivated to develop power wheelchair (PWC) simulators [2]. The law change also increased the demand for PWCs, with an estimated 23,000 PWCs purchased every day [3]. Despite the increased numbers, Swan et al. suggest that "the evaluation of user proficiency and the suitability of a given wheelchair is largely guesswork, and user training is limited to practice with a possibly unsuitable wheelchair" [3, pp.156]. It is accepted that there are always risks associated with driving a PWC, including falls, getting stuck, and collisions (with obstacles or bystanders). This has made safety the main issue when considering the use of PWCs [3, 4, 5]. Thus, training and assessment can be potentially unsafe and expensive, especially for those using a PWC for the first time [6]. Abellard et al. add that "PWCs are often a costly solution as their usual drawback is to be very specific to each handicap or each person" [4, pp.161]. Consequently, a PWC user needs to be assessed as to whether they are able to drive a PWC before buying one, or trained in using it after it has been purchased. Some work has been done to evaluate PWC driving capabilities, such as usability tests, observation, and questionnaires, but this is a long and costly procedure and alternative solutions are needed. Rehabilitation engineers and researchers have begun to see virtual environments (VE) as a potential tool for assessing and training disabled persons [7]. Harrison et al. [6] reported that a computer simulation, controlled by a joystick, could be a better training/assessment solution to avoid the danger of collisions in real situations. Niniss et al.
[7] add that VE can be considered a useful tool to evaluate performance criteria, and Abellard et al. [4, pp.162] explain that exercising in a VE can "reduce previous constraints, bring a solution to the safety problem, diversify experiments, evaluate driving capacities, and quantify needs in terms of functionalities."

Subsequently, several studies have been conducted. The two most cited reasons for developing such systems are 1) to train targeted users how to drive a PWC, and 2) to assess whether a person is suitable and eligible for a PWC. But most of the current simulators (if not all) are subject to the following limitations:

- They are rather simple, offering, for example, unrealistic simulation. This could affect the training and/or assessment purpose, because PWC users would apply what they learn in reality.
- They lack peripheral vision. A narrow field of view (FOV) limits the user's ability to navigate freely through the VE and reduces the user's spatial awareness. This issue has also been reported in previous studies.
- No standard assessment simulation tool exists yet. Currently, assessment of suitability for a PWC is complex, costly, and time consuming due to individual needs, because assessment is mainly based on observation and there is no protocol or standard evaluation.
- Only one commercial system, named WheelSim, is available to users. Based on our heuristic evaluation of WheelSim (Chapter 3), it cannot be said to be reliable because it provides incorrect physical simulation.

According to Ball et al. [8, pp.9], peripheral vision is one of the primary factors typically reported in human-interface research. They state that "in terms of visualization, the key benefits of exploiting peripheral vision are the greater amount of simultaneously visible information, broader contextual overview, and spatial orientation awareness." A study by Czerwinski et al. [9], which focused on the overall advantages of wider FOVs when navigating, found that wider fields of view allow better tracking of environmental and spatial information, and that "offloading the mental map development task to the perceptual system" is especially effective on very large displays [9, pp.200]. The sense of presence, or immersion, is no less important than peripheral vision. Schubert et al.
[10] note that when interacting with VEs, a sense of being present in the VE commonly develops. The authors distinguish between immersion and presence by explaining that "although immersion is objectively quantifiable, presence or, more precisely, the sense of presence is a subjective experience and only quantifiable by the user experiencing it" [10, pp.167]. Swan et al. [3] state that the depth quality of the system's images is enhanced by providing stereo visualization. Herrlich et al. [11] note that to immerse the user in the simulation,

building a realistic environment is required, meaning that in a PWC simulator the physical behaviour of the PWC in the VE should be as good as in the real world. While considering the current issues with existing simulators, the goal of this study is to investigate and compare factors that may influence user driving performance in PWC simulators, in particular FOVs and display types. A 3D power wheelchair simulation was developed for this purpose. This study compared user driving performance across three different conditions, namely 1) monoscopic narrow field of view (narrow-FOV), 2) monoscopic wide field of view (wide-FOV), and 3) stereoscopic narrow field of view (stereo-FOV). This study also compared users' sense of presence and considered their preference for each of these three conditions.

1.1 Research Questions

Do peripheral viewing and/or stereoscopic (3D) vision have an influence on PWC user driving performance? The answer to this question could be a basis for the development of a better clinical therapy application.

1.2 Motivation

Knowing that there is no standard assessment tool to evaluate whether a person is able to drive a PWC has motivated us to explore this field. The long-term goal is to build a standardized assessment tool that will eliminate the need to use real PWCs to test targeted users. A wheelchair simulation review by Grant et al. concludes that: "The field of driver training is one that is popular among researchers and if the goal is just to extend the user's capabilities in basic operations such as turning, stopping and obstacle avoidance then the demands on the technology are slight. Simulation not only offers the ability to train novice users in a safe environment but also gives those charged with equipping them an early insight into capabilities of the user" [12, pp.108]. Because of the timeframe of this research, building a standard assessment tool was not a suitable topic for this study.
We therefore stepped back to investigate the factors that influence users' driving performance in PWC simulators.

There are many factors that could potentially affect users' performance in a VE system, including, but not limited to, display type, FOV, screen size, level of immersion, and user age. We started with FOV (narrow versus wide) and display type (monoscopic versus stereoscopic), as these particular factors have been mentioned in previous studies as areas of future work.

1.3 Importance of the Study

Exercising in a VE has potential advantages, since it is safe, reduces cost, enables diversified experiments, and serves as an assessment tool for driving abilities [4]. But what are the factors that influence user driving performance? This opens the research question: Do peripheral vision and/or stereoscopic (3D) viewing have an influence on PWC user driving performance compared with a standard display (one screen)? Research on the impact of peripheral vision and physical navigation by Ball et al. [8] recommended that "understanding exactly what improves user performance can help researchers and designers focus their efforts on effective user interface design approaches, and lead towards improved theories for visualization and interaction with large displays." Therefore, this research investigates this issue by treating standard display viewing, peripheral vision, and stereoscopic 3D viewing as independent variables to see how each one influences PWC users' driving performance. This, in turn, will influence our decision on how to build a PWC simulator intended as a standardized assessment/training tool. The outcome of this research should not only benefit PWC simulator designers and developers but also contribute to other fields of study concerned with visualization and interaction, such as simulation systems, game design, navigation systems, and computer graphics.

1.4 Abbreviation and terms used

Virtual Environment (VE): "A synthetic, spatial (usually 3D) world seen from a first-person point of view. The view in a VE is under the real-time control of the user" [13, pp.7].

Simulation: "The process of designing a model of a real system and conducting experiments with this model for the purpose either of understanding the behaviour of the system or of evaluating various strategies" [14, pp.7].

Performance: In this study, user performance is measured in terms of accuracy (collision avoidance) and efficiency (completion time).

First-person perspective: The view simulated from the point of view of a character's eyes, in this case a PWC driver's eyes.

Third-person perspective: The view simulated from a point other than the first-person perspective; for example, the virtual camera is slightly above and behind the character's eyes.

Bird's-eye view: Visualization from above.

Display Field of View (DFOV): "The angle subtended from the eye to the left and right edges of the display screen" [9, pp.196].

Geometric Field of View (GFOV): "The horizontal angle subtended from the virtual camera to the left and right sides of the viewing frustum" [9, pp.196].

Cyber-sickness: A subset of the motion sickness experienced from travel through virtual environments [15].

1.5 Thesis Outline

This thesis consists of seven chapters, structured as follows:

Chapter 2 provides an overview of VE and then focuses on PWC simulators. The methodology, technology used, findings, limitations, and open problems of these studies provide a basis for the design of this research, as well as a better understanding of how these studies were carried out.

Chapter 3 discusses a heuristic evaluation of WheelSim, a commercial PWC simulator. The methodology of the evaluation, findings, and limitations are presented. The findings are later used to learn from its possibilities and limitations in order to develop a better simulator.

Chapter 4 discusses the conceptual design and implementation of the study system (PWCsim). Two phases are discussed: the framework and its components, and the development of the PWCsim system, including hardware and software.

Chapter 5 describes the approach used to answer the research question. In a within-subject experiment design, quantitative data were collected to evaluate user driving performance, sense of presence, and preference among three different conditions. This chapter includes the research variables, the experiment, limitations, and assumptions.

Chapter 6 presents the experiment results, including users' driving performance, sense of presence, and preference. It also provides statistical data analysis and discusses the research hypotheses in regard to these results.

Chapter 7 summarizes the whole thesis project. It discusses the conclusions drawn, the contribution of the study, and potential future work.

Chapter 2

2 Literature Review

This research investigates the influence of different fields of view and stereoscopic vision on user driving performance and sense of presence in a power wheelchair (PWC) simulator. This review covers three areas: (a) a brief discussion of virtual environments, including presence and immersion in VE, display types in VE, and interaction techniques; (b) power wheelchairs, including environments of use and methods of propulsion; and finally (c) power wheelchair simulators. This review is limited to power/electric wheelchair simulators; smart and manual wheelchair simulators will not be discussed.

2.1 Virtual Environment

VE has many definitions depending on the complexity of the technologies used. One basic definition by McLellan [16, pp.457] describes VE as a new multisensory communication technology that enables the user to interact with data intuitively and provides new ways of involving the human senses. According to Loeffler et al. [17], four main factors make a virtual environment system: it is generated by computer, is three-dimensional, is rendered in real time, and is a simulated environment. Regenbrecht et al. [18] explain that the possibility to interact with the virtual world distinguishes VEs from a range of other media. However, Loeffler et al. [19, pp.27] explain that "trying to trace the origins of the idea of virtual reality is like trying to trace the source of a river. It is produced by the accumulated flow of many streams of ideas, fed by many springs of inspiration." One field where the potential of VE has been explored is science fiction [16]. Other technologies, such as computer graphics and computer interfaces, have also led to the development of VE over time. A variety of technologies is used in VE systems, including, but not limited to, stereoscopic viewing, projection-based simulation, head-mounted displays, and panoramic displays.

2.1.1 Presence and Immersion in VE

When interacting with VEs, a sense of being present in the VE commonly develops [10]. Treating presence as a psychological phenomenon, Slater et al. [20] state that "presence is a state of consciousness, the (psychological) sense of being in the virtual environment, and corresponding modes of behaviour." The term immersion has often been used to refer to the key factor in presence [16, 21]. Furthermore, distinguishing between immersion and presence, Schubert et al. [10] explain: "Although immersion is objectively quantifiable, presence or, more precisely, the sense of presence is a subjective experience and only quantifiable by the user experiencing it." In other words, immersion describes the fidelity of the VE technologies, whereas presence is the user's experience of being part of the virtual world. Because VE systems are built to give a feeling of being present when using them, measurement tools for presence are needed.

Presence measurement

Different tools to measure presence have been introduced, for both objective and subjective assessment. Objective measures can be physiological, such as heart rate change, or behavioural [22]. Regenbrecht et al. [23] explained that one drawback of these forms of measures is their applicability to only specific kinds of conditions. In contrast, questionnaires are the most used type of subjective measure. In recent years, various scales have been developed and used. A survey on sense-of-presence measures, conducted by Schuemie et al. [22], concluded that the Igroup Presence Questionnaire (IPQ) is recommended for a reliable and valid presence assessment. The IPQ, developed by Schubert et al. [10], was the result of several studies involving approximately 500 participants [24].
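Presence questionnaires of this kind are usually scored by averaging each participant's item ratings within each subscale. A minimal sketch of that aggregation step in Python (the item-to-subscale grouping below is invented for illustration and is not the published IPQ scoring key):

```python
def subscale_means(responses, key):
    """Per-participant questionnaire scoring: average the item ratings
    belonging to each subscale."""
    return {scale: sum(responses[item] for item in items) / len(items)
            for scale, items in key.items()}

# Illustrative grouping only; NOT the published IPQ scoring key
key = {"involvement": ["q1", "q2"],
       "spatial_presence": ["q3", "q4"],
       "realism": ["q5"]}
ratings = {"q1": 4, "q2": 6, "q3": 5, "q4": 3, "q5": 2}
print(subscale_means(ratings, key))
# -> {'involvement': 5.0, 'spatial_presence': 4.0, 'realism': 2.0}
```

The per-condition subscale means produced this way are what the paired-samples tests reported later in this thesis would operate on.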
It consists of 13 questions that measure three independent variables, namely involvement, spatial presence, and realism.

2.1.2 Display Types in VE

A wide range of possible display technologies is used in VE. For example, there are head-mounted displays (HMD), arm-mounted displays, boom-mounted displays, workbenches, fish-tank virtual reality, panoramic displays, and CAVEs

(different numbers of sides) [13]. Figure 2.1 shows a picture of each display type:

Head-mounted display (HMD)
Arm-mounted display (AMD)
Boom-mounted display (BMD)
Workbench
Panoramic display
CAVE environment

[Removed for copyright]

Figure 2.1: Display types in VE, pictures were taken from [6]

A VE system can be classified as non-immersive, immersive, or fully immersive. According to McLellan [16], there is more than one schema of VE classification. Ogle [25], however, illustrates that immersion can be thought of as a continuum, with non-immersive at one end and fully immersive at the other: the more advanced and complex the technology used, the higher the immersion perceived. Another classification, delineated by Kjeldskov [26], distinguishes full and partial immersion displays, largely in terms of the display field of view (FOV). The author suggests that a full immersion display provides the user with display FOV in all directions at all times (Figure 2.2), while a partial immersion display does not provide display FOV in all directions (Figure 2.3) [26]. Each of these displays has different characteristics,

as well as potentials and limitations. The details of these potentials and limitations are beyond the scope of this research; however, the visual display characteristics, in particular the field of view (FOV), will be discussed.

[Removed for copyright]

Figure 2.2: Full immersion category. On the left, a CAVE environment (6 sides); an HMD in the middle; and a boom display on the right, operated by hand on a stand. FOV is available in all directions [26].

[Removed for copyright]

Figure 2.3: Partial immersion category. On the left, a desktop monitor; panoramic displays in the middle; and a CAVE environment (3-5 sides) on the right. FOV omitted due to the screen size [26].

Field of View

In general, the field of view (FOV) is the extent of what the human eyes can see. When considering the display FOV there are slightly different meanings, as there are two FOVs [9]. Before discussing these two kinds, however, understanding the FOV in real life is essential. According to [27, 28], the maximum human FOV is about 200° horizontally and 100° vertically; these figures vary from one person to another [27]. In addition, the FOV is basically divided into three visions. The first is central (foveal) vision, where a person is looking straight ahead; it provides the most detail of the objects being viewed, as in reading and driving (about 5°). The second is binocular vision, which is wider than central vision and results from the overlapping images of the two eyes; it provides depth perception (about 140°). The third is peripheral vision, surrounding binocular vision, which does not give an impression of depth [27, 28]. All of these visions play a critical role in navigating one's surroundings. While binocular vision provides very accurate depth perception, central vision allows people to concentrate more and interpret the

shape, color, and size of the objects being viewed. Peripheral vision, however, is very important for noticing movements happening outside binocular vision. Figure 2.4 illustrates the human FOV concept.

Figure 2.4: Human FOV, designed by the researcher

The display FOV, on the other hand, involves two different FOV angles that must be taken into account [9]. The first is the angle subtended from the location of the observer's eyes to the screen edges (left and right), called the display field of view (DFOV). This angle depends on the user's distance: the closer to the screen, the larger the DFOV. The second is the virtual camera angle, the angle subtended from the virtual camera to the sides of the image rendered on the screen, known as the geometric field of view (GFOV). In game engines (e.g. Unity 3D), the standard virtual camera FOV is 60°, which also depends on the screen size and can be adjusted by developers [9]. The example shown in Figure 2.5 is a possible DFOV for a 16-inch-wide screen placed 24 inches from the user, giving a DFOV of approximately 37° [9]. Commonly, the term FOV in the literature refers to the GFOV, as it does in this research.
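The 37° figure follows from basic trigonometry: the DFOV is twice the arctangent of half the screen width divided by the viewing distance. A minimal sketch in Python (the function name dfov_degrees is my own, not from [9]):

```python
import math

def dfov_degrees(screen_width: float, viewing_distance: float) -> float:
    """Display field of view: the angle subtended at the observer's eye
    by a screen of the given width, viewed centrally from the given
    distance (both in the same units, e.g. inches)."""
    return math.degrees(2 * math.atan((screen_width / 2) / viewing_distance))

# The example from the text: a 16-inch-wide screen placed 24 inches away
print(round(dfov_degrees(16, 24)))  # -> 37
```

Halving the viewing distance to 12 inches roughly doubles the half-width ratio and raises the DFOV to about 67°, which is why seating distance mattered when this thesis's wide-FOV display was set up.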

Figure 2.5: DFOV vs. GFOV, created by the author using Google SketchUp; the head was downloaded from the Google free 3D warehouse

2.1.3 Interaction Techniques in VE

According to Bowman et al. [13], VE interaction techniques are methods for executing a single task. Common tasks that take advantage of these techniques are navigation, selection and manipulation, and system control. This section briefly looks at each task, including possible interaction techniques used to support it.

Navigation

In a VE system, navigation is perhaps the foremost of these tasks [13]. Navigation has two main components: travel and wayfinding. Each will be briefly discussed.

Travel is the most common task in VE [13]; therefore, choosing the right technique is very important. Travel techniques must be intuitive, "capable of becoming second nature to users" [13, pp.148]. There are three subtasks of travel, namely exploration (the path is based on serendipity; very common in 3D gaming), search (the user knows the target location, but no path is provided), and manoeuvring (used for short and precise movements) [13]. With these types in mind, there is a choice of different interaction techniques. Bowman et al. [13] have classified interaction techniques into five categories as follows:

- Physical locomotion (intended for immersive VE, e.g. walking through the VE)
- Steering (easy to understand; the user has full control of direction)
- Route-planning (a path is given, or points along a path)
- Target-based (the destination is specified, e.g. map-based targeting/zooming)
- Manual manipulation (hand-based, e.g. manipulating a virtual object with the hand)

Wayfinding in VE is the process of finding a path between two locations, which users may need when navigating complex VE tasks. Although wayfinding in VE can be supported by user-centred aids (human senses) and environment-centred aids (e.g. a large FOV), it is harder than in real life due to the differences between real and virtual environments [13]. These differences, such as physical constraints, can easily disorient the user and may often lead to simulator sickness [13].

Selection and manipulation

Object positioning in VE has two associated phases. The first is selection, which refers to the task of picking a specific object or objects. The second is manipulating the selected object(s) (orientation, scale, color, etc.) [13]. Each phase involves different interaction techniques depending on the task scenario; for example, techniques used to specify objects include naming, touching, and ray casting, while manipulation techniques include virtual hand(s), flashlight, and image plane. Just as in real life there is no tool that can do every task, there is no interaction technique that can effectively perform all manipulation scenarios. Therefore, combining different techniques is suggested [13].

System control

System control, as defined in [13], is the user task in which a command is issued to: 1) request the system to perform a particular function, 2) change the mode of interaction, or 3) change the system state. This can be achieved through UI elements such as toolboxes, floating menus, checkboxes, etc.
An example of system control in VE is the ability of the user to swap between different manipulation techniques through a three-dimensional graphical menu or voice commands.
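Of the travel techniques listed above, steering is the one most relevant to a PWC simulator. A minimal joystick-steering sketch (all names and parameters are illustrative, not taken from any cited system): the joystick's y-axis sets the forward speed and its x-axis sets the turn rate, as on a differential-drive wheelchair.

```python
import math

def steer(x, y, heading, pos, max_speed=1.5, max_turn=math.radians(60), dt=0.02):
    """One simulation step of a joystick steering technique.
    x, y are joystick axes in [-1, 1]; heading is in radians;
    pos is an (x, y) position in metres."""
    heading += x * max_turn * dt          # turn rate from the x-axis
    speed = y * max_speed                 # forward speed from the y-axis
    px, py = pos
    return heading, (px + speed * math.cos(heading) * dt,
                     py + speed * math.sin(heading) * dt)

# One second of full-forward input at 50 Hz moves the chair
# max_speed metres straight ahead.
h, p = 0.0, (0.0, 0.0)
for _ in range(50):
    h, p = steer(0.0, 1.0, h, p)
print(p)  # approximately (1.5, 0.0)
```

The same update loop, driven by real joystick input instead of constants, is the core of the steering technique used by most of the PWC simulators reviewed below.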

2.2 Power Wheelchair

Power wheelchairs (PWC), also called electric wheelchairs (EWC) or motorized wheelchairs, are powered by an electric motor [29]. These chairs allow for direction and speed control, usually through a joystick (Figure 2.6). They are designed for those who are unable to maneuver a manual wheelchair due to arm weakness or other disabling conditions.

Removed for copyright
Figure 2.6: Power wheelchair controller, available on the market [29]

Indoor vs. Outdoors

The environment plays an important role in choosing which PWC best suits individual needs: indoor, outdoor, or both. For example, an indoor PWC is smaller than an outdoor one, which allows better maneuverability in tight places but less stability outdoors because of its size [29]. Outdoor PWCs, on the other hand, have large tires to provide comfort and stability, and they have a longer battery life. Other PWCs are designed for special needs, for instance sport activities, the beach, or stair climbing. Consequently, an individual should be able to find a suitable PWC.

Method of Propulsion

PWCs are categorized according to their propulsion systems, which give the PWC different characteristics. Despite their variety of sizes, shapes, and needs, the three main basic approaches are as follows [30, 31]:

Rear-wheel drive

This is the most common type of PWC: the propulsion wheels are located at the back of the PWC and the front wheels are casters (Figure 2.7). This method provides a feeling similar to a manual wheelchair (being pushed) and should be easy to drive for users who have experience with a manual chair. This kind of PWC is best used in an outdoor environment. It also provides a higher speed than the other systems while maintaining balance and stability. Its disadvantage is poor maneuverability [30].

Front-wheel drive

The propulsion wheels are at the front of the chair and the casters are located at the back (Figure 2.7). It is the slowest of the systems, but better for maneuverability, which makes it ideal for an indoor environment because it offers better turning in tight areas. Unlike the rear-wheel system, it lacks stability, in particular when braking and driving down slopes [30].

Mid-wheel drive

The centre-wheel drive system is a combination of the rear and front systems. The powered wheels are located directly in the centre, while the back and front wheels are casters (Figure 2.7). As a result, this method combines the maneuverability of the front-wheel system with the stability of the rear-wheel system, making it suitable for both indoor and outdoor environments [30].

Removed for copyright
Figure 2.7: PWC propulsion systems (rear-wheel, front-wheel, and mid-wheel drive) 1

1 Available PWCs in the market: The Scooter Store. [Online]. Available: [Accessed: 17-Oct-2012].

2.3 History and Development of PWC Simulators

This section will focus on the related literature on power wheelchair simulators. The methodology, technology used, findings, limitations, and existing problems of these studies will help provide a basis for the design of this research, as well as a better understanding of how these studies have been conducted. The control system of a PWC simulation often falls into two categories -- standard interface and motion platform [12]. Thus, the reviewed simulations will be classified into these categories.

Standard Interface

Standard interface simulations often consist of a standard PC, a desktop monitor, and a joystick. This kind of simulation is easy to implement, but other technologies might add more complexity, such as projection-based simulation or different control mechanisms. This section will discuss these works, starting with monitor-based simulators, followed by projection-based simulators.

Monitor-based

According to Abellard et al. [4], Pronk and his colleagues [32] were the first to build a PWC simulator, in 1980. Their goal was to improve the procedure for patients to adapt to a PWC with the help of computer simulation. In a pilot study, 16 subjects were recruited, including four clinical subjects. The movement of the PWC was represented in a bird's-eye view. They concluded that such a system could be a useful tool for adaptation to and/or evaluation of PWC use. Since their development of this idea, others have exploited it [4]. One of the earliest subsequent works using VE to assess the ability of PWC drivers was that of Cooper et al. [33] in the early 90s. The aim was to provide a risk-free environment that would allow users to drive efficiently in order to evaluate their PWC driving ability. The system consisted of a desktop display and a standard PC that generated a non-immersive VE, similar to [32], with a two-dimensional bird's-eye perspective (Figure 2.8, following page).
The user's task was to drive along an ideal path. The system recorded the number of collisions with path boundaries, the errors between the virtual PWC trajectory and the desired path, and the time spent to accomplish the task. However, the authors did not

discuss the data analysis procedure, though they concluded that VE could be a useful tool in assessing and/or training PWC users.

Removed for copyright
Figure 2.8: Cooper et al. [33] simulator. The left picture shows boundary collision detection with a large dot; the right picture shows the PWC and its trajectories.

A few years later several studies concentrated on the domain of handicapped children [34, 35, 36, and 37]. Desbonnet et al. [34] developed a training PWC simulator; however, unlike [35, 36, and 37], the study lacked experimental support and evidence. The research conducted by Hasdai et al. [35] investigated the influence of a PWC driving simulator on disabled children's driving skills before and after training. The experiment population was divided into two groups -- children with and without experience. Twenty-two participants in total took part. Using a real PWC, both groups were assessed for their driving skills. The inexperienced group was assessed twice, before and after training on a VE simulator, while the experienced group was assessed only once. The assessment tool for this experiment was based on a questionnaire consisting of 12 questions with a rating scale from 1 to 4, and a driving score out of 1000, calculated as follows:

Score = 1000 - (time spent in seconds + number of collisions)

The results showed that the experienced group scored significantly better than the inexperienced group prior to training (p < .05). After training in the VE, the inexperienced group showed a significant increase (p < .005) compared with their performance before training, and the difference between the groups vanished [35]. The authors concluded that PWC simulators could aid in the evaluation and training of people with disabilities to operate a PWC in the absence of actual experience.
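One plausible reading of the Hasdai et al. [35] score -- treating 1000 as a maximum from which time and collisions are deducted, which is an assumption on our part since the formula is only partially reproduced here -- can be sketched as:

```python
def hasdai_score(time_seconds, collisions):
    """Sketch of the [35] driving score, ASSUMING time and collision
    counts are subtracted from a 1000-point maximum.
    The function name is ours, not from the cited study."""
    return max(0, 1000 - (time_seconds + collisions))

# A 2-minute run with 5 collisions:
print(hasdai_score(120, 5))  # 875
```

Under this reading, faster runs with fewer collisions score closer to the 1000-point ceiling.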

VEMS, a project conducted by Adelola et al. [36], aimed to provide easy solutions for children learning how to drive an electric wheelchair (Figure 2.9). Their simulation system was based on a joystick-controlled computer game, to motivate children. They used a rating scale to evaluate the safety and efficiency of their users, and a conjoint measurement method was used to analyze the data. It was found that the simulator was of limited value due to visualization limitations and unrealistic simulator behavior. In addition, all three studies [34, 35, and 36] implemented an auditory effect in their systems, such that a sound was generated by the computer when colliding with objects [25].

Removed for copyright
Figure 2.9: VEMS simulator [36], indoor environment (hallway left and kitchen right)

Inman et al. [37] extended the potential of PWC simulators by making them accessible through web browsers. The aim of their project was to study the effect of their PWC training program, Wheelchair Net, on children's skills before and after training in actual reality. The sample population was 13 clinical subjects (six males and seven females). The result showed that the children's skills improved significantly after the training.

In 2002, a project conducted by Harrison et al. [6] used two non-immersive VE systems to assess and train PWC drivers. They focused on user maneuverability and route-finding skills, using a simulator for each. Six inexperienced clinical subjects were involved. Their performance was measured in real life, pre and post training, as well as in the VE. Participants were asked to complete the following tasks in the maneuverability environment:
a) Driving the wheelchair forward in a straight line for 10 metres.
b) Reversing the wheelchair in a straight line for 2 metres.
c) Driving the wheelchair into an enclosed space.
d) Reversing the wheelchair out of the enclosed space.
e) Completing a 180° turn around a stationary object.

f) Completing a slalom around a series of stationary objects.
g) Stopping the wheelchair suddenly, on command.

While doing so, the system recorded four variables as performance criteria: total time spent, distance traveled in each task, number of collisions, and number of manoeuvres. A questionnaire regarding the VE was given after completing the session. In the route-finding session, participants were shown routes and then asked to retrace them in both real and virtual environments. However, none of the participants completed the whole experiment, which might have been due to the very long procedure; consequently, the authors discussed the results of individual cases. Although the results showed that tasks learned in the VR could be transferred to real life, maneuverability tasks were harder in the simulator than in real life, in particular driving in reverse [6]. They also indicated that most participants commented on the lack of peripheral vision and spatial location in the VE.

A recent project, named miwe [5], concluded that PWC users' driving performance in VE was equivalent to their driving performance in real life. The main purpose of this research was to compare the two environments (real and virtual) by having two groups (in a between-subject design, 16 and 13 participants respectively) do exactly the same tasks in the simulator and in real life. Seven tasks were modeled on the wheelchair skills test (WST):
a) Driving backward 5 metres in a straight line
b) Opening a door
c) Moving through the doorway and closing it, in both directions
d) Turning 180° within the limits of a 1.5 metre square (left and right)
e) Turning 90° forward (left and right)
f) Turning 90° backward (left and right)
g) Moving sideways from one wall to another in a 1.5 metre square

The first group (virtual group) was asked to navigate through the simulator in given directions.
Their sense of presence was then measured with a standard questionnaire, the Igroup Presence Questionnaire (IPQ) [18]. Likewise, the other group (real group) was asked to perform the same tasks in a real environment, with no questionnaires. Based on the joystick movements on either the real or the virtual wheelchair, the authors were able to record the X (left/right) and Y (forward/backward) inputs and the user trajectories. The sense-of-presence results showed high means for all three categories (spatial presence, involvement, and experienced realism). In addition, the video analysis showed that participants had some difficulty with sideways maneuvering due to the lack of lateral vision. Archambault et al. [5] explained that the difficulty arose because the natural human field of view is more than 180°, whereas the simulator's is only 100° ([27] and [28] claimed the human FOV is 200°). However, as can be seen in Figure 2.10, [5] seem to have adjusted the lens focal length in the game engine, which leads to a so-called fish-eye effect (image distortion), and the focus area becomes smaller. This might also have affected the results of their sense-of-presence questionnaire.

Removed for copyright
Figure 2.10: miwe simulator, subjects drive in a first-person view [5].

In 2010, Herrlich et al. [11] demonstrated using a computer game engine to build a more realistic simulator for driving a PWC in VE (Figure 2.11). Basically, they measured all the driving characteristics of a real PWC and converted them to Unreal Engine 3 units. Although the result showed that the integrated physics simulation in game engines can be used to raise the realism of virtual PWCs, the chosen virtual model (rear-traction) did not meet all the requirements of the real model [11]. The main shortcoming of this study was that the simulator was controlled by mouse, keyboard, and gamepad, none of which are used in a real PWC. Building upon [11], Browning et al. [15] indicated that simulation realism is instantly perceived by users and has an immediate impact on their performance.

Removed for copyright

Figure 2.11: In the right picture, the trigger spot turns green when approached by the driver [11]

Projection-based simulator

Immersive VEs often support a large FOV with a binocular display [38]. Technologies such as head- and boom-mounted displays already have these features, but they isolate the user from the real world [38]. This limitation led to the development of the CAVE (CAVE Automatic Virtual Environment) at the University of Illinois, in 1991 [15]. Browning et al. [38] were the first to use the CAVE environment for rehabilitation purposes, in 1996. Their CAVE consisted of three walls (3 × 3 m) surrounding the user to display the projected images coming from three rear projections; a fourth projection was placed on top to produce the floor image. This allowed the user to freely interact with the VE while being able to see their own body (Figure 2.12). The rear projectors produced stereo images, and the user wore stereo shutter glasses.

Figure 2.12: A screenshot of the CAVE environment video 2

In this study, the CAVE was explored by user groups using both manual and powered wheelchairs. The authors suggested that this kind of environment was the most appropriate for PWC simulation. They also argued that the CAVE had the potential for sharing the environment with more than one person; this particular feature could be effective for evaluation and/or training purposes at rehabilitation clinics, because a therapist could observe their patient. Browning et al. [15] add that a projection-based system seems to reduce the simulation sickness associated with HMDs, mainly because the user is not isolated from the real world.

2 Video can be downloaded from

Recently, ISIDORE ("assistance Interface for Simulation, Decision-making and Rehabilitation"), a project still in progress conducted by Randria et al. [39], has aimed to help therapists decide whether a targeted user can use a PWC. The assessment is based on the PWC driver's trajectory in the simulator, which is then compared with a computer-generated ideal path. The outcome of the system is a graph of the user trajectory and the ideal path (Figure 2.13). It seems no participants have been involved in this project, or if so the authors have not discussed it.

Removed for copyright
Figure 2.13: Four examples of user trajectories (left turn, wall following, obstacle avoidance, and door passing), where C shows the ideal path [39]

Motion Platform

In the domain of mobile platforms, a few projects have been built to simulate a PWC. For example, a project named VAHM, developed by Niniss et al. [40], had three aims: helping with the conception of new mobility assistance functionalities, assisting in choosing a suitable PWC, and providing an easy learning environment. The authors stated that the simulator was a work in progress and provided only a prototype picture of their simulation platform (Figure 2.14). They concluded that a future improvement of the system would be recording data from the simulator, such as the number of collisions and the number and duration of stops.

Removed for copyright
Figure 2.14: VAHM simulation platform [40]

A similar project by [7] used a motion platform and aimed to evaluate the user's ability to drive a PWC. Their first version consisted of two wide screens (horizontal and ground view) and a mobile platform allowing six degrees of freedom (Figure 2.15). The system simulated roughness, collisions, vibrations, and motion. It provided an outside environment and an ideal path marked with yellow lines to be followed. Video recordings and observation were used as the evaluation methods in this study. Niniss et al. [7] asserted that even though the screens were wide (29 inches), the field of view (31 degrees) was considered insufficient by the participants.

Removed for copyright
Figure 2.15: First version of their driving simulator [7]

Therefore, due to the lack of peripheral vision, another study was carried out by the same authors to evaluate the minimum field of view essential for driving a PWC simulator [7]. They used a head tracker to measure head movement and video recording to measure eye movement (Figure 2.16). The analysis of the head movements showed that the eye gaze mostly remained within 45 degrees. Given this data about head posture, the authors were able to determine the eye gaze limits of PWC users. They estimated that the minimum horizontal field of view was around 70°, whereas the vertical was around 65°.

Removed for copyright
Figure 2.16: Head posture evaluation while driving a PWC [7]
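Inverting the DFOV relation gives the screen width needed to cover the estimated 70° horizontal minimum at a given viewing distance. A small sketch (the helper name and the 24-inch example distance are our own illustration, not from [7]):

```python
import math

def screen_width_for_fov(fov_deg, distance):
    """Screen width that subtends a given horizontal FOV at a given
    viewing distance (same unit for width and distance)."""
    return 2 * distance * math.tan(math.radians(fov_deg) / 2)

# Width needed 24 inches from the user to reach the 70-degree estimate:
print(round(screen_width_for_fov(70, 24), 1))  # 33.6
```

This is one way to see why a single 29-inch desktop monitor at a normal viewing distance falls short of the estimated minimum, motivating wider or curved display setups.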

As a result of the previous study, the system was enhanced by replacing the display with a hemispherical display system (Figure 2.17), which allowed for 110 degrees. Four front projectors produced four images that were then merged into a single image. Besides the video recording and observation, they recorded quantitative data such as the number of collisions, the trajectory of the virtual PWC, and the motion of the platform. The last experiment showed that the qualitative data (observations and video recording) and quantitative data (joystick inputs and user trajectory) have potential for identifying skilled and unskilled users [7].

Removed for copyright
Figure 2.17: Second version of the [7] PWC simulator (hemispherical display system)

Reviews of wheelchair simulation [2, 12] state that although adding a mobile platform to a simulator yields higher performance, it increases the simulator's complexity and cost. The previous study by [7] reported that several participants experienced motion sickness. Pithon et al. [2] also found that no studies had compared the impact of a mobile platform on trainees against simulation without one. Niniss et al. [7] asserted that using a platform caused cybersickness, and that more investigation was needed to prevent the discomfort effects of such technologies.

Commercialization

Despite a long history of research on PWC simulators, only one commercial system is available on the market, named WheelSim [4]. It is one of the products of LifeTool, a company focusing on assistive technology (Austrian Institute of Technology GmbH). WheelSim has been designed to provide an easier learning environment in which PWC users can learn how to drive a PWC, as well as to serve as a diagnostic tool. A heuristic evaluation was conducted to assess flaws in the system in order to avoid them in our system, and more information regarding

this system will be discussed in the following chapter (Chapter 3). However, Pithon et al. [2] state that limited commercial availability has resulted in limited clinical impact as well.

Summary of the Related Literature

The following table (Table 2.1) summarizes the particularities of the related papers discussed in this section. It includes information about the technology used, the evaluation methods, and the experiments and tasks, where these exist.

Table 2.1: Summary of the related works

[32] (1980) -- Goal: to improve the procedure for patients to adapt to a PWC through simulation. Display: desktop, bird's-eye view. Evaluation: not discussed. Participants: 12 walkers and 4 drivers; task not discussed.

[33] (2005) -- Goal: to assess the ability of PWC drivers. Display: desktop, bird's-eye view. Evaluation: quantitative (number of path collisions, errors between virtual and ideal path, time spent). Participants: not discussed; task: to drive along an ideal path.

[35] (1998) -- Goal: to train disabled children. Display: desktop; view not discussed. Evaluation: qualitative (questionnaire) and quantitative (time spent in seconds, number of collisions, score). Participants: between-subject design, 11 experienced and 11 inexperienced; task: driving skills.

VEMS [36] (2002) -- Goal: to allow children to learn how to drive a PWC. Display: desktop, first-person perspective. Evaluation: qualitative (rating scale). Participants and task: not discussed.

Wheelchair Net [37] (2011) -- Goal: to study the effect of a PWC simulator on children's skills. Display: desktop, accessible through the web, first-person perspective. Evaluation: not discussed. Participants: 13 drivers; task not discussed.

[6] (2002) -- Goal: to assess and train disabled people. Display: desktop, first-person perspective. Evaluation: quantitative (time spent, distance traveled, number of collisions, number of manoeuvres). Participants: 6 inexperienced clinical subjects; task: manoeuvrability and route-finding skills.

miwe [5] (2012) -- Goal: to compare user performance in real and virtual environments. Display: desktop, first-person perspective. Evaluation: qualitative (IPQ questionnaire) and quantitative (joystick movements X and Y, trajectory). Participants: two groups of 16 and 13; task: maneuverability (7 tasks).

[11] (2010) -- Goal: to evaluate the realism of driving a PWC in VE. Display: desktop, first-person perspective. Evaluation, participants, and task: not discussed.

CAVE [38] (1996) -- Goal: to connect the user to the real world. Display: rear-projection based; the VE surrounds the user. Evaluation, participants, and task: not discussed.

ISIDORE [39] (2012) -- Goal: to help therapists decide whether a disabled person can use a PWC, based on simulation. Display: front-projection based, first-person perspective. Evaluation: quantitative (user trajectory compared to an ideal path). Participants and task: not discussed.

VAHM [40] (2000, motion platform) -- Goal: to provide an easy learning environment. Display: front-projection based, first-person perspective. Evaluation: quantitative (number of collisions, duration and number of stops). Participants and task: not discussed.

[7] (2005, motion platform) -- Goal: to evaluate the user's ability to drive a PWC. Display: hemispherical; only the VE is shown. Evaluation: qualitative (video recording) and quantitative (number of collisions, user trajectory, platform motions). Participants and task: not discussed.

WheelSim [4] (2008) -- Goal: to provide an easier learning environment and serve as a diagnosis tool. Display: desktop, first-person perspective. Evaluation: quantitative (number of collisions, time spent, points). Participants and task: not discussed.

2.4 Summary and Discussion

The literature review has shown that different domains of research are being studied. The main fields are training simulators, as shown in [20, 32, 35, 36, and 40], and assessment simulators [4, 6, and 7]. In general, the studies that conducted experiments share common evaluation criteria logged through the simulation: the number of collisions either with objects or with path boundaries [4, 6, 7, 35, and 40], and time spent [4, 6, 33, 35, and 40]. Two studies recorded user trajectories [39, 7] and represented them graphically, and two studies calculated scores based on these data [35, 4]. However, the simulators also share common problems: some are rather simple, such as the bird's-eye-view systems [32, 33], while others have incorrect physics simulation or even unrealistic environments. These kinds of shortcomings may have a large impact on user performance, in particular for training and/or assessment. Moreover, although stereoscopic viewing is a fundamental attribute of any VE, in particular for navigation [3], only a few studies considered its use, such as the CAVE system [38]. In addition, [15, 41] indicate that a common problem with monitor-based simulation is the absence of peripheral vision. In fact: "The potentials for immersing the user in a virtual environment is often measured from the field of view" [26, p.78]. The main problem with desktop monitors is that even when the GFOV is large, the DFOV is much smaller, as it depends on the screen size and the user's distance from the screen. Harrison et al. [42] state that, when given the choice between a large display and an HMD, participants did not prefer the HMD; however, the authors did not explain how the user preference data was collected. Based on the literature, the lack of peripheral vision was often a factor in several studies [2, 6, and 12].
The review by Pithon et al. [2, pp.3] stated: "An interesting approach to justify the design of a visual interface is to study the wheelchair user FOV." Thus, this research will investigate FOVs and stereoscopic viewing in a PWC simulator from three perspectives -- user driving performance, sense of presence, and user preference.

2.5 Research Question and Hypotheses

Principal Research Question

The general research question that emerged from the review of the literature is whether different FOVs and stereoscopic viewing have an influence on PWC users, in terms of driving performance and sense of presence. To examine the effects, the following specific hypotheses were formed.

Hypotheses

The literature identified the main factors that influence user driving performance, including collisions with path boundaries, collisions with walls/objects, and time spent. To examine the influence of these factors, as well as the user's sense of presence, ten hypotheses were developed to answer the research question. In general, it is expected that a wide field of view and stereoscopic viewing will have positive effects on users' driving performance and sense of presence.

H1: The number of path collisions in a monoscopic wide FOV will be lower than the number of path collisions in a monoscopic narrow FOV.
H2: The number of path collisions in a stereoscopic narrow FOV will be lower than the number of path collisions in a monoscopic narrow FOV.
H3: The number of wall collisions in a monoscopic wide FOV will be lower than the number of wall collisions in a monoscopic narrow FOV.
H4: The number of wall collisions in a stereoscopic narrow FOV will be lower than the number of wall collisions in a monoscopic narrow FOV.
H5: The time spent to perform the task in a monoscopic wide FOV will be lower than the time spent to accomplish the task in a monoscopic narrow FOV.
H6: The time spent to perform the task in a stereoscopic narrow FOV will be lower than the time spent to accomplish the task in a monoscopic narrow FOV.
H7: The performance score in a monoscopic wide FOV will be higher than the performance score in a monoscopic narrow FOV.
H8: The performance score in a stereoscopic narrow FOV will be higher than the performance score in a monoscopic narrow FOV.
H9: The sense of presence in a monoscopic wide FOV will be higher than the sense of presence in a monoscopic narrow FOV.
H10: The sense of presence in a stereoscopic narrow FOV will be higher than the sense of presence in a monoscopic narrow FOV.

Chapter 3: Heuristic Evaluation

3.1 Introduction

The usability of any system plays an important part in its success. Poor design in a VE system, for example, will reduce the user's sense of presence. Thus, a usability evaluation is essential to ensure the quality of our product. The International Organization for Standardization (ISO) defines usability as "the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use" 3. Several methods for testing usability have been proposed or adapted for VE [43], including formative evaluation, interview/demonstration, cognitive walkthrough, heuristic evaluation, summative or comparative evaluation, and post-hoc questionnaire evaluation. Most of these methods were initially developed for standard graphical user interface evaluation. A few have been adapted for VE evaluation [44, 45, 46], but the problem remains that no reliable, specialized evaluation method for VE applications yet exists [44]. Heuristic evaluation, developed by Nielsen and Molich [47], has been widely used and/or adapted due to its ease, speed, and low cost. It provides a set of guideline heuristics that are used by expert evaluators to find flaws in a system. It is often applied at an early stage of the development process so that these flaws can be addressed and redesigned. However, Nielsen's heuristic evaluation [47] is intended for desktop applications, so an adapted version for VE applications is needed. This chapter documents a heuristic evaluation of WheelSim, a training simulator for power wheelchair users. The analysis starts with the evaluation goals, followed by assumptions, an overview of WheelSim, and the methodology used to evaluate the system. The findings are presented in the sixth section of this chapter, and finally there is a discussion of the evaluation and its limitations.
3 The International Organization for Standardization (ISO), Reference number ISO :1998(E)

3.2 Evaluation Goals

As mentioned in Chapter 2, WheelSim is the only commercial tool available on the market. Our goal was to use it as a benchmark and learn from its possibilities and limitations in order to develop a better simulator. Because heuristic evaluation is simple and capable of catching a high percentage of usability issues with a small number of evaluators -- four to five [47] (Figure 3.1) -- it was considered an appropriate evaluation method. The result was a list of issues to be considered while developing the study simulator (PWCsim), which can also be used for future development.

Removed for copyright
Figure 3.1: The curve shows the proportion of issues found by evaluators [47]: almost 75% of the issues were found with 5 evaluators

3.3 Assumptions

The following assumptions support this evaluation.
1. The adapted heuristics are appropriate for evaluating a VE system, in particular WheelSim.
2. The five evaluators found the majority of the usability flaws.
3. The evaluators had enough knowledge of usability, graphics, and user interfaces to conduct such an evaluation.
4. The problems found by the evaluators would also have been found by real PWC users.

3.4 Overview of WheelSim

WheelSim is a PWC simulator developed by the traffic department at the provincial police command of Upper Austria (Figure 3.2). It was designed 1) to help disabled persons learn to operate a PWC, 2) to improve road safety, and 3) to provide quantitative data that could be used for diagnostic and therapy purposes [48]. Using a standard joystick, the system can also be used as a dexterity game to motivate children.

Figure 3.2: WheelSim simulator, screenshot

WheelSim has four levels of increasing difficulty, from level A to D (Figure 3.3). Level A is used to gain experience in the VE. From level B to D the trainee must follow a given task: to drive between the yellow guidelines in a given direction. Touching the yellow lines indicates a minor mistake and one red dot is displayed on the screen. Driving beyond the yellow lines is a major mistake and two red dots are displayed. The total time, the time added for minor and major mistakes, and the number of minor and major mistakes are recorded for statistics and a progress report (Figure 3.4, next page). At the end, the user is given points calculated as follows: total time in seconds + total time for minor mistakes + twice the total time for major mistakes.

Figure 3.3: A screenshot of WheelSim's four levels
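The stated scoring rule can be sketched as follows. The function and parameter names are illustrative, not WheelSim's actual implementation; note that because penalties only add to the total, a higher score here reflects worse driving, although the simulator itself gives no indication of this (an issue raised in the findings of this chapter).

```python
def wheelsim_score(total_time_s: float,
                   minor_penalty_s: float,
                   major_penalty_s: float) -> float:
    """Points = total time + minor-mistake time + 2 x major-mistake time.
    A sketch of the stated rule, not WheelSim's actual code."""
    return total_time_s + minor_penalty_s + 2.0 * major_penalty_s

# e.g. a 120 s run with 10 s of minor and 5 s of major penalty time:
print(wheelsim_score(120, 10, 5))  # 140.0
```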

Figure 3.4: A screenshot of WheelSim's statistical report

3.5 Methodology

A new set of heuristics (VE-heuristics) was derived or adapted from Nielsen's heuristics [47], as extended by Sutcliffe et al. [45] and Connell [46]. Ten VE-heuristics were adapted to meet our dimensions of interest: navigation and orientation, field of view, sense of presence, and physics simulation. The ten VE-heuristics can be seen in Table 3.1 with their references. Five evaluators took part in this analysis (all were students at the University of Otago). Each one performed a given task individually and recorded any flaws in the system based on the VE-heuristics. (The tasks, along with the VE-heuristic descriptions, can be seen in Appendix A.) The evaluators were also asked to give each flaw a severity rating on a three-point scale (high, medium, and low), adapted from usability severity codes (Table 3.2). The result is a list of problems that match one or more of the VE-heuristics. For each finding, the following criteria were addressed: heuristic(s) violated, severity, usability issue, and recommendation. Figure 3.5 shows one of the evaluators analysing WheelSim.

Figure 3.5: One of the evaluators analysing WheelSim

Table 3.1: VE-heuristics

#   Heuristic                                         References
1   Navigation and orientation support                [44, 45]
2   Responsiveness                                    [46]
3   Realistic feedback                                [44, 45]
4   Flexibility                                       [46, 47]
5   Natural engagement                                [44, 45]
6   Compatibility with the user's task and domain     [45, 47]
7   Perceptual clarity                                [46]
8   Help users recognize and recover from collision   [47]
9   Sense of presence                                 [44, 45]
10  Field of view and viewpoints                      [44, 45]

Table 3.2: Usability severity codes 4

1 (High): A serious condition that impairs the operation, or continued operation, of one or more product functions and cannot be easily circumvented or avoided. The software does not prevent the user from making a serious mistake. The usability problem is frequent, persistent, and affects many users. There is a serious violation of standards.

2 (Medium): A non-critical, limited problem (no data lost or system failure). It does not hinder operation and can be temporarily circumvented or avoided. The problem causes users moderate confusion or irritation.

3 (Low): Non-critical problems or general questions about the product. There are minor inconsistencies that cause hesitation, or small aesthetic issues like labels and fields that are not aligned properly.

4 Severity ratings adapted from Usability severity codes by Usability & Technical Documentation, Xerox Corporation, July

3.6 Findings

The evaluation identified 25 flaws. Table 3.3 gives a description of, and a potential solution for, each issue, together with the violated heuristic number(s) (see Table 3.1) and the severity rating (high = 1, medium = 2, low = 3), where recorded.

Table 3.3: Heuristic evaluation results

1. Issue: Responsiveness of the turning movements, especially at high speeds, gives an impression of lack of control. One does not feel safe, because small changes in direction on the joystick translate to large changes in the environment.
   Recommendation: Sensitivity to changes in direction needs to be adjustable.

2. Issue: The 5 speed levels have no point of reference to real-life speed. Is 1 the speed at which people walk? How fast is 5?
   Recommendation: Give speeds in km/h corresponding to the speed levels, as well as information about which level corresponds to normal walking speed.

3. Issue: There is no way to see what is behind when reversing. This is a fatal flaw in this simulation.
   Recommendation: Add a rear camera/mirror system to ensure the user has an acceptable field of view when reversing, similar to a real motorized wheelchair.

4. Issue: There is no scenario in which the user can fall off their wheelchair. If disabled people use such simulations to prepare for using wheelchairs in real life, they should have some idea of what terrain and performance is within safe limits.
   Recommendation: The environment should allow wheelchairs to tip over etc. if the user pushes them to the limits of their performance.

5. Issue: The level is restarted if the user moves onto the road. It becomes tiresome because one cannot fully explore the environment due to fear of restarting the level.
   Recommendation: If the user moves onto the road, the program should not restart the level. Maybe a helper, such as a police officer, should come and inform them to move back onto the pavement. This way, the user learns the limits of the environment without restarting the level at every mistake.

6. Issue: The chair jumps from 0 km/h to 5th gear! Not possible in real life.
   Recommendation: If the chair is not moving forward, make the gear jump to 1st gear.

7. (Heuristics 2, 3, 5, 8; severity 2) Issue: The chair crashes into a tree at full speed and nothing happens.
   Recommendation: Show some sort of indication of a crash, and possibly slow down the chair.

8. (Heuristics 2, 3, 5, 6; severity 2) Issue: The chair doesn't interact with road cones.
   Recommendation: Make them fall over or include some sort of animation.

9. (Heuristics 3, 4, 5, 6; severity 3) Issue: The joystick is the most realistic, the keyboard is the easiest to use, and the mouse is terrible.
   Recommendation: Use the joystick as the normal tool and maybe a keyboard as backup.

10. Issue: Did not know how to get out of Level A.
    Recommendation: Put a button or something else in there to tell you how to get out.

11. (Heuristics 1, 8; severity 3) Issue: Need better/more warning when not staying inside the yellow lines.
    Recommendation: Something should flash at the side of the screen to indicate straying beyond the yellow lines, and/or something else, maybe sound.

12. Issue: Delay when accessing any menu option.
    Recommendation: A smoother transition effect, so the system does not seem as though it is failing.

13. Issue: No indication of how to increase speed.
    Recommendation: Give a tutorial or tooltips about the in-game controls at the start screen or load screen.

14. (Heuristics 3, 10; severity 1) Issue: The environment moves too fast around me when increasing speed, making me dizzy.
    Recommendation: More realistic visual response; slow it down?

15. Issue: The virtual wheelchair's joystick does not move according to my hand movement.
    Recommendation: Visualize the hand and joystick movement.

16. Issue: There seems to be no difference whether I push the joystick slightly or fully forward.
    Recommendation: A harder push should mean faster driving and a softer push slower, as in a real PWC.

17. (Heuristics 1, 10; severity 2) Issue: It's difficult to look at the surroundings when you are also trying to watch the yellow lines.
    Recommendation: Put instructions on the ground, because we are already watching the yellow lines on the ground.

18. Issue: The ideal path to follow is too long and it takes too much time to finish the task (very boring).
    Recommendation: The distance should be shorter.

19. (Heuristics 5, 7; severity 1) Issue: The size of the wheelchair in the virtual environment is unknown, which reduces the user's spatial awareness.
    Recommendation: The user should know the power wheelchair's size in relation to the VE. Maybe use a third-person view, so that when they drive in a first-person view they are aware of the actual size of the chair.

20. (Heuristics 1, 2, 3, 5; severity 1) Issue: The wheelchair physics simulation is incorrect. For example, when driving exactly beside a wall and trying to turn 180 degrees, it does turn, which is impossible in reality because the chair would be stopped by the wall.
    Recommendation: Correct the physics simulation.

21. Issue: Simulator responsiveness to the joystick input is incorrect; a small or hard push means the same.
    Recommendation: Simulate the joystick inputs to match the real one, so a small push means slow speed.

22. Issue: A fisheye view appears at the sides of the screen when rotating at high speed.
    Recommendation: Realistic visual representation.

23. Issue: The ideal path is very simple to follow. The hardest curve is 90 degrees, but in reality PWC drivers are likely to rotate 180 degrees, or even more, up to 360 degrees.
    Recommendation: The ideal path should represent turns of up to 360 degrees, and backwards driving as well.

24. (Heuristic 4; severity 2) Issue: The mouse and the keyboard are hard to use to move the wheelchair. Also, using them makes a big difference to what happens in the real world.
    Recommendation: The joystick is the same device that is used on the real wheelchair, so the mouse should not be used at all.

25. (Severity 2) Issue: The point calculation system is not convenient. Users will be rewarded for making more mistakes. There is no indication of whether a high score means better or worse performance.
    Recommendation: The points should be calculated in a way that the higher the points, the better the score.

3.7 Discussion

The main goal of the heuristic evaluation was to benchmark the development of PWCsim against WheelSim, the only commercial PWC simulator on the market, in order to provide a simulation that is better than, or at least as good as, WheelSim. Some of the problems found by the evaluators overlapped. However, from the findings it is apparent that WheelSim fails to provide an accurate physical simulation. This makes it an unreliable system for use as a training or assessment tool. The recommended solutions for most issues were considered in developing PWCsim, in particular those with a high severity rating: for example, adding a mirror to the virtual wheelchair; designing the ideal path so that it represents most of the movements a PWC driver might go through; visualising the virtual joystick so that it displays the user's movement, in order to increase the user's sense of presence; and correctly implementing the physics simulation, as Unity 3D provides built-in physics simulation. Other findings presented this study with confounding variables. For example, Number 14, "The environment moves too fast around me when increasing speed. Making me dizzy", is a sickness symptom, and that led us to measure simulator sickness by means of a questionnaire. Finding Number 19, "The size of the wheelchair in the virtual environment is unknown which reduces the user spatial awareness", presented us with a spatial-awareness confounding variable that we controlled by making another version of PWCsim. This version allows the user to drive from a third-person perspective before running the real experiment. These findings, along with their recommendations and/or solutions, can be used by future developers to guide them towards a better simulation.

3.8 Limitations

Although the majority of WheelSim's problems have been identified, there may be some potential issues that were not.
One of the main limitations of this evaluation was that the evaluators were able-bodied walking people rather than PWC users, which could bias the severity ratings or cause a real problem that drivers would encounter to be missed. Further, each evaluation is a personal opinion; a problem rated high severity by one evaluator could be rated low by another, and this affects the importance of each finding.

3.9 Conclusion

PWC simulators already exist, but the review of the literature (Chapter 2) and the evaluation of WheelSim have shown that these simulators lack important features that should be taken into account from the outset. These features include physical simulation, field of view, realism, simulator sickness, standardisation, etc. A better simulator is needed in order to assess and/or train PWC drivers. With this in mind, the following chapter (Chapter 4) will discuss the design decisions behind PWCsim and its implementation.

Chapter 4
Conceptual Design and Implementation

4.1 Overview

To answer the research question (Does peripheral vision and/or stereoscopic (3D) viewing have an influence on PWC user driving performance?), a framework was built and a PWC simulator developed. This chapter begins by analysing the system requirements, followed by a two-phase discussion of the framework and PWCsim development. In the first phase, the framework and its components are identified. In the second phase, the development and implementation tools are discussed, including hardware components, software, and simulator development. The following chapter (Chapter 5) discusses the research methodology and design.

4.2 System Requirements

The requirements for a PWC simulator were adapted from the findings of the literature review, as well as being inspired by the findings of the heuristic evaluation of WheelSim. These requirements are important to ensure the quality of the PWC simulator. According to Grant et al. [12], three aspects need to be considered when designing a PWC simulator: visual simulation, physical simulation, and the control system. First, the visual simulation is concerned with the display generation of the VE. The quality of the visual display has an immediate effect on the user's immersion [2]. Therefore, a PWC simulator requires a graphics card that is capable of delivering complex geometrical data in order to provide a better visual experience [12]. While many systems rely on a traditional computer screen, a few attempt projection-based simulation, such as the CAVE environment. The CAVE environment makes use of a wide FOV, and Grant et al. [12] note that the wide angle of view provides users with better spatial navigation because they can look around. Pithon et al. [2] add that a semi-immersive display, such as a panoramic display, offers a wide FOV and so provides high visual immersion to the user.

In fact, this is one of the fundamental requirements our simulator must satisfy in order to answer the research question: the wide FOV must at least match the normal human FOV (200°) or more, so that an individual can obtain a sense of peripheral vision. Further, the graphics card must also support stereoscopic viewing to answer the other part of the research question (Does peripheral vision and/or stereoscopic (3D) viewing have an influence on PWC user driving performance?).

Second, the physical simulation concerns how the virtual PWC interacts with the VE [12]. Pithon et al. [2] state that simulating a virtual PWC in a VE requires graphic design and accurate behaviour; for example, a rotation of the PWC in the real environment should produce a similar rotation in the virtual environment. A virtual PWC should also reflect the impact of collisions in a way similar to the real world, and in order to do so the VE must include natural obstructions, such as furniture and walls [12]. Collision detection is another fundamental requirement in a PWC simulator, to prevent drivers from driving through obstacles [12]. Pithon et al. [2] point out that the extremities of the virtual PWC should be represented in the user's FOV to allow for better navigation and avoidance of collisions with objects. However, not only the physics simulation needs to be considered but also the dimensional details of the virtual PWC [12]. This, therefore, also includes dimensional details for the whole environment, whether interior or exterior. According to Pires et al. [50], an interior environment is more of a challenge to PWC users due to the increased spatial-awareness demands of navigating in narrow spaces. Based on this, and on the further assumption that an indoor environment is likely to be the first place PWC users gain experience, the system must offer an indoor environment that represents an environment that will be met in reality (in this case a house).

In addition, the system must provide an ideal path that involves standard accessibility dimensions. Finally, the virtual PWC's trajectory, collision locations, and collision counts must be logged for evaluation purposes. Third, the control system governs the interface and interaction technique between the user and the system [12]. According to [12], the standard interface should at least consist of a display device and a joystick to drive the virtual PWC. The virtual PWC must accurately react to the user's actions on the joystick; for example, on pushing the joystick to the right, the virtual PWC must turn right. The following two phases (building the framework, and development and implementation tools) will discuss the process of building the whole system while keeping these requirements in mind.
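The logging requirement above can be sketched as a small data structure. All names here (TrajectoryLogger, log_pose, log_collision) are hypothetical: they illustrate the kind of record the requirement calls for, not PWCsim's actual implementation, which was written as Unity scripts.

```python
from dataclasses import dataclass, field

@dataclass
class TrajectoryLogger:
    """Records the virtual PWC's trajectory and collisions for later
    evaluation (illustrative sketch, not PWCsim's actual code)."""
    poses: list = field(default_factory=list)       # (t, x, y, heading)
    collisions: list = field(default_factory=list)  # (t, x, y, obstacle)

    def log_pose(self, t, x, y, heading):
        self.poses.append((t, x, y, heading))

    def log_collision(self, t, x, y, obstacle):
        self.collisions.append((t, x, y, obstacle))

    @property
    def collision_count(self):
        return len(self.collisions)

log = TrajectoryLogger()
log.log_pose(0.0, 1.0, 2.0, 90.0)
log.log_collision(3.2, 1.4, 2.1, "wall")
print(log.collision_count)  # 1
```

Keeping collision locations alongside timestamps, as sketched here, is what makes per-condition measures such as wall-collision counts recoverable after the experiment.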

4.3 First Phase: Building the Framework

In a first attempt at providing a sense of peripheral vision, three desktop monitors were used (Figure 4.1). Several problems surfaced with this approach: 1) the side screens were within the user's normal FOV, which meant there was no sense of peripheral vision; 2) the DFOV was much smaller than the GFOV unless the user sat very close to the middle screen (around 10 to 15 cm), which made it impossible to interact with the system. Therefore, an alternative approach was needed.

Figure 4.1: First attempt with three desktop monitors

Obviously, larger screens would solve these problems, and there are two main advantages in using them: 1) the gap between the DFOV and GFOV becomes small and natural; 2) the user is able to get close enough to the middle screen that the side screens are correctly positioned in the user's lateral vision. Similar to the CAVE environment [38], but triangular in shape, a projection-based setup was the most appropriate design for this study. Using rear projectors, a wooden frame was constructed to display the projected images. As shown in Figure 4.2 (next page), the frame holds a roll of laminated plastic 5 that was stretched over the frame to provide three large screens. Each screen is 100 cm wide and 80 cm high. The height of the frame was restricted by the height of the laminated plastic (80 cm). The main features of the frame are 1) the screen width is adjustable for both 4:3 and 16:9 aspect ratios, and 2) the angle between adjoining screens can be adjusted (Figure 4.3, next page). This is particularly helpful for matching the angle between the virtual cameras built into PWCsim.

5 Similar to the material used in laminating paper, except there is no paper inside

52 Figure 4.2: Laminated plastic screens Figure 4.3: Framework design 42

4.4 Second Phase: Development and Implementation Tools

This section discusses the development and implementation tools used to create the simulator. These tools support the deployment of appropriate software and the design of the system to meet the requirements mentioned previously. The section starts with the hardware components, followed by the software components and the simulation design.

4.4.1 Hardware Components

The hardware used in this study was constrained by what was available to the University of Otago's Department of Information Science. For the wide-FOV requirement, three identical projectors (Dell, Figure 4.4) were used to present the user's central and peripheral view. These projectors have a native 4:3 aspect ratio with a 1024 x 768 pixel resolution. Using an external multi-display adapter, the GFOV can be extended to 225° (3 x 75°), with a resolution of 3072 x 768 pixels. The Matrox adapter (TripleHead2Go DP, Figure 4.5) allows three monitors/projectors to be connected to the computer. It makes use of the system's GPU to produce uncompressed graphics and video across the connected monitors/projectors; the result is a single wide screen. More details of the Matrox software are discussed in the following section. This condition was run on a standard PC with a high-quality graphics card (AMD Radeon). The other two conditions (narrow-FOV and stereo-FOV) were run on a laptop computer (Alienware, Figure 4.6) using an Epson projector (Figure 4.7). Both the Alienware's graphics card and the Epson projector support stereoscopic 3D viewing. Although the native resolution of the Epson projector is 1920 x 1080, the laptop's graphics card (NVIDIA) is only compatible with 1280 x 720 (16:9 aspect ratio) when running in stereo 3D mode. Therefore, and for consistency, the same resolution was used for the narrow-FOV condition.

The NVIDIA graphics card's control panel makes it easy to switch between monoscopic and stereoscopic modes, so it was possible to run these two conditions from one machine. In the stereo-FOV condition, active shutter 3D glasses (Figure 4.8) were additionally worn by the viewer. The glasses were wirelessly synced to the projector through an infrared/radio frequency (RF) link. Two standard joysticks (Figure 4.9) were used to interact with the systems (one connected to the PC and the other to the laptop). Full specifications of all devices can be seen in Table 4.1.

Figure 4.4: Dell projector 2300MP
Figure 4.5: Matrox adapter
Figure 4.6: Alienware laptop
Figure 4.7: Epson projector EH-TW6000
Figure 4.8: Active shutter 3D glasses
Figure 4.9: Joystick used

Table 4.1: Hardware specifications

Alienware laptop
- Operating system: Windows 7 Home Premium (64-bit)
- Processor: Intel Core i7, 2.50 GHz
- Memory (RAM): 24.0 GB
- Graphics card: NVIDIA GeForce GTX 580M
- Output port used: HDMI
- Refresh rate: 60 Hz

PC
- Operating system: Windows 7 Home Premium (64-bit)
- Processor: Intel Core i7, 3.40 GHz
- Memory (RAM): 8.0 GB
- Graphics card: AMD Radeon HD6450
- Output port used: DisplayPort
- Refresh rate: 60 Hz

TripleHead2Go adapter, DP Edition 6
- Input connector: one DisplayPort input
- Output connectors: three DisplayPort outputs
- Power: USB and DisplayPort for power

Epson EH-TW projector 7
- Resolution: 1080p
- Native aspect ratio: 16:9
- Video input: two HDMI
- 3D formats: Top-and-Bottom, Side-by-Side, Frame Packing

Dell projector
- Resolution: True XGA, 1024 x 768
- Native aspect ratio: 4:3
- Video input: VGA port

3D shutter glasses
- Power: battery operated
- Features: 1) included infrared RF emitter, so the wearer can freely walk around; 2) battery life up to 85 hours; 3) fit over most eyeglasses
- FOV: As there is no documentation on the glasses' FOV, it was calculated manually. This was done by drawing a line on the wall (30 cm) and the observer, while wearing the 3D shutter glasses, adjusting his distance so that the line just fully fits into his FOV. Knowing the line length and the observer's distance from the wall, it was possible to determine the angle subtended at the observer's eyes by the line edges. The FOV is approximately 120°.

6 Matrox TripleHead2Go. [Accessed: 01-Nov-2012].
7 Epson Australia - EH-TW6100 Specifications. [Accessed: 02-Nov-2012].

Setup and Connections

The following figure (Figure 4.10) shows all the hardware components used for PWCsim and how they are connected.

Figure 4.10: PWCsim components architecture 8

8 Designed by the researcher. Some objects, such as the laptop, PC, projectors, chairs, and joystick, were downloaded from the SketchUp free 3D Warehouse.

4.4.2 Software and Simulator Development

The development of PWCsim was an extension of previous work 9 (the first version). After an initial analysis of the related work and the heuristic evaluation of the WheelSim application, the current PWCsim resulted from either modifying/redeveloping parts of the first version or introducing new functionality, such as the path implementation and collision detection. This section discusses the implementation techniques and design decisions made to meet the requirements in Section 4.2, using the framework discussed in Section 4.3 and the tools (hardware components) discussed in Section 4.4.1. The development of the system is divided into several parts: 3D modelling design, building the simulation, and configuration and setup of the simulator.

3D models

Although Unity 3D can be used to build basic 3D models, it is not a suitable tool for complex models. Thus, a 3D modelling program was necessary to design complex 3D models, such as the simulator environment (house), the virtual PWC, and the ideal path. The Google SketchUp software was used for this purpose, to take advantage of its free 3D Warehouse models as well as its free availability. SketchUp provides a very basic interface design. It consists of the title bar (standard window controls), menus (SketchUp tools, commands, and settings), toolbars (user-defined sets of tools and controls), drawing area (3D space identified with visual axes), and status bar (containing information about the current application, tools, etc.). Models created with SketchUp can be easily exported to a variety of formats, including 3DS, AutoCAD DWG, AutoCAD DXF, Google Earth format (.kmz), Collada (.dae), FBX, and OBJ. All models were exported to FBX, as Unity only supports this format among those mentioned. This section discusses the design of the environment and the ideal path using SketchUp, and the decisions behind them.

9 J. Collins and H. Regenbrecht, Power wheelchair simulation prototype. Internal software at HCI group, Department of Information Science, University of Otago, New Zealand, 2012; available upon request (holger@infoscience.otago.ac.nz)

Environment

As discussed in the requirements section, the environment used for this simulation is a domestic environment (a house). It had already been built in the first version, but it lacked textures and accessibility standardisation for wheeled mobility (Figure 4.11). These two flaws were addressed by adding textures to the environment, to provide a more realistic simulation, and by adjusting the house dimensions, in particular doors and corridors, to meet standards for accessible design (ADA) [1]. That is, the effective width for internal doors accessed from corridors is 1.2 m, and the corridors' minimum width is 1.5 m to facilitate 360° turning [49] (Figure 4.11).

Figure 4.11: Left picture is the first version of the PWC simulator. Right picture is the environment architecture after adjusting the house dimensions

Ideal path

Two aspects were taken into account while designing the path: the user task and the path width. The user task should represent all possible movements that a PWC user would make. A few studies, [5] and [6], have considered user tasks based on the wheelchair skills test (pages 18-19). These studies asked the user to perform each task separately, but the problem is that these tasks are not performed separately in reality. So, the idea was to create an ideal path such that a user who drives along it accomplishes all of these separate tasks. According to the ADA [1] and [49], the minimum access-route width in which wheelchair users can freely manoeuvre is 1.2 m, which is also the minimum clear distance in front of each face of a door. There are situations, such as public or commercial buildings, where these figures can be higher or lower. Figure 4.12 (next page) illustrates the ideal path and the different tasks.

Figure 4.12: Path planning
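The accessibility dimensions used above (1.2 m effective door width, 1.2 m minimum route width, 1.5 m corridors for 360° turning) can be expressed as a small validation helper. The function and constant names are hypothetical; the sketch simply encodes the figures cited from [1] and [49], and is not part of PWCsim itself.

```python
# Minimum accessible-design dimensions cited in the text (metres).
MIN_DOOR_WIDTH_M = 1.2     # internal doors accessed from corridors
MIN_ROUTE_WIDTH_M = 1.2    # access route for free manoeuvring
MIN_TURN_CORRIDOR_M = 1.5  # corridor width allowing a 360-degree turn

def check_corridor(width_m: float) -> str:
    """Classify a corridor width against the cited minimums
    (hypothetical helper for checking an environment model)."""
    if width_m < MIN_ROUTE_WIDTH_M:
        return "too narrow for a PWC"
    if width_m < MIN_TURN_CORRIDOR_M:
        return "passable, but no 360-degree turn"
    return "passable, 360-degree turn possible"

print(check_corridor(1.3))  # passable, but no 360-degree turn
print(check_corridor(1.5))  # passable, 360-degree turn possible
```

A check of this kind could be run over every corridor and door width in the house model to confirm that the adjusted dimensions actually meet the standard before the path is laid out.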

Building the simulation

Introduction to Unity 3D

Game engine platforms have provided proven, realistic simulations for PWCs, as shown in [11]. The Unity 3D game engine was used for this study, based on its availability at the University of Otago. In general, and similar to other game engines, Unity 3D offers strongly integrated graphics, physics, audio, and input engines for creating interactive 3D content, such as real-time animation or architectural visualisation. The Unity editor provides a powerful interface where a developer can import assets and build scenes, audio, lights, and physics. It is also where they can simultaneously play, test, and edit the game, and add interaction via scripting. The Unity editor contains different windows (views) and tools that support the workflows for creating a game 10. Figure 4.13 shows these different views, and a brief description of each is provided underneath the picture.

Figure 4.13: The main user interface of Unity

Number one (1) is the Project/Assets view, which contains all files involved in a specific game project. This includes graphics, scripts, sounds, prefabs (pre-assembled game objects), and textures. Unity 3D provides standard assets when starting a new project, and developers can add more assets to the project's assets folders later. Unity 3D features integration with other applications, such as 3ds Max or Photoshop. For example, if the assets folder contains a Photoshop file, by double-clicking on that file the Photoshop workspace

10 UNITY: Game Development Tool. [Accessed: 05-Dec-2012].

will be opened, and any change to that file is automatically reflected in the Unity assets folder. Number two (2) is the Hierarchy, which is used to add game objects to the scene by dragging them from the assets folder. It lists all objects, and their hierarchies, that are currently loaded into the scene (3). A game object can have subordinate objects (children), so that any change to the parent object, such as resizing or repositioning, also affects its children. Number three (3) is the Unity Scene view, which is used to construct the scenes. Basically, any visual asset can be placed in the scene view and later manipulated. The scene view contains a manipulator, placed in the top right corner, which allows switching between different view perspectives of the scene. By clicking on any object, a developer is able to move, rotate, and scale that particular object freely. The scene can also be deployed to a particular platform, such as standard PC, iOS, Android, or web application. Number four (4) is the Inspector, which provides a detailed view of a particular object so it can be tweaked and inspected. The object can be either in the scene or in the assets list. For example, a point-light game object has a Transform component that specifies the position, rotation, and scale of the object, and a Light component that makes the object behave as a light source. Number five (5) is the Game view, where a developer can run, pause, and stop the game. When previewing the game, the scene view updates in real time, which makes it easier to debug the system. However, the game cannot be run if there are any compiler errors, for example if a script needs to be fixed. Number six (6), the open-source MonoDevelop, comes with Unity 3D and provides a powerful environment for writing and/or debugging Unity scripts. The programming languages C#, Boo, and JavaScript are all supported. A script can later be attached to a particular game object to perform specific tasks. The following sections discuss the interface design, interaction technique, virtual PWC characteristics, and collision detection of PWCsim, all implemented using Unity 3D.

Interface design

The main purpose of the simulator was to test user driving performance by collecting specific data. Thus, it was important not to interrupt the user with any user-interface elements. The only navigation method for users is a joystick; everything else in the system is controlled by the researcher using keyboard keys. Table 4.2 shows the navigation methods for the user and the researcher. The system consists of two main scenes: 1) the home screen, used to enter the user name and start the simulation, or to end the application, and 2) the testing screen, where the user drives the virtual PWC with a standard joystick (Figure 4.14).

Table 4.2: Navigation methods

User navigation method:
- Joystick (second scene only)

Researcher navigation method:
- Mouse in the first scene
- Keyboard keys in the second scene:
  Enter - Reload scene 2 (testing screen)
  F     - User failed (PWC fell over)
  N     - New user (go to main screen)
  ESC   - Exit (back to main screen)

Figure 4.14: Left picture represents the first scene (main screen). Right picture represents the second scene (testing screen)

Interaction techniques

The interaction technique for PWCsim is navigation, in particular travel. The user manoeuvres along a given path to reach a known target. This technique is appropriate for the purposes of this study, because a real PWC user is likely to perform a similar task in reality. Using a joystick, the user is able to drive the virtual PWC forward/backward and turn left/right.
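The researcher controls in Table 4.2 amount to a small key-to-action dispatch. A sketch follows; the action names returned here are purely illustrative, since in PWCsim this logic lives in a Unity script:

```python
# Hypothetical mapping of researcher keys to actions (cf. Table 4.2).
RESEARCHER_KEYS = {
    "enter": "reload_testing_scene",
    "f": "mark_user_failed",      # PWC fell over
    "n": "new_user_main_screen",
    "esc": "exit_to_main_screen",
}

def handle_key(key: str) -> str:
    """Return the researcher action bound to a key, or 'ignored' for
    anything else, so stray key presses cannot disturb a trial."""
    return RESEARCHER_KEYS.get(key.lower(), "ignored")

print(handle_key("N"))  # new_user_main_screen
print(handle_key("q"))  # ignored
```

Ignoring unbound keys supports the design goal stated above: nothing the participant can reach should interrupt the experiment.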

Fields of view

As discussed in the literature review, this study distinguishes between the GFOV and the DFOV. In order to increase the user's sense of immersion, we attempted to match the GFOV and DFOV. Although the focal length of the virtual camera in Unity 3D was set to normal, the GFOV was affected by the viewport (aspect ratio): if the aspect ratio is set to 4:3 (Dell projectors) with normal focal length, the GFOV angle is 75°, whereas it is 90° when using a 16:9 aspect ratio (Epson projector). Unlike the study conducted by [5], where the lens focal length was adjusted to increase the GFOV (causing image distortion), in the wide-FOV condition three virtual cameras were stitched together (same position but different orientations) to provide a 225° GFOV without any image distortion (Figure 4.15). Given these angles, and based on the 50″ screen size, it was possible to determine the distance between the user and the screen that offers a DFOV similar or close to the GFOV (between 50 and 65 cm). In addition, the virtual camera(s) were placed at the PWC driver's head location, which resulted in a first-person perspective.

Figure 4.15: Left picture shows the narrow-FOV and stereo-FOV conditions with 90° GFOV. Right picture shows the wide-FOV condition (three cameras stitched together: 225°, 75° each)
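The matching of DFOV to GFOV described above can be checked with basic trigonometry. The sketch below (illustrative Python, not part of PWCsim) uses only the figures stated in the text, a 50-inch 16:9 screen and the 90° GFOV of the single-screen conditions, and computes the eye-to-screen distance at which the DFOV equals that GFOV.

```python
import math

def viewing_distance_cm(diagonal_in, aspect_w, aspect_h, dfov_deg):
    """Distance at which a flat screen of the given diagonal and aspect
    ratio subtends a horizontal field of view of dfov_deg degrees."""
    diagonal_cm = diagonal_in * 2.54
    width_cm = diagonal_cm * aspect_w / math.hypot(aspect_w, aspect_h)
    # eye-to-screen distance = half the screen width / tan(half the DFOV)
    return (width_cm / 2) / math.tan(math.radians(dfov_deg) / 2)

# 50" 16:9 screen (Epson projector setup), 90 degree target DFOV:
d = viewing_distance_cm(50, 16, 9, 90)
print(round(d, 1))  # ~55.3 cm, inside the 50-65 cm range stated above
```

The result falls inside the 50–65 cm range reported in the text, which is consistent with the seating distances used in the experiment.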

PWC characteristics

The main requirement for the virtual PWC was to provide a realistic physics simulation and feedback on user actions. According to the literature, the front-wheel PWC type navigates indoor environments better than other types. Thus, the virtual PWC used for this study is a front-wheel-drive model. PWCs on the market come in a variety of sizes and with different speed capabilities. To model the virtual PWC, we chose one of the front-wheel PWCs available on the market (C300 Corpus) and used its dimensions for the virtual PWC (width = 70 cm, length = 110 cm, speed limit = 4 mph). With the pivot point placed between the front wheels, Unity 3D handles the dynamic interaction of the virtual PWC and responds correctly to the user's joystick motions. As with a real PWC, pushing the joystick further in any direction increases the speed of the virtual PWC and rotates it in that direction. The user is also able to switch the virtual PWC between four specific speeds. All of this is done through a script attached to the virtual PWC that reads the user input from the joystick and moves the virtual PWC in accordance with the inputs.

The analysis of the heuristic evaluation suggested adding a mirror to the virtual PWC and visualizing the joystick movements in order to increase the user's sense of presence. The virtual mirror was implemented by adding another camera that provides the view behind the user's back and rendering it onto the virtual mirror of the PWC, whereas the virtual joystick was visualized through a script attached to it that reads the real joystick inputs and displays them in the system.

Collision detection

As mentioned previously in the Requirements section, collision detection is an important feature that a PWC simulator must have. It prevents the driver from going through walls and hitting other obstructions.
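The proportional joystick behaviour described above (larger deflection means more speed, four selectable speed levels, a 4 mph cap) can be sketched as follows. This is illustrative Python, not the Unity C# script attached to the virtual PWC; the level fractions and the turn rate are assumptions, and only the four levels and the 4 mph limit come from the text.

```python
# Hypothetical sketch of the joystick-to-motion mapping described above.
# Axis values are in [-1, 1]; the four speed levels and the 4 mph cap
# come from the text, the level fractions and turn rate are assumptions.

SPEED_LEVELS = [0.25, 0.5, 0.75, 1.0]   # fraction of the 4 mph limit
MAX_SPEED_MPH = 4.0
MAX_TURN_DEG_PER_S = 60.0               # assumed maximum turn rate

def drive(forward_axis, turn_axis, speed_level):
    """Return (linear speed in mph, turn rate in deg/s).
    Larger deflection gives proportionally more speed, as in a real PWC."""
    cap = SPEED_LEVELS[speed_level] * MAX_SPEED_MPH
    speed = forward_axis * cap            # negative axis = reverse
    turn = turn_axis * MAX_TURN_DEG_PER_S
    return speed, turn

print(drive(1.0, 0.0, 3))   # full forward at top speed: (4.0, 0.0)
print(drive(0.5, -1.0, 1))  # half push, hard left at level 2: (1.0, -60.0)
```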
Also, being able to detect collisions makes it possible to count the number of collisions with these obstacles as well as to detect boundary violations. In Unity 3D, in order for an object to interact with the environment it must have collider components attached to it. A collider is invisible and can be used to collide with other objects such as walls, to detect collision locations (X, Y, and Z), and to trigger other events. In this simulator, two

types of colliders were used: a box collider and a wheel collider. The box collider was placed on the upper part of the virtual PWC (above the wheels) to prevent the virtual PWC from going through obstacles and to count the number of wall/furniture collisions. Unity 3D automatically applies a reaction force to the colliding object, which improves the realism of the simulation. The other type, the wheel collider, was placed on the virtual PWC's wheels. It was used to detect any collision with the path mesh, as well as the X and Z locations of the collisions. A script was written and attached to the virtual PWC to control these colliders. The script's requirements were to recognise which object a collider was colliding with, to count the number of collisions for both the box collider and the wheel colliders, and to locate each collision on the X and Z axes.

Configuration and setup

The framework built (section 4.3), the hardware used (section 4.4.1), and the design and implementation discussed in the previous section were all integrated to produce the different versions of PWCsim (narrow, wide, and stereo FOVs). The narrow-FOV and stereo-FOV conditions required only one output to run the experiment, whereas the wide-FOV condition required extra software and a specific configuration, because it needed three outputs. Although the TripleHead2Go adapter was used to run the wide-FOV condition, additional setup was needed to configure the environment correctly (resolution, aspect ratio, position of the screens, etc.). Thus, we used the Matrox PowerDesk software delivered with the TripleHead2Go adapter. It provides multi-display controls for the connected screens/projectors. Its utilities include window positioning and resolution control, stretching the displays into a single display, and adjusting for the physical borders of the displays where they meet (bezel management).
A screenshot of the bezel management is shown in Figure 4.16.

Figure 4.16: Matrox PowerDesk bezel management
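The collision bookkeeping described in the collision detection section, box collider hits counted as wall collisions and wheel collider hits against the path mesh counted as path collisions, with X and Z positions logged, can be sketched as below. This is an illustrative Python model; the real logic lives in the Unity script attached to the virtual PWC, and the tag names are hypothetical.

```python
# Illustrative counter for the two collision types described above;
# the tag names and record layout are assumptions, not the thesis code.

class CollisionCounter:
    def __init__(self):
        self.wall_collisions = 0    # box collider vs walls/furniture
        self.path_collisions = 0    # wheel colliders vs the path mesh
        self.locations = []         # (kind, x, z) for later visualisation

    def on_collision(self, collider, other_tag, x, z):
        if collider == "box" and other_tag in ("wall", "furniture"):
            self.wall_collisions += 1
            self.locations.append(("wall", x, z))
        elif collider == "wheel" and other_tag == "path":
            self.path_collisions += 1
            self.locations.append(("path", x, z))

c = CollisionCounter()
c.on_collision("box", "wall", 1.2, 3.4)
c.on_collision("wheel", "path", 2.0, 5.1)
c.on_collision("wheel", "path", 2.2, 5.3)
print(c.wall_collisions, c.path_collisions)  # 1 2
```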

Chapter 5
Methodology and Design

This chapter describes the approach used to answer the research question (Do peripheral vision and/or stereoscopic (3D) viewing have an influence on PWC user driving performance?). In a within-subjects experiment design, quantitative data was collected to evaluate the user driving performance and sense of presence under three different conditions. This chapter covers the research variables, the experiment, limitations, and assumptions.

5.1 Research Variables and Design

5.1.1 Independent and Dependent Variables

The independent variable of this study is the display type of the PWCsim. It has three levels:

Narrow-FOV: the normal condition, where only one screen is used with a 90° FOV.
Wide-FOV: the narrow-FOV condition with two additional screens, placed on each side in the user's peripheral vision. This allows for a 225° FOV, which covers the normal human FOV.
Stereo narrow-FOV: similar to the narrow-FOV condition, but with a different display type, where the viewer wears active shutter glasses in order to see a stereoscopic image.

Each of these levels is built into a single simulator. There could have been a fourth level of the independent variable (stereo wide-FOV), but due to the unavailability of the right technology and the study timeframe, this condition was excluded from the study. A hybrid FOV is another possible condition, based on the assumption that peripheral vision is monoscopic because both eyes cannot be directed to the sides, but it was also excluded for the same reasons.

The dependent variables of this study relate to user driving performance and sense of presence. The user driving performance is determined by time spent,

number of wall collisions, and number of path collisions. The variables are defined as follows:

Time spent: measured as the time it takes to complete a given task under each condition. The condition with the least time is the most efficient.
Number of wall collisions: the accuracy of completing the task, measured by the number of collisions with walls and/or objects. The most accurate condition is the one with the fewest wall collisions.
Number of path collisions: the number of boundary violations of the ideal path. The most accurate condition is the one with the fewest path collisions.
Overall performance: a score calculated with the following equation: Score = 1000 - (number of path collisions + 2 × number of wall collisions + total time spent in seconds).
User sense of presence: the user's feeling that he/she belongs to, or is part of, the VE. In this study it is measured with a standard questionnaire (IPQ). The highest average indicates the higher sense of presence.

5.1.2 Potentially Confounding Variables

Several potentially confounding variables have been identified in this research. In order to mitigate their effect, these variables have been either controlled and/or measured. They include the study sample, demographic issues, simulator sickness, practice effect, spatial awareness, and user preference.

Given the timeframe for this study, and since PWC users are the focus of PWCsim, it was almost impossible to obtain ethical approval at university level to test the application with clinical subjects. Instead, we limited the scope of this study to those who have no mobility injuries, and we assumed that walkers (the study sample) would manage the simulator in much the same way that clinical subjects with a physical disability would. Other variables associated with the sample demographic may also confound results.
These include participant gender, familiarity with using a joystick, left- or right-hand dominance, and interpupillary distance. Demographic questions were carefully selected to measure these variables and to help with understanding any differences in the data (the demographic questionnaire can be seen in Appendix B, page 117). A discussion of these variables follows:

Gender: Czerwinski et al. [9] indicate that several studies have suggested that males significantly outperform females when navigating VEs. However, they also found that when a large display (36″) is used, females achieve a similar level of performance to males, especially when combined with a wider FOV (60°). To control for the gender effect, large 50″ screens were built and the FOVs were set wider than 60° for all conditions (90° for the narrow-FOV and stereo-FOV conditions, 225° for the wide-FOV condition). Because we used large screens and wide FOVs, gender should not affect the validity of the results.

Familiarity with joystick: younger users or frequent computer gamers are more likely to be familiar with joystick devices, which could affect the outcome. In a similar way, lack of familiarity could introduce a bias if subjects learn how to use a joystick while performing the task. Three steps were taken to control and measure this variable. First, participants were asked two questions in the demographic questionnaire about their familiarity: one asked whether they had used a joystick before, and the other asked how good they thought they were at using a joystick, on a seven-point Likert-like scale (0 to 6). Second, the joystick was clearly explained to all participants in the task description (Appendix B, page 118) and at the beginning of the experiment. Finally, they were also given a training session with no time limit to improve their joystick skill before the experiment was conducted. The demographic survey also included a question about the subject's dominant hand, to enable placement of the joystick on the subject's preferred side prior to the test. The joystick itself could also confound results, because a standard game joystick was used in this experiment. Hence, an assumption was made that subjects with the ability to control the game joystick should also be able to control a standard PWC controller.
The heuristic analysis of the WheelSim application (page 35, finding 16) showed that the joystick produced exactly the same speed at all levels of force. This is unrealistic feedback, and the joystick responsiveness in PWCsim was therefore configured to resemble that of a real PWC.

Interpupillary distance (IPD) refers to the distance between the pupils of a person's eyes [27]. Its precise measurement is critical for stereoscopic 3D viewing, where the projected video of the viewing system needs to match the viewer's interpupillary distance. This is only required for the stereo-FOV condition. To control this variable, the depth perception was adjusted (if necessary) for each subject through the NVIDIA Control Panel before starting the

experiment. While a subject was wearing the active shutter glasses, he/she was asked whether they saw slightly different images, and the depth perception was adjusted until the subject was satisfied and could see only one stereoscopic picture.

Another potentially confounding variable, which could dramatically change user driving performance, is simulator sickness. Previous reports have shown that users experience some form of simulator sickness when using VEs [51], in particular with wide fields of view [52]. At least one of the heuristic evaluators experienced simulator sickness symptoms (dizziness) while testing the WheelSim application (page 35, finding 14). This led to the measurement of simulator sickness. In order to measure cybersickness, participants were first questioned about the state of their health as part of the demographic questionnaire. This included vision problems, disabilities, and any other health issues that could affect their performance, such as a cold or flu, headache, etc. According to Johnson [52], simulator sickness symptoms seem to increase in those who are not in their usual state of health. Kennedy et al. [53] advise that such individuals should be excluded from the study sample to control for simulator sickness. Therefore, those who did not have normal or corrected-to-normal vision, those who had arm or shoulder disabilities, or those who were feeling unwell, were excluded from participation in the experiment. After each condition, four questions adapted from the simulator sickness questionnaire (SSQ) [53] were asked to determine whether there were any simulator sickness symptoms.

Spatial awareness may also confound results for the PWCsim task: poor spatial awareness could result in poor driving performance and/or sense of presence.
As in reality, in PWCsim the user drives the virtual PWC from a first-person perspective; the problem is that in reality PWC users are aware of their PWC's size, shape, and movements with regard to their surroundings. This particular problem was reported, with high severity, in the WheelSim heuristic evaluation (page 36, findings 19 and 20). It was controlled for by introducing subjects to the virtual PWC's size, shape, and dynamic movements through a different version of PWCsim, developed for this purpose. In a warm-up trial, subjects were able to switch between first- and third-person viewing, which gave them the ability to navigate while maintaining spatial awareness of their VE. This, however, was only applicable to the training simulation.

In comparing more than one condition, user preference is another factor that may bias a subject in favour of one condition, which could result in different behaviour across conditions. A study by Harrison et al. [6] reported that users chose a large screen over a head-mounted display, even though users were

fully immersed in the HMD display type. The authors did not discuss or provide documentation with regard to user preference. Therefore, we measured subjects' preferences through a direct comparative questionnaire covering all three conditions. We also looked at how subjects perceived their participation in terms of their driving performance and sense of presence, regardless of their real performance results (the comparative questionnaire can be seen in Appendix B, page 123).

A within-subject design was chosen for the comparative experiment of the display types. This design is advantageous for measuring differences between conditions, since each participant takes part in each condition, but it also suffers from an order effect: subjects could perform better in subsequent conditions because they become more practised (practice effect), or perform worse because they are tired (fatigue effect). Both effects could potentially influence the outcome of the experiment. To counteract this, the experiment conditions were counterbalanced into six plans (Figure 5.1). Although confounding effects exist within each plan, the effects are distributed equally in the final result. In addition, subjects were asked to do the task three times in a warm-up trial (one round from a first-person perspective, a second round from a third-person perspective, and a third round switching between first- and third-person perspectives). This should not only improve participants' skills but also reduce the "wow" effect [54] that might result from using a VE system for the first time.

Plan 1: Narrow-FOV, Wide-FOV, Stereo-FOV
Plan 2: Narrow-FOV, Stereo-FOV, Wide-FOV
Plan 3: Stereo-FOV, Narrow-FOV, Wide-FOV
Plan 4: Stereo-FOV, Wide-FOV, Narrow-FOV
Plan 5: Wide-FOV, Narrow-FOV, Stereo-FOV
Plan 6: Wide-FOV, Stereo-FOV, Narrow-FOV

Figure 5.1: Order of display type conditions
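The six plans in Figure 5.1 are exactly the 3! = 6 possible orderings of the three conditions, so with 24 participants each ordering is used by four subjects. A few lines of Python (illustrative only, not part of PWCsim) reproduce the set of orderings; note that the enumeration order may differ from the plan numbering in Figure 5.1.

```python
from itertools import permutations

conditions = ["Narrow-FOV", "Wide-FOV", "Stereo-FOV"]
plans = list(permutations(conditions))   # all 3! = 6 orderings

for plan in plans:
    print(" -> ".join(plan))

subjects_per_plan = 24 // len(plans)
print(subjects_per_plan)  # 4 subjects randomly assigned to each plan
```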

5.2 The experiment

This section outlines the approach used to answer the research question. A detailed description of the study sample is given, and the task, experimental design, environment, procedure, and instruments are discussed. Ethical approval at departmental level was obtained for this study (Appendix B). The experiment made use of PWCsim and questionnaires to arrive at an outcome for each participant.

5.2.1 Study Sample

A pilot study was conducted with two participants to provide a formative evaluation of the procedures and instruments. This was followed by the actual experiment. The study sample was recruited from students enrolled at the University of Otago. Twenty-four non-clinical subjects (21 male and 3 female) took part in this study. The average age was years, ranging from 19 to 31. No participants had vision problems or health issues that would exclude them from participation. Seventeen subjects had prior experience with a joystick, four subjects had no experience but knew what a joystick was, and three subjects did not know what a joystick was. On a scale from 0 to 6, the average familiarity with the joystick was . Twenty subjects indicated they were right-handed and two left-handed. Two subjects claimed to be left- or right-handed depending on the task, but they preferred to use the joystick with their right hand. None of the participants had used the system before the study was conducted. Upon conclusion of all the sessions, subjects were rewarded with a ten-dollar supermarket voucher.

5.2.2 Task

The task scenario used in this study was an indoor environment. By following an ideal path (two black lines), subjects were expected to perform tasks similar to those in a real environment. Upon completing the task, subjects were expected to have gone through most of the possible movements that a PWC user would make in a domestic environment. Directions to be followed by subjects driving in the PWCsim were drawn on the floor (red arrows).
An arrow appeared when the driver came close enough to the point where direction information was needed (Figure 5.2, next page). The path had start and end points, and subjects were simply asked to drive from the start to the end. The task was to drive as quickly and accurately as possible. Subjects were

informed that they could always change the speed using the joystick fire button (four speeds), and that they should avoid collisions with the path boundaries and walls.

Figure 5.2: Directions

5.2.3 Environment

This study used a single-room setup with the display framework and all hardware components, including the projectors, PC, laptop, joysticks, and 3D glasses. The display framework was placed on top of two tables in a triangular shape. With an adjustable chair, participants were able to sit between the tables and adjust the chair height and position. The tables were also used by subjects for completing questionnaires. There was enough space behind the framework for an observer to record informal information. The room was darkened to allow better visibility, and the temperature was kept at an appropriate level for running the experiment. The room was located at the University of Otago, Annex laboratory. Figure 5.3 shows the environment from two different perspectives.

Figure 5.3: Experiment environment

5.2.4 Experiment Design

In a within-subject experiment design, twenty-four participants took part in this study. Each participant performed the driving task under each of the three conditions (narrow-FOV, wide-FOV, and stereo-FOV). The condition order was separated into six plans according to the schema in Figure 5.1 (page 60); four subjects were randomly assigned to each of the six plans. Unlike a between-subjects design, where the results could be affected by the differing opinions and/or performance of the separate groups, a within-subject design offers a direct comparison between conditions, because each participant went through all conditions.

5.2.5 Study Instruments

Two main instruments were used to measure users' driving performance and sense of presence. The users' driving performance was measured through the PWCsim, the measurement consisting of three criteria: total time spent, number of wall collisions, and number of boundary violations. These criteria were inspired by the review of related works. The user's task was to use the joystick to guide the virtual PWC, following specific directions. While being able to change the speed, subjects were asked to drive as fast and as accurately as possible. A subject's trajectory was logged in a time-stamped manner at 10 Hz, and the positions of the path collisions were logged as they occurred. Both the user trajectory and the positions of path collisions were saved in text files for visualization purposes. Total time spent, number of path collisions, and number of wall collisions were recorded upon completion of the task in another text file. Figure 5.4 (next page) shows examples of the outputs of PWCsim.

Figure 5.4: A screenshot of three files generated by PWCsim. The top file shows an individual's trajectory. The middle file contains all data related to path collisions (participant number, time of occurrence, and X and Z coordinates of the collision). The bottom file contains all subjects' performance data for a specific version (in this case stereo-FOV)

The performance criteria were also used to define the overall user performance score. The point score system selected had been used in two studies, [35] and [48]. Similar to [35], the score was calculated by assigning 1000 points to each subject and subtracting one point per path collision and one point per second elapsed, except that in PWCsim two points per wall collision were also subtracted. This was because PWCsim has two kinds of collisions: path and wall collisions. The same penalty system used in the WheelSim system [48] was applied to PWCsim, where the author multiplies the total time of major mistakes by two to differentiate it from the total time of minor mistakes, except that in PWCsim path collisions are regarded as minor mistakes and wall collisions as major mistakes. The final formula for the PWCsim point system is as follows:

Score = 1000 - (number of path collisions + 2 × number of wall collisions + total time in seconds)
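As a worked example of the point system (the input numbers below are invented, not taken from the results):

```python
def overall_score(path_collisions, wall_collisions, time_s):
    """PWCsim overall performance: start from 1000 points, subtract one
    per path collision (minor mistake), two per wall collision (major
    mistake), and one per second elapsed."""
    return 1000 - (path_collisions + 2 * wall_collisions + time_s)

# A hypothetical run: 5 path collisions, 2 wall collisions, 180 seconds.
print(overall_score(5, 2, 180))  # 1000 - (5 + 4 + 180) = 811
```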

Sense of presence, on the other hand, was measured using the IPQ questionnaire, developed by Schubert et al. [10] to assess presence in immersive VEs (discussed in Chapter 2). It consists of 13 questions: one general presence item plus three factors, namely involvement (four questions), spatial presence (five questions), and realism (three questions). Table 5.1 shows the factors and the questions associated with them. Each question is scored on a 7-point Likert-like scale, ranging from -3 to 3.

Table 5.1: IPQ questions and factors

General presence:
- In the computer generated world I had a sense of being there.

Involvement:
- How aware were you of the real world surroundings while navigating in the virtual world (i.e. sounds, room temperature, other people, etc.)?
- I was not aware of my real environment.
- I still paid attention to the real environment.
- I was completely captivated by the virtual world.

Spatial presence:
- Somehow I felt that the virtual world surrounded me.
- I felt like I was just perceiving pictures.
- I did not feel present in the virtual space.
- I had a sense of acting in the virtual space, rather than operating something from outside.
- I felt present in the virtual space.

Realism:
- How real did the virtual world seem to you?
- How much did your experience in the virtual environment seem consistent with your real world experience?
- The virtual world seemed more realistic than the real world.

In addition to the IPQ questions, four questions related to the confounding variable simulator sickness were added at the end of the IPQ questionnaire. These questions were adapted from the simulator sickness questionnaire (SSQ) introduced by Kennedy et al. [53]. The SSQ canvasses 16 symptoms that define three main components: oculomotor problems, nausea, and disorientation. The total SSQ score is the combination of these three components.
Basically, subjects report their experience of each symptom as one of none, slight, moderate, and severe, scored 0, 1, 2, and 3 respectively. To calculate the score for each of the three components, the reported values are multiplied by the component's weight and summed to produce the final result. Because the focus of this study is not simulator sickness, only four of the 16 symptoms were chosen.

These four symptoms cover the three components: oculomotor problems, nausea, and disorientation. Table 5.2 shows the components, their weights, and the score calculation (the IPQ questionnaire and simulator sickness questions can be seen in Appendix B). Immediately after each experiment, subjects were asked to answer the IPQ and simulator sickness questions by circling the number that best described their experience in that particular environment.

Table 5.2: The calculations in the SSQ (None = 0, Slight = 1, Moderate = 2, Severe = 3)
SSQ symptoms: General discomfort; Difficulty concentrating; Dizzy (eyes open); Dizzy (eyes closed)
Column totals: Nausea = [a], Oculomotor = [b], Disorientation = [c]
Score for each column: [a] × 9.54, [b] × 7.58, [c] × 13.92
Total Score = ([a] + [b] + [c]) × 3.74

Other questionnaires

A demographic questionnaire (Appendix B, page 117) was administered at the beginning of the experiment. It was used to collect participant information, including age, gender, preferred hand, joystick experience, and health issues such as vision and disability. These data could be used to control and/or measure the confounding variables, and also later to interpret participants' performance and behaviour during the experiment.

A self-report sheet (Appendix B, page 119) was used to record participants' performance data for each experiment in case data was not recorded by the system or was lost. It was also composed of five questions scored on a 10-point Likert-like scale. These questions were filled in by the researcher while participants performed the task, in order to better understand the current system and for possible later use in improving its usability. Upon completion of the study, a comparative questionnaire was used to directly compare the three conditions from the participants' perspective, regardless of their real performance data.
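The SSQ component scoring can be reproduced as follows. This is an illustrative Python sketch; the weights are the standard Kennedy et al. values (9.54, 7.58, and 13.92, with the 3.74 multiplier for the total), and the column sums [a], [b], [c] used in the example are invented inputs.

```python
# SSQ component scoring (Table 5.2). Each reported symptom is rated
# none=0, slight=1, moderate=2, severe=3; a, b, c are the column sums.
WEIGHTS = {"nausea": 9.54, "oculomotor": 7.58, "disorientation": 13.92}

def ssq_scores(a, b, c):
    """Return the weighted component scores and the combined total."""
    return {
        "nausea": a * WEIGHTS["nausea"],
        "oculomotor": b * WEIGHTS["oculomotor"],
        "disorientation": c * WEIGHTS["disorientation"],
        "total": (a + b + c) * 3.74,
    }

# Example: a slight rating contributing 1 to each column.
print(ssq_scores(1, 1, 1))
```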
In each question, participants were asked to choose the condition that best matched how they perceived their experience. The comparative questionnaire was composed of seven questions, each with three

answers covering narrow-FOV, wide-FOV, and stereo-FOV. Four questions asked about user driving performance, covering overall performance, wall collisions, path collisions, and time spent. The other three questions asked about participant comfort, involvement, and preference (the questionnaire can be seen in Appendix B, page 123).

5.2.6 Procedure and Data Collection

Before conducting the experiment with any participant, the researcher went through a checklist to make sure the whole environment was correctly set up. This included the order of conditions, checking that all questionnaires and paper sheets were available and in the right place, the room temperature, etc. (the checklist and conditions order can be seen in Appendix B). Upon arrival, the participant was welcomed and asked to sit down and make him/herself comfortable. The participant was informed of the purpose of the study (investigating the influence of peripheral and stereoscopic vision on user driving performance in a PWC simulator) through a participant information sheet (Appendix B, page 115). After the introduction, the participant was asked to read the consent form and sign it before proceeding further. Next, the participant was asked to complete the demographic questionnaire and then to read the task description (Appendix B, page 118). Meanwhile, the information provided in the demographic questionnaire was used to customise the joystick setup (left or right hand). Once the participant had read the task description, they were given an opportunity to ask any questions regarding the task and experiment. The participant was also informed that the light would be switched off during the experiment and switched on upon the completion of each condition to enable filling in of the questionnaires. The participant was then asked to move to the workstation and prepare for the experiment. The participant started with a training session followed by the actual experiment (three sessions).
A discussion of these sessions follows.

Training session

Each subject received instruction on how to use the joystick and was given unlimited time to practise in a training simulator, which allowed them to switch between first- and third-person perspectives. The environment used for this simulator was slightly different from the actual simulator: the house had no textures or roof, so the participant could still see the virtual PWC while

driving from a third-person perspective. Once the participant was satisfied with their performance, they were asked to do the task three times (three rounds), as they would in the real experiment. In the first round, the subject was asked to drive in a first-person perspective, in the second round in a third-person perspective, and in the third round switching between first- and third-person perspectives. Once they had finished, they were asked if they had any questions and informed that they would start the real experiment.

Actual experiment sessions

Before the first visualization was displayed, participants were informed of the previously randomized order of conditions (e.g. that they would start with the narrow-FOV, then proceed to the wide-FOV, and lastly the stereo-FOV). They were also reminded of their task, to drive as fast and accurately as possible, and asked to position themselves in a predefined optimal position. As discussed previously in Chapter 4, an Epson projector was used to run the narrow-FOV and stereo-FOV conditions through a laptop computer, whereas the wide-FOV condition was run by three identical Dell projectors through a standard PC. Therefore, it was necessary to switch between computers and projectors. In the first session (e.g. narrow-FOV), the researcher started the simulation by entering the participant's number, and the participant was asked to start when they were ready. Once they completed the task, the researcher turned the light on and the participant was asked to complete the IPQ questionnaire, which also contained the simulator sickness questions. While the participant was answering the IPQ questions, the Epson projector was put into sleep mode in order to set up the environment for the second session (e.g. wide-FOV), in which the three Dell projectors were used. When the participant had completed the questionnaire, the light was turned off and the participant was asked to reposition themselves until they could see a straight line.
This line was added to the menu screen of the wide-FOV condition and went across all three screens. Sitting at the right distance and height, the participant should have seen a straight line, which meant the images in their peripheral vision were correctly positioned (Figure 5.5).

Figure 5.5: Wide-FOV menu screen

The researcher entered the participant's number and the same procedure as in session one was repeated. When the participant had finished the task, the light was turned on and the Dell projectors were put into sleep mode. While the participant was answering the IPQ questionnaire, the researcher set up the environment for the third session (e.g. stereo-FOV). This was done by preparing the active shutter 3D glasses, enabling 3D stereoscopic viewing through the NVIDIA control panel, and switching the Epson projector from sleep back to normal mode. Upon completion of the IPQ questions, the participant was asked to wear the 3D glasses and also asked whether he/she comfortably perceived stereoscopic viewing. The depth perception was adjusted for those who were not comfortable with the depth provided (only three out of 24 subjects asked for a depth adjustment). The same procedure as that used for sessions one and two was then applied to session three. Once the participant had completed the third IPQ questionnaire, they were asked to complete the comparative questionnaire. Observation and self-report sheets were filled out by the researcher while the participant performed all three conditions. The participant's feedback was gathered at the end of the experiment. The experiment was concluded by informing the participant of their result; the participant was thanked for their participation and rewarded with a supermarket voucher. Figure 5.6 shows one of the participants performing the experiment.

Figure 5.6: Participant doing the wide-FOV condition

5.3 Limitations

The hardware and software used in this study were constrained by their availability to the Information Science department at the University of Otago. The implementation and design of PWCsim were in turn constrained by this hardware. For example, the field of view was restricted to 90° for the narrow-FOV and stereo-FOV conditions and to 225° for the wide-FOV condition. The hardware also limited the number of possible conditions from five to three (mono-FOV, stereo-FOV, and wide-FOV). The two conditions excluded from the study are stereo wide-FOV and hybrid-FOV (stereo viewing in the centre and mono viewing in the periphery). Other reasons for not including the stereo wide-FOV condition were that 1) the human eye does not perceive stereoscopic vision at the periphery, and 2) the active shutter glasses would limit the user's field of view. Due to the researcher's qualifications, time constraints, and ethical issues, only non-clinical subjects were involved in this study. Quantitative data was used to measure the subjects' driving performance and sense of presence: driving performance was measured through PWCsim, whereas sense of presence was measured using the IPQ questionnaire.

5.4 Assumptions

For the study to obtain validity, these assumptions were made:

- That the physics simulation of the virtual PWC is accurate enough, because the Unity 3D engine provides real world physics simulation.
- That the task chosen for this study (the ideal path) and the interaction technique (travel) are similar to the tasks performed by real PWC users in reality, and which are necessary for manoeuvring and conducting daily life activities.
- That the study instruments (PWCsim and questionnaires) are suitable for measuring user driving performance and sense of presence, and that the task is generalizable.
- That the participants are familiar with technology and have had some experience with computer games.
- That the responsiveness of the standard game joystick is similar to that of the joysticks used in real PWCs.

Chapter 6

Results and Data Analysis

6.1 Overview

The purpose of this study has been to investigate the influence of peripheral and stereoscopic vision on user driving performance and sense of presence in a PWC simulator. This has been accomplished by comparing user driving performance and sense of presence across three different conditions (display types): narrow-FOV (90°), stereo-FOV (90°), and wide-FOV (225°). A framework was built and a power wheelchair simulator (PWCsim) was developed for this purpose. Three criteria entered the user driving performance equation: number of path collisions, number of wall collisions, and time spent in seconds. The final driving performance score was calculated as

Score = 1000 - (number of path collisions + 2 x number of wall collisions + total time spent in seconds)

so that a higher score indicates better performance. Sense of presence, on the other hand, was measured using a standard questionnaire (IPQ). The IPQ questionnaire consists of 13 questions that measure four factors: involvement, spatial presence, realism, and general presence. This study also looked at users' preferences using a comparative questionnaire (seven questions). In a controlled environment, each participant was asked to drive a virtual PWC in each condition. While doing so, path collisions, wall collisions, time spent, trajectory and path collision locations were logged to a text file. After each condition, participants completed the IPQ questionnaire, while the comparative questionnaire was completed at the end of all sessions. This chapter presents the results of this experiment, including participants' data, driving performance results, sense of presence results, perceived comparison results and simulator sickness results, and concludes with a discussion of these results. After completing the experiment, data analysis was performed using SPSS Version 20.
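The scoring rule can be sketched in a few lines. The function below is a minimal illustration, not PWCsim's actual code; we assume the score subtracts the penalty (path collisions plus twice the wall collisions plus time in seconds) from a fixed budget of 1000 points, which is consistent with the condition means reported later in this chapter.

```python
def driving_score(path_collisions: int, wall_collisions: int, time_spent_s: float) -> float:
    """Overall driving performance: higher is better.

    Wall collisions are weighted twice as heavily as path collisions, and
    the combined penalty is subtracted from a fixed budget of 1000 points.
    """
    penalty = path_collisions + 2 * wall_collisions + time_spent_s
    return 1000 - penalty

# Illustrative values in the ballpark of the per-condition means below.
print(driving_score(path_collisions=20, wall_collisions=3, time_spent_s=50.0))  # 924.0
```

Because the penalty terms are simply summed, one extra wall collision costs as much as two extra seconds of driving time, which is how the equation trades accuracy off against speed.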
Several statistical analyses were conducted during the analysis phase, including normality testing, one-way repeated measures ANOVA, post hoc tests, and paired-samples t-tests. All significance testing was performed at the 95% confidence level. An assumption that data is normally distributed is often required when using parametric tests. Different approaches are available to assess normality, including numerical methods that provide a statistical test and graphical methods that represent the distribution of the data [55]. In SPSS, the Explore option under the Descriptive Statistics menu offers both methods; hence, both have been used to assess the normality of the study data. The numerical tests used are the Kolmogorov-Smirnov and Shapiro-Wilk tests: if the Sig. value is greater than 0.05, the data is considered normally distributed [55]. Graphs can be presented in different formats, including histograms, boxplots and dot plots; in this study, histograms and boxplots have been used. For the data to be considered normally distributed, the histogram should have a bell shape that tapers out towards the sides [55]. Because each subject took part in all conditions of the experiment, a one-way repeated measures ANOVA was performed to test significance. Field [56] has pointed out that in a repeated measures design, data from different conditions is correlated, which violates the independence assumed by a standard ANOVA, and an additional assumption is therefore required. This assumption, known as the assumption of sphericity, means that the relationships between pairs of conditions are similar [56]. When conducting a repeated measures ANOVA, SPSS runs Mauchly's test, which tests the sphericity hypothesis that the variances of the differences between conditions are equal [56]. If the result of Mauchly's test is non-significant (p > .05), sphericity is met and the sphericity-assumed result can be used.
If the result is significant, sphericity is violated and the degrees of freedom are corrected by multiplying them by one of three estimates (Greenhouse-Geisser, Huynh-Feldt, or lower-bound), and the corresponding corrected result should be used [56]; SPSS does this automatically. Where the repeated measures ANOVA was significant, a post hoc test was performed using the Bonferroni method to see whether the difference between conditions one and two, two and three, or one and three was statistically significant. However, the Bonferroni post hoc test does not provide the t-value, so a further paired-samples t-test was conducted. The data for all 24 participants was used in this analysis. The display formats chosen for the data representation are boxplots, bar charts, and pie charts. With boxplots, data can be comparatively visualized in terms of maximum and minimum values, interquartile range, median, distribution, and outliers.
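The thesis runs this pipeline in SPSS. The sketch below reproduces the normality-check and post hoc portions in Python with SciPy, on synthetic data generated for illustration only; SPSS's repeated measures ANOVA and Mauchly's test have no direct SciPy equivalent (statsmodels' `AnovaRM` could supply the omnibus test), so only the Shapiro-Wilk check and the Bonferroni-corrected paired-samples t-tests are shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic within-subject data: time spent (s) for 24 participants x 3 conditions,
# with means/SDs loosely based on those reported in this chapter.
narrow = rng.normal(55.5, 18.0, 24)
stereo = rng.normal(58.7, 20.0, 24)
wide = rng.normal(51.5, 11.0, 24)

# 1) Normality check per condition (Shapiro-Wilk; p > .05 suggests normality).
for name, data in [("narrow", narrow), ("stereo", stereo), ("wide", wide)]:
    stat, p_norm = stats.shapiro(data)
    print(f"{name}: W={stat:.3f}, p={p_norm:.3f}")

# 2) Post hoc paired-samples t-tests with a Bonferroni correction:
#    with three pairwise comparisons, each raw p-value is multiplied by 3 (capped at 1).
pairs = [("narrow vs wide", narrow, wide),
         ("stereo vs wide", stereo, wide),
         ("narrow vs stereo", narrow, stereo)]
for label, a, b in pairs:
    t, p = stats.ttest_rel(a, b)
    p_bonf = min(1.0, 3 * p)
    print(f"{label}: t(23)={t:.2f}, raw p={p:.3f}, Bonferroni p={p_bonf:.3f}")
```

The paired test is used, rather than an independent-samples test, precisely because each subject contributes one value per condition, mirroring the within-subject design described above.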

6.2 Participants

Table 6.1 shows the information collected in the demographic questionnaire. Most subjects (70.8%) were reasonably familiar with the use of joysticks and tended to play computer games. Their level of joystick experience was, however, lower than the median (3), with a mean of 2.25 (Figure 6.1). None of the participants had vision problems or health issues and the average age was years. Seventeen subjects had prior experience with a joystick, four subjects had no experience but knew what a joystick was, and three subjects did not know what a joystick was. Twenty subjects indicated they were right-handed and only two were left-handed. All participants managed to complete the navigation tasks and no failure was registered.

Table 6.1: Participants' information collected via the demographic questionnaire

Preferred hand: Left hand 2 (8.3%); Right hand 20 (83.3%); Left or right hand depending on task 2 (8.3%); Left and right hand equally for all tasks 0 (0%); Neither, low level of dexterity 0 (0%)
Joystick experience: Yes 17 (70.8%); No, but know what it is all about 4 (16.7%); No 3 (12.5%)
Vision: Normal or corrected to normal 24 (100%); Not normal 0 (0%)
Health state: Healthy 24 (100%); Health issues (disability, flu, ...) 0 (0%)

Figure 6.1: A boxplot of participants' level of joystick experience

6.3 Results

In this section, the results for users' driving performance, sense of presence, perceived comparison and simulator sickness are discussed separately. For each of these criteria, the normality test, a graphical representation of the quantitative data, the statistical analysis and an interpretation of the numerical and graphical data are presented.

6.3.1 Users' Driving Performance

User driving performance, as mentioned before, is based on three factors: number of path collisions, number of wall collisions, and time spent to accomplish the task. The data for this was gathered through the simulator and the outputs can be seen in Appendix C, page 125. The results for each of these factors, and for the overall performance score, are discussed as follows.

Path Collisions

The normality tests (Kolmogorov-Smirnov and Shapiro-Wilk) of the path collisions data indicate that the data is normally distributed, with a p-value larger than .05 for all conditions. The values of skewness and kurtosis are also close to zero (descriptive statistics can be seen in Appendix C, page 126). Table 6.2 shows the normality test results, as well as a histogram of the distribution for each condition.

Table 6.2: Tests of normality (Kolmogorov-Smirnov and Shapiro-Wilk statistics, df and Sig. per condition) for the path collisions data, with histograms of the number of path collisions for the narrow-FOV, stereo-FOV, and wide-FOV conditions

Figure 6.2: Boxplot of number of path collisions for the three conditions

As shown in the above plot (Figure 6.2), subjects had the highest number of path collisions in the narrow-FOV, at 41 collisions, followed by the stereo-FOV with 38 and the wide-FOV with 31. Although the range of the data was 38 in both the stereo and narrow FOVs, the interquartile range was highest in the stereo-FOV (19). The means of the conditions were close to each other; however, the narrow and stereo means were lower than their medians, in contrast to the wide-FOV, where the mean was higher than its median (Table 6.3). In total, the subjects had 412 collisions in the wide-FOV condition, 444 in the stereo-FOV, and 487 in the narrow-FOV. In the graphs on the next page (Figure 6.3), all of the collisions made by subjects in each condition are represented as red marks, together with their trajectories. Further, the repeated measures ANOVA did not show a significant difference between the means, F(2, 46) = 1.22, p = .31; therefore, no post hoc test was performed (statistical results can be seen in Appendix C, page 130).

Table 6.3: Descriptive statistics (mean, standard deviation, median) of path collisions for the three conditions

Figure 6.3: Left graphs represent collisions made by users in each condition; right graphs represent three users' trajectories, randomly chosen from each condition

Wall Collisions

The normality tests (Table 6.4) of the wall collisions data indicate that the data is not normally distributed, with a p-value less than .05 for almost all conditions; only the Kolmogorov-Smirnov test shows a normal distribution for one dataset, the stereo-FOV (descriptive statistics can be seen in Appendix C, page 127).

Table 6.4: Tests of normality (Kolmogorov-Smirnov and Shapiro-Wilk statistics, df and Sig. per condition) for the wall collisions data, with histograms of the number of wall collisions for the narrow-FOV, stereo-FOV, and wide-FOV conditions

The following plot (Figure 6.4) shows distinguishable differences between the conditions in terms of range, maximum and interquartile range, but all have the same minimum value (0). The narrow-FOV data also has an outlier.

Figure 6.4: Boxplot of number of wall collisions for the three conditions

The highest number of wall collisions was made in the narrow-FOV, followed by the stereo-FOV and wide-FOV (9, 6, and 5, respectively). The wide-FOV has the smallest variance, whereas the narrow-FOV shows the biggest. The narrow-FOV mean was the highest (M = 2.96, SD = 2.51), followed by the stereo-FOV (M = 2.63, SD = 2.04) and the wide-FOV (M = 1.96, SD = 1.43). All the means are above their medians (Table 6.5). The distributions of the narrow-FOV and wide-FOV appear skewed, which means that the number of wall collisions was unevenly spread among participants. As with the path collisions data, no significant difference was found between the means, F(2, 46) = 1.86, p = .17, and therefore no post hoc test was performed (statistical results can be seen in Appendix C, page 131).

Table 6.5: Descriptive statistics (mean, standard deviation, median) of wall collisions for the three conditions

Time Spent

The Kolmogorov-Smirnov normality test shows that only the wide-FOV data is normally distributed, but the Shapiro-Wilk test shows the opposite; for the other conditions, both tests indicate that the data is not normally distributed (Table 6.6).

Table 6.6: Tests of normality (Kolmogorov-Smirnov and Shapiro-Wilk statistics, df and Sig. per condition) for the time spent data, with histograms of time spent for the narrow-FOV, stereo-FOV, and wide-FOV conditions

Looking at the time spent by subjects during the experiment, all conditions have some outliers (Figure 6.5). Also, the minimum values of all conditions are very close to each other (narrow-FOV = 30.82, stereo-FOV = 34.78, wide-FOV = 32.58 seconds).

Figure 6.5: Boxplot of time spent for the three conditions

The stereo-FOV shows the biggest maximum, and its distribution is quite skewed towards the bottom. In contrast, the narrow and wide FOVs show smaller maximums (83.12 seconds for the wide-FOV); they also show quite similar variances compared with the stereo-FOV, and appear to be spread out in the same manner. Subjects clearly needed less time to navigate through the environment with the wide-FOV (M = 51.55, SD = 11.70), followed by the narrow-FOV (M = 55.53, SD = 18.26) and the stereo-FOV (M = 58.67, SD = 20.78). Note that all means are above their medians, as shown in Table 6.7. A repeated measures ANOVA was conducted to assess whether there was a difference between the means. Mauchly's test indicated that the assumption of sphericity had been met, χ²(2) = 5.08, p = .08, so the sphericity-assumed degrees of freedom were used. The result showed that display type had a significant effect on the time taken to complete the task, F(2, 46) = 5.25, p = .009, and a post hoc test was needed to identify where. The post hoc test was performed using the Bonferroni method; a significant result was found between the means of the wide-FOV and stereo-FOV (p = .034). Because the Bonferroni test does not provide the t-value, a further paired-samples t-test was performed. As shown in Table 6.7, the results suggest that 1) there was a significant difference in the means for the narrow-FOV (M = 55.53, SD = 18.26) and wide-FOV (M = 51.55, SD = 11.70) conditions, t(23) = 2.35, p = .028; and 2) there was a significant difference in the means for the stereo-FOV (M = 58.67, SD = 20.78) and wide-FOV (M = 51.55, SD = 11.70) conditions, t(23) = 2.75, p = .011. All statistical results can be seen in Appendix C, page 132.

Table 6.7: Paired-samples t-test results for time spent (paired differences: mean, standard deviation, standard error, 95% confidence interval; t, df, and two-tailed significance) for the pairs narrow-wide, narrow-stereo, and wide-stereo
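The relationship between the raw paired-t p-values and the Bonferroni-corrected ones can be illustrated directly: with three pairwise comparisons, the correction multiplies each raw p-value by three and caps it at one. The helper function below is ours, applied to the two raw p-values reported for time spent.

```python
def bonferroni(p_values, n_comparisons=None):
    """Bonferroni correction: multiply each raw p-value by the number of
    comparisons made (default: the number of p-values given), capped at 1."""
    m = n_comparisons if n_comparisons is not None else len(p_values)
    return [min(1.0, m * p) for p in p_values]

# Raw paired-t p-values for time spent (narrow vs wide, stereo vs wide),
# corrected for the three pairwise comparisons actually performed.
print(bonferroni([0.028, 0.011], n_comparisons=3))
```

The corrected values come out to roughly .084 and .033, which is consistent with the pattern reported here: the Bonferroni post hoc flagged only the wide-stereo pair (p = .034), while the uncorrected paired-samples tests flagged both pairs.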

Overall Driving Performance

As mentioned previously, the overall driving performance score was calculated as Score = 1000 - (number of path collisions + 2 x number of wall collisions + total time spent in seconds); that is, the higher the score, the better the performance. Both the normality tests and the histograms (Table 6.8) show normally distributed data for all conditions, although the Shapiro-Wilk test indicates that the wide-FOV data was not normally distributed (p = .047), this value being very close to .05.

Table 6.8: Tests of normality (Kolmogorov-Smirnov and Shapiro-Wilk statistics, df and Sig. per condition) for the performance score data, with histograms of the performance score for the narrow-FOV, stereo-FOV, and wide-FOV conditions

The driving performance scores are visualised in the following plot (Figure 6.6). Generally, the scores and interquartile ranges vary noticeably across the three conditions. The wide-FOV dataset also has a mild outlier.

Figure 6.6: Boxplot of overall users' driving performance for the three conditions

The range of the stereo-FOV scores was the largest, from 845 to 961, followed by the narrow-FOV, from 867 to 963, and the wide-FOV, from 881 to 949. While the narrow-FOV shows a symmetrical distribution, with the box centred between the whiskers, the stereo and wide FOVs were slightly shifted towards the top. For all conditions, the means were less than their medians. The subjects nevertheless achieved the highest scores with the wide-FOV (M = 927.12, SD = 16.93). The means of the narrow and stereo FOVs were quite similar (M = 918.25 and 917.58, respectively), but the deviation was much smaller in the narrow-FOV (SD = 22.83 versus 27.75). The repeated measures ANOVA shows that the assumption of sphericity had been met according to Mauchly's test, χ²(2) = 4.54, p = .103, so the sphericity-assumed degrees of freedom were used. The result indicates that display type had a significant effect on overall driving performance, F(2, 46) = 4.49, p = .017, and post hoc tests were needed to identify where. The Bonferroni test found a significant difference between the means of the wide-FOV and narrow-FOV (p = .013); a further paired-samples t-test was also performed. As shown in Table 6.9, the results, in line with the Bonferroni test, suggest that 1) there was a significant difference in the means for the narrow-FOV (M = 918.25, SD = 22.83) and wide-FOV (M = 927.12, SD = 16.93) conditions, t(23) = -3.18, p = .004; the t-test also identified that 2) there was a significant difference in the means for the wide-FOV and stereo-FOV (M = 917.58, SD = 27.75) conditions, t(23) = 2.3, p = .031. The statistical results can be seen in Appendix C, page 133.

Table 6.9: Paired-samples t-test results for users' driving performance (paired differences: mean, standard deviation, standard error, 95% confidence interval; t, df, and two-tailed significance) for the pairs narrow-wide, narrow-stereo, and wide-stereo

6.3.2 User Sense of Presence

After each condition, the users' sense of presence was measured through a standard questionnaire (IPQ). It consisted of 13 questions that measured three specific factors, namely involvement (four questions), spatial presence (five questions), and realism (three questions); an additional question assessed general presence (Table 5.1, page 65). The answer to each question was scored on a 7-point Likert-like scale ranging from -3 to 3. Each factor was measured as the average of all answers associated with that factor, and the average of the answers across all factors was then used to measure the overall sense of presence. Although the overall sense of presence was the primary concern of this study, statistical analysis of the individual factors was performed to elicit greater detail. This section presents the results for these factors and for the overall sense of presence. Answers to each question can be seen in Appendix C, page 134.

Sense of presence factors

The normality tests in Table 6.10 suggest that the data distribution is approximately normal for the involvement, spatial presence and realism factors, except that the Kolmogorov-Smirnov test shows a non-normal distribution for the stereo-FOV on the realism factor (p < .05). The general presence data, on the other hand, is indicated as not normally distributed for all conditions. Descriptive statistics for each factor can be seen in Appendix C.

Table 6.10: Tests of normality (Kolmogorov-Smirnov and Shapiro-Wilk statistics, df and Sig. per condition) for the sense of presence factors: involvement, spatial presence, realism, and general presence
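The scoring just described (per-factor means over a -3 to 3 scale, then an overall average across all answers) can be sketched as follows. The question counts per factor are taken from the text; the answer values, dictionary layout and function names are ours, and we assume reverse-scored items have already been recoded.

```python
# Answers of one hypothetical participant, grouped by IPQ factor (scale: -3 .. 3).
answers = {
    "involvement":      [1, 0, 2, 1],       # 4 questions
    "spatial_presence": [2, 1, 1, 2, 1],    # 5 questions
    "realism":          [0, -1, 1],         # 3 questions
    "general_presence": [2],                # 1 additional question
}

def factor_scores(answers):
    """Per-factor mean rating."""
    return {factor: sum(vals) / len(vals) for factor, vals in answers.items()}

def overall_presence(answers):
    """Overall sense of presence: mean of all 13 answers across the factors."""
    vals = [v for group in answers.values() for v in group]
    return sum(vals) / len(vals)

print(factor_scores(answers))
print(overall_presence(answers))  # 1.0 for this example
```

A participant's factor means feed the per-factor analyses below, while the single overall value is what the "overall sense of presence" comparison uses.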

Figure 6.7: Boxplot of the sense of presence factors for the three conditions

The above plot (Figure 6.7) shows subjects' responses to the IPQ questionnaire. The averages of the responses to the four involvement questions are shown as a blue box for each condition. The items used to measure involvement are 1) "How aware were you of the real world surrounding while navigating in the virtual world?", 2) "I was not aware of my real environment", 3) "I still paid attention to the real environment", and 4) "I was completely captivated by the virtual world". The boxplots show that subjects were more involved with the wide-FOV (M = 0.91, SD = 1.36), followed by the stereo-FOV (M = 0.38, SD = 1.31) and the narrow-FOV (M = -0.28, SD = 1.13). The means of the wide and stereo FOVs are above the midpoint (0), whereas the narrow-FOV mean is below it. The repeated measures analysis shows a significant difference between the involvement means of the conditions, F(2, 46) = 7.75, p = .001. The Bonferroni and paired-samples tests show that 1) there was a significant difference in the means for the narrow-FOV and wide-FOV conditions, t(23) = -3.91, p = .001; and 2) there was a significant difference in the means for the narrow-FOV and stereo-FOV conditions, t(23) = -2.41, p = .025. However, there was no significant difference between the wide and stereo FOVs. Note that the paired-samples test results are reported here and in the following sections (Table 6.11). Statistical results can be seen in Appendix C.

Table 6.11: Paired-samples t-test results for users' involvement (paired differences: mean, standard deviation, standard error, 95% confidence interval; t, df, and two-tailed significance) for the pairs narrow-stereo, narrow-wide, and stereo-wide

The green boxes in the previous graph represent subjects' responses to the five spatial presence questions. The items used to measure spatial presence are 1) "Somehow I felt that the virtual world surrounded me", 2) "I felt like I was just perceiving pictures", 3) "I did not feel present in the virtual space", 4) "I had a sense of acting in the virtual space, rather than operating something from outside", and 5) "I felt present in the virtual space". It can clearly be seen that the wide and stereo FOVs increased subjects' spatial presence in comparison to the narrow-FOV. With all conditions having quite similar interquartile ranges, the wide-FOV has the smallest range at 2.8, followed by the narrow-FOV at 4.6 and the stereo-FOV at 5, which also has an outlier. The narrow-FOV has a rating mean below the midpoint, at -.04, and the largest deviation, at 1.29, whereas the stereo-FOV (M = .82, SD = 1.13) and the wide-FOV (M = 1.6, SD = .82) are above the midpoint. The repeated measures analysis shows a significant difference between the spatial presence means of the conditions, F(2, 46) = 17.8, p < .001. The Bonferroni and paired-samples tests (Table 6.12) show that 1) there was a significant difference in the means for the narrow and wide FOV conditions, t(23) = -5.48, p < .001; 2) there was a significant difference in the means for the narrow and stereo FOV conditions, t(23) = -3.16, p = .004; and 3) there was a significant difference in the means for the stereo and wide FOV conditions, t(23) = -3.11, p = .005. Statistical results can be seen in Appendix C, page 141.
Table 6.12: Paired-samples t-test results for users' spatial presence (paired differences: mean, standard deviation, standard error, 95% confidence interval; t, df, and two-tailed significance) for the pairs narrow-stereo, narrow-wide, and stereo-wide

The realism data is visualized in the brown boxes and is the average of three questions: 1) "How real did the virtual world seem to you?", 2) "How much did your experience in the virtual environment seem consistent with your real world experience?", and 3) "Did the virtual world seem more realistic than the real world?". Compared with the other factors, the means for the three conditions are the lowest, with the wide-FOV being the highest (M = .33, SD = 1.03), followed by the stereo-FOV (M = -.17, SD = 1.20) and the narrow-FOV (M = -.83, SD = 1.21). Comparing the realism factor across conditions, almost 75% of the subjects rated the realism of the narrow-FOV below the midpoint, in contrast to 50% in the stereo-FOV and almost 40% in the wide-FOV. The repeated measures ANOVA shows a significant difference between the means, F(2, 46) = 8.85, p < .001, and further analysis (Bonferroni and paired-samples tests) was undertaken to find where. The Bonferroni test showed a significant difference between the narrow and wide FOVs, p < .001; the paired-samples test (Table 6.13) also showed a significant difference between the narrow and stereo FOV conditions, t(23) = -2.21, p = .038. All significance testing can be seen in Appendix C, page 142.

Table 6.13: Paired-samples t-test results for realism (paired differences: mean, standard deviation, standard error, 95% confidence interval; t, df, and two-tailed significance) for the pairs narrow-stereo, wide-stereo, and wide-narrow

The yellow boxes in the previous graph (Figure 6.7) represent the answers to the additional question in the IPQ questionnaire that measures general presence ("In the computer-generated world I had a sense of being there"). Looking at the responses to this question, the data is clearly not normally distributed: the medians are not centred and the data is skewed in all conditions.
All of the subjects rated the wide and stereo conditions above the midpoint, and both of these conditions have outliers at the bottom. The means for general presence are above the midpoint, with the wide-FOV being the highest (M = 1.63, SD = 1.01), followed by the stereo-FOV (M = 1.13, SD = 1.4) and the narrow-FOV (M = 0.25, SD = 1.62). Further, the repeated measures ANOVA shows a significant difference between the means, F(2, 46) = 7.77, p < .001. Both the Bonferroni and paired-samples tests show that 1) there was a significant difference between the narrow and stereo conditions, t(23) = -2.69, p = .013, with the stereo condition receiving a higher rating than the narrow condition; and 2) there was a significant difference between the wide and narrow conditions, t(23) = 3.45, p = .002, with the wide condition receiving a higher rating than the narrow condition. No significant difference was identified between the wide and stereo conditions (Table 6.14). All significance testing can be seen in Appendix C, page 143.

Table 6.14: Paired-samples t-test results for general presence (paired differences: mean, standard deviation, standard error, 95% confidence interval; t, df, and two-tailed significance) for the pairs narrow-stereo, wide-stereo, and wide-narrow

Overall sense of presence

The previous section discussed each of the sense of presence factors separately; Table 6.15 summarises the findings, listing the conditions in order of their means (from high to low) and the significant results of the repeated measures ANOVA and paired-samples t-tests.

Table 6.15: Sense of presence factors, summary of the findings. For all four factors (involvement, spatial presence, realism, and general presence), the order of conditions by mean was wide, stereo, narrow, and the repeated measures ANOVA was significant. The significant paired-samples t-test results were narrow-stereo and narrow-wide for involvement, realism, and general presence, and narrow-stereo, narrow-wide, and stereo-wide for spatial presence.

The average of all sense of presence questions was calculated to measure the users' overall sense of presence. The histograms in Table 6.16 show that the data fits the normal curve well. This was also supported by the Kolmogorov-Smirnov and Shapiro-Wilk tests, except that the Kolmogorov-Smirnov test suggests a non-normal distribution for the narrow-FOV data.

Table 6.16: Tests of normality (Kolmogorov-Smirnov and Shapiro-Wilk statistics, df and Sig. per condition) for the overall sense of presence, with histograms for the narrow-FOV, stereo-FOV, and wide-FOV conditions

The following boxplot (Figure 6.8) represents the users' overall sense of presence across the three conditions. It can be seen from the graph that the narrow and stereo conditions have a similar range. Almost 75% of the ratings are above the midpoint in the stereo-FOV, compared with only 50% in the narrow-FOV; the stereo-FOV, however, has an outlier.

Figure 6.8: Boxplot of the overall sense of presence for the three conditions

The wide-FOV shows the smallest range, starting at -.38. The subjects felt a greater sense of presence in the wide-FOV (M = .90, SD = 1.36), followed by the stereo-FOV (M = .38, SD = 1.31) and the narrow-FOV (M = -.28, SD = 1.13). The repeated measures ANOVA shows that the assumption of sphericity had been met according to Mauchly's test, χ²(2) = .126, p = .939, so the sphericity-assumed degrees of freedom were used. The result indicates that display type had a significant effect on the overall sense of presence, F(2, 46) = 17.46, p < .001, and a post hoc test was needed to identify where. Both the Bonferroni and paired-samples tests showed significant differences between all paired conditions; hence, only the paired-samples tests are reported. The results indicate significant differences 1) between the narrow and stereo conditions, t(23) = -3.20, p = .004, with the stereo condition receiving a higher rating than the narrow condition; 2) between the wide and narrow conditions, t(23) = 5.78, p < .001, with the wide condition receiving a higher rating than the narrow condition; and 3) between the wide and stereo conditions, t(23) = 2.76, p = .011 (Table 6.17). All statistical results can be seen in Appendix C, page 144.

Table 6.17: Paired-samples t-test results for overall sense of presence (paired differences: mean, standard deviation, standard error, 95% confidence interval; t, df, and two-tailed significance) for the pairs narrow-stereo, wide-stereo, and wide-narrow

6.3.3 Comparison

This section discusses the results of the comparative questionnaire, which contained seven questions. Each question offered three answers (the condition names) and subjects had to choose only one. The first four questions concerned subjects' perceptions of their driving performance. The responses can be seen in the following graph, where the number in each bar represents the number of participants who selected a certain answer (Figure 6.9).

Figure 6.9: Subjects' responses to the first four questions of the comparative questionnaire

For the first question (With which system did you feel your overall performance was good?), the wide-FOV was the most favoured with 54% of the responses, followed by the stereo-FOV (29%) and narrow-FOV (17%). For Question 2 (With which system did you think you completed the task faster than the others?), on the other hand, 41% of the subjects claimed to complete the task faster in the stereo-FOV, compared to 37% in the wide-FOV and 5% in the narrow-FOV. In terms of wall and path collisions, Question 3 (With which system did you think you made fewer collisions with the walls?) showed a higher percentage for the wide-FOV (46%) in comparison to the narrow-FOV (29%) and stereo-FOV (25%), and Question 4 (With which system did you think you made fewer collisions with path boundaries?) elicited a higher percentage of responses for the wide-FOV (38%) in comparison to the narrow-FOV (33%) and stereo-FOV (29%). The following graphs represent the responses of subjects to Questions 5 and 6 (Figure 6.10).

Figure 6.10: Subjects' responses to Question 5 (comfort) and Question 6 (involvement) of the comparative questionnaire

The left pie chart shows subjects' responses to Question 5 (With which system did you feel more comfortable?), with the wide-FOV showing the highest count (17 out of 24 subjects), followed by the stereo-FOV (4) and narrow-FOV (3). The right pie chart represents the responses to Question 6 (With which system did you feel more involved?). No one chose the narrow-FOV; the wide-FOV took the majority of the responses, with 19 participants, whereas only five participants felt most involved in the stereo-FOV. The last question (Question 7) considered subjects' preferences: subjects were simply asked which system they preferred. As can be seen in the next bar chart (Figure 6.11), 20 participants out of 24 chose the wide-FOV condition. The rest were divided between the stereo-FOV (three participants) and the narrow-FOV (one participant). All subjects' responses to all questions can be seen in Appendix C (page 145).

Figure 6.11: Subjects' responses to Question 7 (What system do you prefer?)

6.3.4 Simulator Sickness

Simulator sickness, as a confounding variable, was measured using a standard simulator sickness questionnaire (SSQ), explained in Chapter 5 (page 66). Subjects had to answer the SSQ as part of the sense of presence questionnaire, so simulator sickness symptoms were measured across the three conditions. Of the 16 questions in the original SSQ, only four symptoms were measured in this study: general discomfort, difficulty focusing, dizziness (eyes open), and dizziness (eyes closed). The scores for these symptoms were then used to calculate the three main aspects of simulator sickness, namely nausea, oculomotor, and disorientation. The total score for each subject was based on these three aspects and represents the overall cybersickness the subject experienced. The number of subjects who experienced simulator sickness is shown in the following bar chart (Figure 6.12). Noticeably more subjects felt sickness symptoms in the wide-FOV condition than in the narrow and stereo conditions; however, some degree of simulator sickness was experienced across all three conditions.

Figure 6.12: Number of participants who experienced simulator sickness in each condition

When the SSQ scores were calculated (Table 6.18), the overall score in the wide-FOV (2269) was more than twice the score for both the narrow and stereo FOVs (1028.6 and 988.6, respectively). Surprisingly, the stereo-FOV caused less sickness overall than the narrow-FOV, although it scored higher on nausea and oculomotor symptoms.

Table 6.18: Simulator sickness scores per condition (nausea, oculomotor, and disorientation symptom scores and overall score)
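SSQ aspect scores are conventionally computed with the factor loadings and weights published by Kennedy et al. (1993). The sketch below applies those published weights to the four symptoms used in this study; whether the thesis applied exactly these loadings and multipliers is an assumption.

```python
# Sketch of SSQ scoring for the four symptoms used in the study, following
# the Kennedy et al. (1993) weighting scheme. Assumption: the thesis used
# these standard loadings; each symptom is rated on a 0-3 scale.
def ssq_scores(general_discomfort, difficulty_focusing, dizzy_open, dizzy_closed):
    # symptoms load on one or more of the three factors
    n_raw = general_discomfort                                # nausea
    o_raw = general_discomfort + difficulty_focusing          # oculomotor
    d_raw = difficulty_focusing + dizzy_open + dizzy_closed   # disorientation
    return {
        "nausea": n_raw * 9.54,
        "oculomotor": o_raw * 7.58,
        "disorientation": d_raw * 13.92,
        "total": (n_raw + o_raw + d_raw) * 3.74,
    }

# example: a subject reporting moderate general discomfort only
print(ssq_scores(general_discomfort=2, difficulty_focusing=0,
                 dizzy_open=0, dizzy_closed=0))
```

Per-condition totals like those in Table 6.18 would then be the sum of these subject-level scores across all 24 participants.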

6.4 Discussion

This chapter has presented the results for users' driving performance, users' sense of presence, users' perceptions, and simulator sickness across the three conditions (narrow-FOV, stereo-FOV, and wide-FOV). Analysis tests were performed to see whether there were significant differences between the conditions. These tests included the one-way repeated-measures ANOVA, the Bonferroni method, and the paired-samples t-test. This research produced 10 hypotheses to answer the research question (Do peripheral vision and/or stereoscopic (3D) viewing have an influence on PWC user driving performance?). Each of these hypotheses will be discussed in relation to the results, but first a summary of the main findings is shown in the following table (Table 6.19). The table shows the order of the conditions according to their means, the ANOVA results, the Bonferroni test, and the paired-samples test.

Table 6.19: Summary of the study results
- Path collisions (means from low to high): wide-FOV, stereo-FOV, narrow-FOV. Repeated-measures ANOVA: no significant result; post hoc tests not performed.
- Wall collisions (means from low to high): wide-FOV, stereo-FOV, narrow-FOV. Repeated-measures ANOVA: no significant result; post hoc tests not performed.
- Time spent (means from low to high): wide-FOV, narrow-FOV, stereo-FOV. Repeated-measures ANOVA: significant (p = .009). Bonferroni (significant pairs only): stereo & wide. Paired-samples (significant pairs only): narrow & wide; wide & stereo.
- Overall performance (means from high to low): wide-FOV, narrow-FOV, stereo-FOV. Repeated-measures ANOVA: significant (p = .017). Bonferroni (significant pairs only): narrow & wide. Paired-samples (significant pairs only): narrow & wide; wide & stereo.
- Sense of presence (means from high to low): wide-FOV, stereo-FOV, narrow-FOV. Repeated-measures ANOVA: significant (p < .001). Bonferroni (significant pairs only): narrow & wide; narrow & stereo; stereo & wide. Paired-samples (significant pairs only): narrow & wide; narrow & stereo; stereo & wide.

The test of normality suggested that some of the datasets were not normally distributed, in particular, wall collisions, time spent, and general presence
data (pages 77, 79, and 83). However, with a small sample size, it is possible for a normality test to indicate a non-normal distribution. Outliers also play an important part in affecting normality test results. In general, the data was normally distributed in most datasets.
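The normality checks referred to above can be reproduced outside SPSS. The following is a minimal sketch with invented sample data; note that SPSS's Kolmogorov-Smirnov test applies the Lilliefors correction, whereas SciPy's plain one-sample K-S test against a fitted normal is shown here.

```python
# Sketch of per-condition normality checks (Shapiro-Wilk and an
# uncorrected one-sample Kolmogorov-Smirnov test). The presence
# ratings below are invented for illustration.
import statistics
from scipy import stats

samples = {
    "narrow-FOV": [-1.2, -0.5, 0.1, -0.8, 0.4, -0.3, -0.6, 0.2],
    "stereo-FOV": [0.3, 0.9, -0.2, 0.7, 1.1, 0.4, 0.0, 0.6],
    "wide-FOV":   [1.4, 0.8, 1.9, 0.2, 1.1, 0.7, 1.5, 0.9],
}

results = {}
for cond, xs in samples.items():
    w, p_sw = stats.shapiro(xs)                       # Shapiro-Wilk
    m, s = statistics.mean(xs), statistics.stdev(xs)
    d, p_ks = stats.kstest(xs, "norm", args=(m, s))   # uncorrected K-S
    results[cond] = {"shapiro_p": p_sw, "ks_p": p_ks}
    print(f"{cond}: Shapiro-Wilk p = {p_sw:.3f}, K-S p = {p_ks:.3f}")
```

With samples this small, both tests have little power to reject normality, which mirrors the caveat in the text about small sample sizes and outliers.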

In terms of path collisions, the study hypothesized that 1) the number of path collisions in the wide-FOV would be lower than in the narrow-FOV (H1); and 2) the number of path collisions in the stereo-FOV would be lower than in the narrow-FOV (H2). Although the results indicate that subjects made fewer collisions in the wide-FOV and stereo-FOV conditions than in the narrow-FOV, the analysis test showed no significant results. Perhaps a larger sample would have shown a significant result. In response to the question (With which system did you think you made fewer collisions with path boundaries?), subjects thought they made fewer collisions with the wide and stereo FOVs than with the narrow-FOV, but the responses were fairly evenly balanced. Moreover, the representations of the path collisions (Figure 6.3, page 76) indicate that the number of path collisions seems to be reduced at some corners when the wide and stereo FOVs are used. Unlike the path collision results, the variance in wall collision results between conditions was more noticeable. This was clearly due to the different FOVs: the path is almost always in the user's FOV regardless of which condition is being used, while walls are more towards the periphery of the driver's central view, so the condition has an immediate effect on what can be seen. Generally, subjects made fewer wall collisions with the wide and stereo FOVs than with the narrow-FOV. This was also supported by the users' responses when they were asked, With which system did you think you made fewer collisions with the walls?, but the analysis test reports no significant difference between the means.
As a result, the hypotheses related to wall collisions, H3 (the number of wall collisions in the wide-FOV will be lower than the number of wall collisions in the narrow-FOV) and H4 (the number of wall collisions in the stereo-FOV will be lower than the number of wall collisions in the narrow-FOV), are not supported. When looking at task completion time, subjects completed the task significantly faster in the wide-FOV than in the narrow-FOV. This supports H5, that the time spent to accomplish the task in the wide-FOV will be lower than in the narrow-FOV. H6, on the other hand, that the time spent to accomplish the task in the stereo-FOV will be lower than in the narrow-FOV, was not supported. In fact, subjects completed the task faster in the narrow-FOV than in the stereo-FOV. A "wow effect" confounding variable could potentially have affected the result here: the fact that subjects spent more time in the stereo-FOV condition may not have been caused by the display type, but rather that participants were enjoying the system more

and wanted to spend more time with it. This was also observed: the expression "wow" could be heard when some subjects first tried the stereo system. The overall driving performance results show that the wide-FOV score was significantly higher than the score for the narrow-FOV. Subjects also showed better performance in the narrow-FOV than in the stereo-FOV. Because the performance score was calculated from path and wall collisions and time spent, the result could also have been affected by the time-spent results. Two hypotheses were defined for overall user driving performance: H7) the performance score in the wide-FOV will be higher than in the narrow-FOV; and H8) the performance score in the stereo-FOV will be higher than in the narrow-FOV. The former was supported while the latter was not; the stereo-FOV score was even lower than the narrow-FOV score. In addition, H7 was also supported by the subjects' perception that their overall driving performance in the wide-FOV was better than in the narrow-FOV. The last two hypotheses concerned the users' sense of presence: H9) the sense of presence in the wide-FOV will be higher than in the narrow-FOV; and H10) the sense of presence in the stereo-FOV will be higher than in the narrow-FOV. Both of these hypotheses were supported. The detailed analysis of the sense of presence factors also showed that the wide and stereo FOVs significantly increased users' sense of presence in terms of involvement, spatial presence, and realism. In fact, none of the subjects chose the narrow-FOV when they were asked which system they felt more involved with, which was predictable. Some participants experienced a degree of simulator sickness.
This was predicted: it is to be expected that the more immersive the VE, the more likely an individual is to feel simulator sickness symptoms. The results showed that the simulator sickness score for the wide-FOV was higher than for the other conditions. This finding is completely at odds with the subjects' answers to Question 5 in the comparative questionnaire, where 70% of subjects chose the wide-FOV as the most comfortable system. However, on re-examining the data, it was found that seven of the nine participants who experienced simulator sickness did not choose the wide-FOV as the most comfortable, while the remaining 17 participants did select the wide-FOV. Further, an unexpected result was that the subjects who felt sick while using the wide-FOV system also experienced severe disorientation.
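The text notes that the overall performance score was a calculation combining path collisions, wall collisions, and time spent, without spelling out the formula here. One hypothetical way to build such a composite is to standardize each measure and average the negated z-scores, so that fewer collisions and less time yield a higher score:

```python
# Hypothetical composite performance score: standardize each measure
# across subjects and average the negated z-scores (lower raw values,
# i.e. better driving, give a higher score). This formula is an
# illustration, not the thesis's actual calculation.
import statistics

def performance_scores(path_collisions, wall_collisions, times):
    def z(values):
        m, s = statistics.mean(values), statistics.stdev(values)
        return [(v - m) / s for v in values]
    zp, zw, zt = z(path_collisions), z(wall_collisions), z(times)
    return [-(p + w + t) / 3 for p, w, t in zip(zp, zw, zt)]

# invented data for three subjects: subject 0 drives best on every measure
scores = performance_scores([1, 2, 3], [0, 2, 4], [60, 80, 100])
```

A z-score composite has the convenient property that the three measures contribute equally despite having different units (counts versus seconds).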

Generally, it was found that participants performed better in the wide-FOV, but with a high chance of feeling sick. It is difficult to tell whether stereoscopic vision had a positive effect on user performance, and perhaps a larger sample is needed to find this out. As indicated in the results section, the majority of subjects preferred the wide-FOV over the narrow and stereo FOVs. The results of this study may provide some guidelines for the design of PWC simulators or other virtual environment systems. However, this study has been subject to some limitations, which will be discussed along with the conclusion and outline of future work in the next chapter. Finally, a summary of the hypothesis tests is presented in the following table (Table 6.20).

Table 6.20: Summary of the assessment of research hypotheses
- H1: The number of path collisions in the wide-FOV will be lower than in the narrow-FOV. Not supported.
- H2: The number of path collisions in the stereo-FOV will be lower than in the narrow-FOV. Not supported.
- H3: The number of wall collisions in the wide-FOV will be lower than in the narrow-FOV. Not supported.
- H4: The number of wall collisions in the stereo-FOV will be lower than in the narrow-FOV. Not supported.
- H5: The time spent to accomplish the task in the wide-FOV will be lower than in the narrow-FOV. Supported.
- H6: The time spent to accomplish the task in the stereo-FOV will be lower than in the narrow-FOV. Not supported.
- H7: The performance score in the wide-FOV will be higher than in the narrow-FOV. Supported.
- H8: The performance score in the stereo-FOV will be higher than in the narrow-FOV. Not supported.
- H9: The user sense of presence in the wide-FOV will be higher than in the narrow-FOV. Supported.
- H10: The user sense of presence in the stereo-FOV will be higher than in the narrow-FOV. Supported.

Chapter 7

7 Conclusion and Future Work

This chapter summarizes the thesis project. It discusses the conclusions arrived at, the contribution of the study towards better power wheelchair simulation, and potential future research.

7.1 Conclusion

Previous research has shown that PWC simulators hold great promise in the assessment and/or training of disabled persons. But the review of related work, and the findings of the heuristic evaluation of the only commercial PWC simulator on the market, have shown that current PWC simulators 1) are rather simple, in particular lacking correct physics simulation; 2) do not support peripheral vision; and 3) are not suitable as a standard assessment tool. This research took these limitations into account and identified possible factors that influence users' driving performance and sense of presence in such a system. It addresses the central research question: Do peripheral vision and/or stereoscopic vision have an influence on PWC driving performance? To answer this question, standard display viewing, peripheral vision, and stereoscopic 3D viewing were treated as independent variables and the performance of users was compared across the three conditions. Ten hypotheses were formulated to answer the research question. In general, the hypotheses claimed that the wide field of view and stereoscopic displays would improve users' driving performance and increase their sense of presence relative to a standard display (monoscopic narrow field of view). Three different versions of a PWC simulator were developed and used to evaluate users' driving performance in a comparative experimental design. The research question that formed the basis of this study could not be completely answered. The results have shown that the subjects performed best, on average, with the wide field of view.
They had fewer collisions with the walls and path boundaries, were significantly faster, scored significantly higher in overall driving performance, and reported a significantly increased sense of presence, in comparison with the other conditions. The wide field of view was also the most preferred and was perceived as the most

suitable for the navigation task. Offsetting this, there was a much higher likelihood that subjects would experience symptoms of simulator sickness in the wide field of view condition. The study has also shown that subjects performed better in some respects with the stereoscopic display than with the monoscopic narrow-FOV condition: they had fewer collisions with walls and path boundaries and a significantly higher sense of presence, although their task completion time was slower. However, the differences between the stereo and narrow field of view conditions were not great in terms of subjects' driving performance, preferences, or simulator sickness scores. The findings identified here can assist in the building of better simulators, in particular PWC simulators. In general, though, it seems that if simulator sickness is a concern, the wide field of view may not be an ideal environment, with a narrower field of view being preferable; a stereoscopic narrow field of view would perhaps also increase the users' sense of presence. But if user preference is the main concern, as with games and navigation systems, then the wide field of view is definitely the target. The present study also makes several noteworthy contributions to the realm of PWC simulators. First, it made use of the simulator data to assess individual driving performance. Second, it made use of some of the data to visualize user performance, such as the locations of path collisions and user trajectories. Third, the navigation task chosen for the simulator proved to be easy to follow and to represent reality well. The framework built for this study also proved to be easy to use, cost-effective, and adjustable, and could be used for other investigations.
The following conclusion can be drawn from the present study: the wide field of view could be a promising interface for complementing the training or assessment of PWC users, but simulator sickness needs to be carefully considered. The outcome of this research should not only benefit PWC simulator designers and developers, but also contribute to other fields of study concerned with visualization and interaction, such as simulation systems, game design, navigation systems, and computer graphics. The results for user performance, sense of presence, perception, and display preference should assist in selecting the right display for the target users. The study findings might also be applicable to other kinds of vehicle simulators, such as virtual cars, airplanes, vessels, and bicycles.

Finally, a number of limitations need to be acknowledged. First, the research was limited by the technology available to the Department of Information Science at the University of Otago. The most important limitation lies in the fact that the research sample was not made up of PWC user groups, but rather of able-bodied subjects. A further limitation was the sample size, the number of participants being relatively small. Comments by several subjects that the display was too bright and that the colour needed improvement pointed to a limitation of the material used to display the projected video (laminated plastic), which may have had an impact on both user performance and sense of presence. The lack of sound in the simulator, such as PWC sound, was also reported as a limitation by several subjects. A condition that was not addressed in this study is whether a combination of a wide field of view and stereoscopic vision in the centre would also influence user driving performance and sense of presence. This combined condition may be an area for further research.

7.2 Future Work

This research has thrown up many questions that necessitate further investigation. This study investigated two main factors that influence driving performance in a PWC simulator, these being field of view and display type, but further research might explore other factors. User driving performance could be looked at from different perspectives: for example, different virtual environments, such as indoor and outdoor; different PWC models, such as front-, mid-, and rear-wheel drive; or different navigation tasks. These areas would establish a base for a better understanding of PWC simulators. In terms of simulator design, more research into physical simulation, interaction, and visualization is important; it is in fact the first step towards reliable assessment and training simulation.
In fact, these two aspects cannot be separated, and it would seem that in order to assess an individual's ability to drive a PWC, training must come first. However, determining when a person is ready to be assessed is an interesting research area in itself. Do they receive training before assessment takes place? Or does the assessment procedure have different stages, with no need for prior training? This, of course, assumes we already have the right scale or method to evaluate an individual; in fact, there is no standard assessment tool, not even a standard traditional tool to assess individual ability, and assessment is usually based on observation and guesswork. Therefore, further work needs to be done to establish

a standardised assessment tool. This can take different directions. One is to review all the current traditional assessment tools and investigate what can and cannot be implemented in a PWC simulator. Perhaps new scales of measurement can be introduced and experimentally tested, since the simulator can provide additional information that is not available from real-life observation and measurement. Moreover, there could be additional data that can be extracted from the simulator, such as joystick movement. The data recorded from the simulator and later presented in visual form, such as path collision locations and user trajectories, as shown in this research, could also be used by therapists to strengthen their educated guesses, or to support the simulator assessment results. It could also be used to assess environment accessibility and whether specific dimensions are wheelchair accessible, allowing for earlier assessment in the design of new buildings.
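As an illustration of the kind of simulator log such tools could build on, here is a hypothetical sketch of per-frame trajectory recording and a small summary pass a therapist's tool might run over it. The field names and the collision-event convention are invented.

```python
# Hypothetical per-frame simulator log and a summary pass over it.
# Each record is (time_s, x_m, y_m, event), where event is "" for a
# normal frame or "wall"/"path" when a collision was detected.
from collections import Counter

log = [
    (0.0, 0.0, 0.0, ""),
    (0.5, 0.3, 0.0, ""),
    (1.0, 0.6, 0.1, "path"),
    (1.5, 0.9, 0.1, ""),
    (2.0, 1.1, 0.3, "wall"),
]

def summarize(log):
    events = Counter(e for *_, e in log if e)        # collision tallies
    duration = log[-1][0] - log[0][0]                # elapsed time
    path_len = sum(                                  # travelled distance
        ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        for (_, x1, y1, _), (_, x2, y2, _) in zip(log, log[1:])
    )
    return {"duration_s": duration,
            "distance_m": round(path_len, 2),
            "collisions": dict(events)}

summary = summarize(log)
print(summary)
```

The same log could feed the trajectory and collision-location visualizations mentioned above, or be extended with joystick deflection per frame.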

111 References [1] Americans With Disabilities Act Of 1990, As Amended. [Online]. Available: [Accessed: 24-Oct-2012]. [2] T. Pithon, T. Weiss, S. Richir, and E. Klinger, Wheelchair simulators: A review, Technology and Disability, vol. 21, no. 1, pp. 1 10, Jan [3] J. E. Swan, D. Stredney, W. Carlson, and B. Blostein, The Determination of Wheelchair User Proficiency and Environmental Accessibility through Virtual Simulation, in Proceedings of the 2nd Annual International Conference: Virtual Reality and Persons with Disabilities, California, 1994, pp [4] P. Abellard, I. Randria, A. Abellard, M. M. Ben Khelifa, and P. Ramanantsizehe, Electric Wheelchair Navigation Simulators: why, when, how?, in Mechatronic Systems Applications, A. M. D. Di Paola and G. Cicirelli, Eds. InTech, [5] P. S. Archambault, S. Tremblay, S. Cachecho, F. Routhier, and P. Boissy, Driving performance in a power wheelchair simulator, Disabil Rehabil Assist Technol, vol. 7, no. 3, pp , May [6] A. Harrison, G. Derwent, A. Enticknap, F. D. Rose, and E. A. Attree, The role of virtual reality technology in the assessment and training of inexperienced powered wheelchair users, Disability & Rehabilitation, vol. 24, no , pp , Jan [7] H. Niniss and T. Inoue, Electric Wheelchair Simulator for Rehabilitation of Persons with Motor Disability, National Rehabilitation Center for Persons with Disabilities 4-1, Namiki, Tokorozawa, JAPAN, [8] R. Ball and C. North, The effects of peripheral vision and physical navigation on large scale visualization, in Proceedings of Graphics Interface 2008, Toronto, Ont., Canada, Canada, 2008, pp [9] M. Czerwinski, D. S. Tan, and G. G. Robertson, Women take a wider view, in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 2002, pp [10] T. Schubert, F. Friedmann, and H. Regenbrecht, The Experience of Presence: Factor Analytic Insights, Presence: Teleoper. Virtual Environ., vol. 10, no. 3, pp , Jun [11] M. Herrlich, R. Meyer, R. 
Malaka, and H. Heck, Development of a Virtual Electric Wheelchair Simulation and Assessment of Physical Fidelity Using the Unreal Engine 3, in Entertainment Computing ICEC 2010, vol. 6243, Springer Berlin / Heidelberg, 2010, pp [12] P. M. Grant, C. S. Harrison, and B. A. Conway, Wheelchair simulation, in Designing a more inclusive world, Cambridge workshops on universal access and assistive technology:, 2004, pp [13] D. A. Bowman, E. Kruijff, J. J. L. Jr, and I. Poupyrev, 3D User Interfaces: Theory and Practice, 1 st ed. Addison-Wesley Professional,

112 [14] R. E. Shannon, Introduction to the art and science of simulation, in Proceedings of the 30 th conference on Winter simulation, Los Alamitos, CA, USA, 1998, pp [15] D. Browning, C. Cruz-Neira, D. Sandin, and T. DeFanti, Projection-Based Virtual Environments and Disability, in Proceedings of the First Annual International Conference: Virtual Reality and People with Disabilities, [16] H. McLellan, Virtual realities, in The handbook of research for educational communications and technology, New York: Simon and Schuster MacMillan, 1996, pp [17] C. E. Loeffler and T. Anderson, What is Virtual Reality?, in The Virtual reality casebook, New York: Van Nostrand, 1994, p. xii xxv. [18] H. Regenbrecht and T. Schubert, Real and Illusory Interactions Enhance Presence in Virtual Environments, Presence: Teleoperators and Virtual Environments, vol. 11, no. 4, pp , Aug [19] B. Woolley, SIMULATION, in Virtual Worlds: A Journey in Hype and HyperReality, Benjamin Woolley, 1993, pp [20] M. Slater, V. Linakis, M. Usoh, R. Kooper, and G. Street, Immersion, Presence, and Performance in Virtual Environments: An Experiment with Tri-Dimensional Chess, in ACM Virtual Reality Software and Technology (VRST, 1996, pp [21] B. G. Witmer and M. J. Singer, Measuring Presence in Virtual Environments: A Presence Questionnaire, Presence: Teleoper. Virtual Environ., vol. 7, no. 3, pp , Jun [22] M. J. Schuemie, P. van der Straaten, M. Krijn, and C. A. van der Mast, Research on presence in virtual reality: a survey, Cyberpsychol Behav, vol. 4, no. 2, pp , Apr [23] H. T. Regenbrecht, T. W. Schubert, and F. Friedmann, Measuring the Sense of Presence and its Relations to Fear of Heights in Virtual Environments, International Journal of Human-Computer Interaction, vol. 10, no. 3, pp , [24] igroup presence questionnaire (IPQ) overview. [Online]. Available: [Accessed: 17-Oct-2012]. [25] T. 
Ogle, The Effects of Virtual Environments on Recall in Participants of Differing Levels of Field Dependence, Virginia Polytechnic and State University, Blacksburg, Virginia, [26] J. Kjeldskov, Combining interaction techniques and display types for virtual reality, presented at the OZCHI: Annual Conference of the Australian Computer- Human Interaction Special Interest Group, Australia, 2001, pp [27] H. Gross, F. Blechinger, and B. Achtner, Human Eye, in Handbook of Optical Systems: Survey of Optical Instruments, vol. 4, Wiley VCH, 2008, pp

113 [28] E. B. Goldstein, Separate Visual Systems for Action and Perception, in Sensation and Perception, Blackwell, 2005, pp [29] Wheelchair, Wikipedia, the free encyclopedia, 24-Nov [Online]. Available: [Accessed: 30-Nov-2012]. [30] Manual v. Power Wheelchairs. [Online]. Available: [Accessed: 10-Oct-2012]. [31] A Guide to Electric Wheelchairs. [Online]. Available: [Accessed: 10- Oct-2012]. [32] C. N. Pronk, P. C. de Klerk, A. Schouten, J. L. Grashuis, R. Niesing, and B. D. Bangma, Electric wheelchair simulator as a man-machine system, Scand J Rehabil Med, vol. 12, no. 3, pp , [33] R. A. Cooper, D. Ding, R. Simpson, S. G. Fitzgerald, D. M. Spaeth, S. Guo, A. M. Koontz, R. Cooper, J. Kim, and M. L. Boninger, Virtual reality and computerenhanced training applied to wheeled mobility: an overview of work in Pittsburgh, Assist Technol, vol. 17, no. 2, pp , [34] M. Desbonnet, S. Cox, and A. Rahman, Development and evaluation of a virtual reality based training system for disabled children, presented at the The 2 nd European Conference on Disability, Virtual Reality and Associated Technologies, Skövde, Sweden, 1998, pp [35] A. Hasdai, A. S. Jessel, and P. L. Weiss, Use of a computer simulator for training children with disabilities in the operation of a powered wheelchair, Am J Occup Ther, vol. 52, no. 3, pp , Mar [36] I. Adelola, S. Cox, and Rahman, Adaptable virtual reality interface for powered wheelchair training of disabled children, in Proceedings of The Fourth International Conference on Disability, Virtual Reality and Associated Technologies, Veszprém, Hungary, 2002, pp [37] D. P. Inman, K. Loge, A. Cram, and M. Peterson, Learning To Drive a Wheelchair in Virtual Reality, Journal of Special Education Technology, vol. 26, no. 3, pp , [38] D. Browning, C. Cruz-Neira, D. Sandin, T. DeFanti, and J. G. Edel, Input Interfacing to the CAVE by Persons with Disabilities, presented at the CSUN s Annual International Conference, Carolina State University, [39] I. 
Randria, A. Abellard, M. Ben Khelifa, P. Abellard, and P. Ramanantsizehena, "Evaluation of Trajectory Applied to Collaborative Rehabilitation for a Wheelchair Driving Simulator," in 4th European Conference of the International Federation for Medical and Biological Engineering, vol. 22, J. Sloten, P. Verdonck, M. Nyssen, J. Haueisen, and R. Magjarevic, Eds. Springer Berlin Heidelberg, 2009.

[40] H. Niniss and A. Nadif, "Simulation of the behaviour of a powered wheelchair using virtual reality," in The 3rd International Conference on Disability, Virtual Reality and Associated Technologies, Sardinia, Italy, 2000.

[41] "Methods of virtual reality," Wikipedia, the free encyclopedia, 22-Nov. [Online]. [Accessed: 30-Nov-2012].

[42] C. Harrison, P. Dall, P. Grant, M. Granat, T. Maver, and C. Bernard, "Development of a wheelchair virtual reality platform for use in evaluating wheelchair access," in Proceedings of the Third International Conference on Disability, Alghero, Italy.

[43] D. Bowman, J. L. Gabbard, and D. Hix, "A Survey of Usability Evaluation in Virtual Environments: Classification and Comparison of Methods," Presence: Teleoperators and Virtual Environments, 2002.

[44] A. Sutcliffe and B. Gault, "Heuristic evaluation of virtual reality applications," Interacting with Computers, vol. 16, no. 4, Aug.

[45] A. Sutcliffe, B. Gault, and J.-E. Shin, "Presence, memory and interaction in virtual environments," Int. J. Hum.-Comput. Stud., vol. 62, no. 3, Mar.

[46] I. Connell, "Full Principles Set," set of 30 usability evaluation principles compiled by the author from the HCI literature. [Online]. [Accessed: 02-Jun-2012].

[47] J. Nielsen and R. Molich, "Heuristic evaluation of user interfaces," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA, 1990.

[48] WheelSim, LifeTool, v1.2, ISBN, software on CD.

[49] J. Desmyter, S. Garvin, P. Lefèbvre, F. Stirano, and A. Vaturi, "T1.4 A review of safety, security, accessibility and positive stimulation indicators," in Seventh Framework Programme.

[50] G. Pires, N. Honorio, C. Lopes, U. Nunes, and A. T. Almeida, "Autonomous wheelchair for disabled people," in Proceedings of the IEEE International Symposium on Industrial Electronics, ISIE '97, 1997, vol. 3.

[51] S. Bruck and P. A. Watters, "Estimating Cybersickness of Simulated Motion Using the Simulator Sickness Questionnaire (SSQ): A Controlled Study," in Sixth International Conference on Computer Graphics, Imaging and Visualization, CGIV '09, 2009.

[52] D. M. Johnson, "Simulator Sickness Research Summary," U.S. Army Research Institute for the Behavioral and Social Sciences, Ft. Rucker, Alabama, USA.

[53] R. S. Kennedy, N. E. Lane, K. S. Berbaum, and M. G. Lilienthal, "Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness," The International Journal of Aviation Psychology, vol. 3, no. 3.

[54] W. A. IJsselsteijn, H. de Ridder, J. Freeman, and S. E. Avons, "Presence: concept, determinants, and measurement," in Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, 2000, vol. 3959.

[55] J. Pallant, SPSS Survival Manual. Maidenhead: McGraw Hill.

[56] A. P. Field, Discovering Statistics Using SPSS. Los Angeles, Calif.; London: SAGE.

Appendix A: Heuristic Evaluation Documents

VE-Heuristics (one page, 107)
Heuristic Evaluation Task (one page, 108)

VE-Heuristics

The descriptions are derived from other studies; the reference for each description is noted at the end of each statement.

1. Navigation and orientation support: Users should always be able to find out where they are in the VE and return to known, preset positions [45].
2. Responsiveness: System components that are physically moveable by the user should present no resistance. There should be minimum delay in the initiation (as opposed to the processing time) of system processes [46].
3. Realistic feedback: The effect of the user's actions on virtual-world objects should be immediately visible and conform to the laws of physics and the user's perceptual expectations [45].
4. Flexibility: In mixed-input systems (e.g. allowing combinations of mouse, keyboard and joystick), it should be possible to perform most operations using more than one input mode [46]. Accelerators may speed up the interaction for the expert user, so that the system can cater to both inexperienced and experienced users [47].
5. Natural engagement: Interaction should approach the user's expectation of interaction in the real world as far as possible. Ideally, the user should be unaware that the reality is virtual [44].
6. Compatibility with the user's task and domain: The VE and the behaviour of objects should correspond as closely as possible to the user's expectations of real-world objects, their behaviour, and their affordances for task action [44].
7. Perceptual clarity: All graphical objects should be both discernible and distinguishable from other objects [46].
8. Help users recognize and recover from collisions: Users should be able to distinguish between colliding with and merely being close to objects in the virtual environment (adapted from [47]).
9. Sense of presence: The user should feel part of, and a sense of belonging to, the virtual environment.
10. Faithful viewpoints and field of view: The visual representation of the virtual world should be close to the user's normal vision [45], and viewpoint changes driven by user input should be rendered without delay [44].

Heuristic Evaluation Task

Introduction

WheelSim is a power wheelchair (PWC) simulator designed to help disabled persons learn to operate a PWC using a standard joystick (keyboard arrow keys and the mouse can also be used to navigate through the environment). A heuristic evaluation will be conducted using a list of 10 heuristics as a guideline. The main goal of this evaluation is to find flaws; the result should be a list of problems, each matching one or more of the 10 heuristics. Each finding should address the following dimensions: heuristic violated, severity, location, usability issue, and recommendation. The following example illustrates a usability issue finding.

Example finding:
Heuristic violated: 4
Severity (1 = high, 2 = medium, 3 = low): 1
Location (screen name): Wheelchair simulation, levels A, B, C and D
Issue description: Hard to drive the wheelchair with the mouse
Recommendation: It is unrealistic to drive the wheelchair through mouse movement; the mouse should instead be used to operate other functions such as the user interface.

Task

1. Read the 10 heuristics and the severity rating; you can refer to them as many times as you like while evaluating the system.
2. Evaluate the simulation from the user's (PWC driver's) point of view.
3. Remember that the aim of this evaluation is to find flaws.
4. Perform the entire evaluation against all of the following tasks:
   - Run the WheelSim application and try to get an overall impression.
   - Choose level A, navigate through the environment using the joystick as the input device, and consider these principles: 1) navigation and orientation, 2) responsiveness, 3) realistic feedback.
   - Try other input devices such as the keyboard arrow keys and mouse movement while considering the flexibility and natural engagement principles.
   - Change the level from A to B, and try to move along and stay between the yellow guidelines while considering 1) compatibility with the user's task and domain, 2) perceptual clarity.
   - Now drive outside the yellow lines and collide with different objects while considering 1) help users recognize and recover from collisions, 2) sense of presence.
   - Move to level C and try to drive between the guidelines while changing the speed of the wheelchair (press the fire button on the joystick), considering these principles: 1) faithful viewpoints and field of view, 2) flexibility.
   - Close level C, choose level D, and choose your own task while considering any principle.
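The five dimensions an evaluator records per finding map naturally onto a simple record structure. The sketch below is purely illustrative and not part of the original evaluation materials; the class and function names are hypothetical:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Finding:
    """One usability issue recorded during the heuristic evaluation."""
    heuristic_violated: int   # index into the 10 VE-heuristics (1-10)
    severity: int             # 1 = high, 2 = medium, 3 = low
    location: str             # screen/level where the issue was observed
    issue: str                # short description of the usability problem
    recommendation: str       # suggested fix

def most_severe_first(findings: List[Finding]) -> List[Finding]:
    """Order findings so high-severity issues (lowest number) come first."""
    return sorted(findings, key=lambda f: f.severity)

# Example record, mirroring the sample finding in the task sheet.
example = Finding(
    heuristic_violated=4,
    severity=1,
    location="Wheelchair simulation, levels A-D",
    issue="Hard to drive the wheelchair with the mouse",
    recommendation="Reserve the mouse for UI functions rather than driving",
)
```

Sorting by severity is one simple way to prioritize the resulting problem list before writing up recommendations.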

Appendix B: Experiment Documents

This appendix contains all documents related to the experiment, including:
- Ethical approval (two pages)
- Instructions for the experimenter (three pages)
- Participant information sheet (one page, 115)
- Consent form (one page, 116)
- Demographic questionnaire (one page, 117)
- Task description (one page, 118)
- Session self-report sheet (one page, 119)
- Sense of presence and sickness questionnaire (three pages)
- Comparative questionnaire (one page, 123)


More information

ENHANCED HUMAN-AGENT INTERACTION: AUGMENTING INTERACTION MODELS WITH EMBODIED AGENTS BY SERAFIN BENTO. MASTER OF SCIENCE in INFORMATION SYSTEMS

ENHANCED HUMAN-AGENT INTERACTION: AUGMENTING INTERACTION MODELS WITH EMBODIED AGENTS BY SERAFIN BENTO. MASTER OF SCIENCE in INFORMATION SYSTEMS BY SERAFIN BENTO MASTER OF SCIENCE in INFORMATION SYSTEMS Edmonton, Alberta September, 2015 ABSTRACT The popularity of software agents demands for more comprehensive HAI design processes. The outcome of

More information

Lab 7: Introduction to Webots and Sensor Modeling

Lab 7: Introduction to Webots and Sensor Modeling Lab 7: Introduction to Webots and Sensor Modeling This laboratory requires the following software: Webots simulator C development tools (gcc, make, etc.) The laboratory duration is approximately two hours.

More information

Waves Nx VIRTUAL REALITY AUDIO

Waves Nx VIRTUAL REALITY AUDIO Waves Nx VIRTUAL REALITY AUDIO WAVES VIRTUAL REALITY AUDIO THE FUTURE OF AUDIO REPRODUCTION AND CREATION Today s entertainment is on a mission to recreate the real world. Just as VR makes us feel like

More information

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc.

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc. Human Vision and Human-Computer Interaction Much content from Jeff Johnson, UI Wizards, Inc. are these guidelines grounded in perceptual psychology and how can we apply them intelligently? Mach bands:

More information

Optical Marionette: Graphical Manipulation of Human s Walking Direction

Optical Marionette: Graphical Manipulation of Human s Walking Direction Optical Marionette: Graphical Manipulation of Human s Walking Direction Akira Ishii, Ippei Suzuki, Shinji Sakamoto, Keita Kanai Kazuki Takazawa, Hiraku Doi, Yoichi Ochiai (Digital Nature Group, University

More information

Advancements in Gesture Recognition Technology

Advancements in Gesture Recognition Technology IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka

More information

CSE 190: 3D User Interaction

CSE 190: 3D User Interaction Winter 2013 CSE 190: 3D User Interaction Lecture #4: Displays Jürgen P. Schulze, Ph.D. CSE190 3DUI - Winter 2013 Announcements TA: Sidarth Vijay, available immediately Office/lab hours: tbd, check web

More information

REPORT ON THE CURRENT STATE OF FOR DESIGN. XL: Experiments in Landscape and Urbanism

REPORT ON THE CURRENT STATE OF FOR DESIGN. XL: Experiments in Landscape and Urbanism REPORT ON THE CURRENT STATE OF FOR DESIGN XL: Experiments in Landscape and Urbanism This report was produced by XL: Experiments in Landscape and Urbanism, SWA Group s innovation lab. It began as an internal

More information

CSE 165: 3D User Interaction. Lecture #14: 3D UI Design

CSE 165: 3D User Interaction. Lecture #14: 3D UI Design CSE 165: 3D User Interaction Lecture #14: 3D UI Design 2 Announcements Homework 3 due tomorrow 2pm Monday: midterm discussion Next Thursday: midterm exam 3D UI Design Strategies 3 4 Thus far 3DUI hardware

More information

Oculus Rift Getting Started Guide

Oculus Rift Getting Started Guide Oculus Rift Getting Started Guide Version 1.7.0 2 Introduction Oculus Rift Copyrights and Trademarks 2017 Oculus VR, LLC. All Rights Reserved. OCULUS VR, OCULUS, and RIFT are trademarks of Oculus VR, LLC.

More information

House Design Tutorial

House Design Tutorial House Design Tutorial This House Design Tutorial shows you how to get started on a design project. The tutorials that follow continue with the same plan. When you are finished, you will have created a

More information

INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY

INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY T. Panayiotopoulos,, N. Zacharis, S. Vosinakis Department of Computer Science, University of Piraeus, 80 Karaoli & Dimitriou str. 18534 Piraeus, Greece themisp@unipi.gr,

More information

Behavioural Realism as a metric of Presence

Behavioural Realism as a metric of Presence Behavioural Realism as a metric of Presence (1) Jonathan Freeman jfreem@essex.ac.uk 01206 873786 01206 873590 (2) Department of Psychology, University of Essex, Wivenhoe Park, Colchester, Essex, CO4 3SQ,

More information

Air-filled type Immersive Projection Display

Air-filled type Immersive Projection Display Air-filled type Immersive Projection Display Wataru HASHIMOTO Faculty of Information Science and Technology, Osaka Institute of Technology, 1-79-1, Kitayama, Hirakata, Osaka 573-0196, Japan whashimo@is.oit.ac.jp

More information

Arbitrating Multimodal Outputs: Using Ambient Displays as Interruptions

Arbitrating Multimodal Outputs: Using Ambient Displays as Interruptions Arbitrating Multimodal Outputs: Using Ambient Displays as Interruptions Ernesto Arroyo MIT Media Laboratory 20 Ames Street E15-313 Cambridge, MA 02139 USA earroyo@media.mit.edu Ted Selker MIT Media Laboratory

More information

One Size Doesn't Fit All Aligning VR Environments to Workflows

One Size Doesn't Fit All Aligning VR Environments to Workflows One Size Doesn't Fit All Aligning VR Environments to Workflows PRESENTATION TITLE DATE GOES HERE By Show of Hands Who frequently uses a VR system? By Show of Hands Immersive System? Head Mounted Display?

More information

Chapter 1 - Introduction

Chapter 1 - Introduction 1 "We all agree that your theory is crazy, but is it crazy enough?" Niels Bohr (1885-1962) Chapter 1 - Introduction Augmented reality (AR) is the registration of projected computer-generated images over

More information

Physical Presence in Virtual Worlds using PhysX

Physical Presence in Virtual Worlds using PhysX Physical Presence in Virtual Worlds using PhysX One of the biggest problems with interactive applications is how to suck the user into the experience, suspending their sense of disbelief so that they are

More information

House Design Tutorial

House Design Tutorial Chapter 2: House Design Tutorial This House Design Tutorial shows you how to get started on a design project. The tutorials that follow continue with the same plan. When we are finished, we will have created

More information

Pedestrian Navigation System Using. Shoe-mounted INS. By Yan Li. A thesis submitted for the degree of Master of Engineering (Research)

Pedestrian Navigation System Using. Shoe-mounted INS. By Yan Li. A thesis submitted for the degree of Master of Engineering (Research) Pedestrian Navigation System Using Shoe-mounted INS By Yan Li A thesis submitted for the degree of Master of Engineering (Research) Faculty of Engineering and Information Technology University of Technology,

More information

HMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University

HMD based VR Service Framework. July Web3D Consortium Kwan-Hee Yoo Chungbuk National University HMD based VR Service Framework July 31 2017 Web3D Consortium Kwan-Hee Yoo Chungbuk National University khyoo@chungbuk.ac.kr What is Virtual Reality? Making an electronic world seem real and interactive

More information

AN ARCHITECTURE-BASED MODEL FOR UNDERGROUND SPACE EVACUATION SIMULATION

AN ARCHITECTURE-BASED MODEL FOR UNDERGROUND SPACE EVACUATION SIMULATION AN ARCHITECTURE-BASED MODEL FOR UNDERGROUND SPACE EVACUATION SIMULATION Chengyu Sun Bauke de Vries College of Architecture and Urban Planning Faculty of Architecture, Building and Planning Tongji University

More information

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation

Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp

More information

Driving Simulators for Commercial Truck Drivers - Humans in the Loop

Driving Simulators for Commercial Truck Drivers - Humans in the Loop University of Iowa Iowa Research Online Driving Assessment Conference 2005 Driving Assessment Conference Jun 29th, 12:00 AM Driving Simulators for Commercial Truck Drivers - Humans in the Loop Talleah

More information