This is the peer reviewed author accepted manuscript (post print) version of a published work that appeared in final form in:


A study in virtual navigation cues for forklift operators

Pereira, Alexandre, Lee, Gun A, Almeida, Edson & Billinghurst, Mark 2016, 'A study in virtual navigation cues for forklift operators', Proceedings of the 18th Symposium on Virtual and Augmented Reality, SVR 2016, article no , pp .

This un-copyedited output may not exactly replicate the final published authoritative version, for which the publisher owns copyright. It is not the copy of record. This output may be used for noncommercial purposes. The final definitive published version (version of record) is available at:

Persistent link to the Research Outputs Repository record:

General rights: Copyright and moral rights for the publications made accessible in the Research Outputs Repository are retained by the authors and/or other copyright owners, and it is a condition of accessing publications that users recognize and abide by the legal requirements associated with these rights. Users may download and print one copy for the purpose of private study or research. You may not further distribute the material or use it for any profit-making activity or commercial gain. You may freely distribute the persistent link identifying the publication in the Research Outputs Repository. If you believe that this document breaches copyright, please contact us providing details, and we will remove access to the work immediately and investigate your claim.

A Study in Virtual Navigation Cues for Forklift Operators

Alexandre Pereira* (IFCE, Fortaleza, Brazil), Gun A. Lee (HIT Lab NZ, University of Canterbury, Christchurch, New Zealand), Mark Billinghurst (School of ITMS, University of South Australia, Adelaide, Australia)

ABSTRACT

Augmented Reality (AR) is a technology that can overlay virtual elements on the real world in real time. This research studies how different AR elements can help forklift operators locate pallets as quickly as possible in a warehouse environment. We developed a simulated AR environment to test Egocentric and Exocentric virtual navigation cues. The virtual elements were displayed to the user either in a HUD (head-up display) on the forklift windshield, fixed in place in front of the operator, or in a HMD (head-mounted display), where the virtual cues are attached to the user's head. A user study found that the Egocentric AR view was preferred over the Exocentric condition and performed better, while the HUD and HMD viewing methods produced no difference in performance.

Keywords: Augmented Reality, logistics, forklift, navigation.

Index Terms: H.5.2 User Interfaces; H.5.1 Multimedia Information Systems - Artificial, augmented, and virtual realities.

1 INTRODUCTION

Augmented Reality (AR) is a technology that aims to seamlessly blend virtual information with the real world [1]. Researchers have shown that AR can be applied in numerous applications in education, engineering, entertainment, and other domains. For example, doctors can use AR to see medical data inside the patient's body [8] and architects can see unfinished buildings [13]. In this paper, we explore how AR could be used to improve the performance of forklift operators in a warehouse environment. Researchers have previously shown that AR can improve object-picking performance [10], vehicle navigation [6], and training tasks [17].
Based on this research, AR should be able to help forklift operators by assisting with the following:

1. Pallet location: AR cues could visually identify a target pallet location in a warehouse.
2. Navigation: AR cues could overlay directional instructions to guide the driver through the warehouse.
3. Slot location: AR cues could highlight an empty slot in the warehouse where boxes could be placed.

Figure 1 shows the initial concept for how Augmented Reality could be used to accomplish each of these three functions. For example, virtual boxes could be used to highlight pallet pick-up locations or empty slots, while navigation arrows could appear on the ground to guide the operator to a task location.

* pereira.alexandremagno@gmail.com, gun.lee@hitlabnz.org, mark.billinghurst@unisa.edu.au

a) Pallet Location b) Navigation c) Slot Location
Figure 1: Artist's concept of AR use in a warehouse.

2 RELATED WORK

There is relevant previous research on using wearable computers and AR for stock picking, and on AR for vehicle navigation. In this section, we review key work in these areas.

2.1 Order Picking

Driving a forklift around a warehouse and moving goods is part of the order picking process. Depending on the industry, logistics costs amount to 5 to 8% of revenue [12], and order picking accounts for 55% to 65% of the total operational costs of a warehouse [2]. Therefore, reducing picking costs and improving performance could significantly lower warehouse costs. The order picking process consists of two navigation phases: a) finding a path to the right shelf and b) picking an object out of the shelf location [14]. It is estimated that 50% of a picker's time is spent traveling from one storage rack to the next, and 35% locating and picking from the correct bin [11]. As shown in figure 1, AR could be used to support navigation to the shelf location and then to indicate which object on a shelf should be picked.
Improvements in task time for either of these areas could result in significant performance gains.

2.2 AR for Stock Picking

A number of research groups have explored whether AR and wearable technology can benefit stock picking operations. In [4], researchers evaluated order picking assisted by four approaches: head-up display (HUD), cart-mounted display (CMD), pick-by-light, and paper pick list. They found that using the HUD and CMD both produced significantly faster pick times than using either the paper-based system (by 20%) or the pick-by-light method (by 40%). There was also an 80% reduction in errors using the HUD and CMD compared to the other two methods. They report that pick-by-HUD and pick-by-CMD were superior on all metrics to the current practices of pick-by-paper and pick-by-light, but there was no significant difference between the two. In a similar experiment [16], researchers tested a head-mounted display (HMD) based picking chart against a traditional text-based pick list, a paper-based graphical pick chart, and a mobile pick-by-voice system. The HMD condition was significantly faster (by 10% - 40%) than the average time per task of any of the other methods. Iben et al. [5] compared a text-based paper pick list to a pick list rendered on a HMD using additional context information. They found a similar performance benefit, with the HMD users taking an average of 5.3 seconds per object pick while the paper users required 6.3 seconds, and the HMD users making 40% fewer errors than the paper users.

Overall, these results show that using an HMD with a 2D graphical interface can produce a significant improvement in stock picking applications. This is because the bin picking information is always in view and the operator's hands are free to perform the picking task. However, AR interfaces in an HMD may not provide a significant benefit, due to the difficulty of locating target bins on a small field-of-view display and of making the virtual cues appear precisely overlaid on the real world.

2.3 AR for Vehicle Navigation

A final area of related work is using AR cues to improve vehicle navigation. Although most of this research has been conducted on cars or trucks driving outdoors, the lessons learned could be applied to forklifts driving in an indoor warehouse. Research has been done on using the windshield as a HUD for vehicles, specifically to provide car navigation cues. For example, Tonnis et al. [15] created an interface that projects AR cues on the windshield that align with the road surface. Narzt et al. [7] go beyond this to show not only navigation arrows, but also cues highlighting freeway exits, points of interest such as petrol stations, and alerts to potential hazards such as pedestrians. Some research has compared AR to non-AR interfaces for car navigation and safety. For example, Park et al. [9] compared driver performance when using AR HUD cues showing arrows on the road to using 2D icons on the HUD. They found that drivers had a significantly faster response time to lane-changing information when it was shown with the AR cues. Researchers at the University of California conducted a study using the windshield as a display, with AR elements conveying an alert when the driver exceeded the speed limit [3]. They also found that users had a faster reaction time with the HUD than with an in-car display, due to the alert being easily visible in their field of view.
2.4 Summary

In summary, previous research has shown that 2D interfaces on wearable displays can improve performance in stock picking, but AR interfaces on HMDs may not be as effective. However, using a vehicle HUD to show AR cues could improve navigation and responsiveness to alert information. This research implies that virtual cues presented in a HUD in a forklift could improve operator performance. However, there has been no previous work exploring this, and especially none comparing performance with an AR HUD against an AR HMD interface.

3 PROTOTYPE DESIGN

In order to explore the use of AR for forklift operations, we decided to develop a simulated AR interface in an immersive virtual environment. This was because of the difficulty of implementing an AR system on a real forklift, due to technical challenges (e.g. limitations in vehicle tracking, AR displays, etc.), and of getting access to a large group of forklift operators for testing. To develop a simulated AR experience, five key technological components were needed: (1) a simulated AR interface, (2) a 3D warehouse model, (3) support for a display device, (4) support for input devices, and (5) vehicle control code. The simulation was developed in the Unity3D game engine, using 3D models of a warehouse, a forklift, and a virtual human operator purchased from the TurboSquid website (see figure 2).

Figure 2: 3D models used in the simulation.

We used the Unity NGUI plug-in to enable easy development of user interface elements, and the NGUI HUD Text extension to add text to the screen. Work had to be done to add shelving, lighting, and other features to the simulated virtual warehouse. The main goal was to create a simulated AR interface attached to the virtual forklift model in the warehouse environment. This was done by adding a virtual camera view in the Unity3D simulation that was attached to the user's viewpoint and then including AR interface objects in this view.
Figure 3a shows the Exocentric AR view from the operator's position and Figure 3b shows the Egocentric AR view. The key interface elements are:

Warehouse Map: a semitransparent map that shows a view from above the forklift and follows it.
Orientation Arrow: a virtual cue overlaid on the warehouse floor showing the path to a destination.
Orientation Label: indicates the box number to find in the warehouse.

(a) Exocentric Map View (b) Egocentric Arrow View
Figure 3: Simulated AR interfaces.

In the Egocentric view, the AR elements are displayed directly to the subject from a first-person perspective. In the Exocentric view, the virtual navigation element is a map that shows a top-down view from above the forklift, including the arrow. The simulated AR interface was viewed in an Oculus Rift DK1 head-mounted display that provided a 3D view and a fully immersive virtual reality experience. User input was captured from a Thrustmaster Ferrari GT steering wheel and a pair of pedals, and a software component was developed to simulate driving a forklift in response to the user input (see figure 4).

Figure 4: A user operating the simulator.
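The geometry behind the two cue types can be sketched as follows. This is a minimal Python sketch, not the Unity implementation; the function names, parameter names, and the (x, z) ground-plane convention are illustrative assumptions. The egocentric arrow needs the target's bearing relative to the forklift's heading, while the exocentric mini-map only needs the target's offset from the forklift scaled into map space.

```python
import math

def egocentric_arrow_yaw(forklift_pos, forklift_heading_deg, target_pos):
    """Yaw (degrees) for a ground arrow pointing from the forklift toward
    the target, expressed relative to the vehicle's current heading.
    Positions are (x, z) ground-plane coordinates; heading 0 faces +z."""
    dx = target_pos[0] - forklift_pos[0]
    dz = target_pos[1] - forklift_pos[1]
    world_yaw = math.degrees(math.atan2(dx, dz))  # bearing in world frame
    # Wrap the relative angle into (-180, 180] so the arrow turns the short way.
    return (world_yaw - forklift_heading_deg + 180) % 360 - 180

def exocentric_map_point(forklift_pos, target_pos, map_scale=0.01):
    """Project the target into a top-down mini-map centred on the forklift
    (exocentric view); map_scale is an assumed world-to-map factor."""
    return ((target_pos[0] - forklift_pos[0]) * map_scale,
            (target_pos[1] - forklift_pos[1]) * map_scale)
```

The contrast the study describes falls out of this directly: the egocentric cue already encodes the turn the driver must make, whereas the exocentric map leaves that mental rotation to the user.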

4 EXPERIMENTAL EVALUATION

We conducted an experiment to compare different AR elements for navigation inside a warehouse: an AR HUD versus an AR HMD interface in the forklift, and egocentric versus exocentric display of navigation information.

4.1 Experimental Design

In the user study, each subject had to find three boxes inside a warehouse under a different set of conditions, without touching any of the boxes spread on the ground. The dependent variables in this experiment were:

Box Time: the time needed to reach each target box.
Total Time: the sum of the times to reach all boxes.
Collisions: the number of times that the user collided with random boxes spread on the floor.

The subjects completed the task in the following conditions:

HUD: the simulated AR display is attached to the forklift windshield. If the user moves his or her head, the virtual cues stay in the same position (figure 5).
HMD: the AR elements are attached to the head of the subject. If the user rotates his or her head, the virtual cues also rotate (figure 6).
Egocentric: the AR elements are displayed directly to the subject from a first-person view (figures 5 and 6).
Exocentric: the AR elements are shown inside a map with a view from above the forklift (figure 7).

Figure 5: HUD with Egocentric cue condition.
Figure 6: HMD with Egocentric cue condition.
Figure 7: HMD with Exocentric cue condition.

There were four test scenarios in which the three target boxes were placed at different locations in the warehouse, with a similar distance between them. The experimental task was to locate the three boxes in order using the AR navigation cues. With four test scenarios and four interface conditions, there were sixteen combinations of these elements. To avoid learning-effect bias, the order of the conditions for each subject was varied using a Latin squares technique. To start, subjects were given a consent form and a demographics questionnaire.
Next, they were told about the experimental tasks and the differences between the test conditions. Subjects were then placed in the virtual forklift simulator and taught how to drive until they were comfortable with the controls and could find a highlighted box and touch it with the forklift forks. Once the subjects understood how to control the vehicle, they were allowed to begin the experiment. When the experiment started, the subject had to follow the AR navigation cues to locate a highlighted box and touch it with at least one fork of the forklift. When the subject touched the first highlighted box, the AR navigation cues pointed to the second highlighted box. Then, when it was touched by one of the forks, the navigation cues pointed to the third highlighted box. Finally, when the user reached the third box, the time needed to reach each box, the total performance time, and the number of times that the subject hit boxes on the ground were recorded. Each subject did this task four times, once for each AR display condition, each time in a scenario where the target boxes were placed in different locations. After finishing each trial, subjects filled out a questionnaire about the usability and efficiency of the tested condition. When the subjects had finished all four tests, they were asked to rank the conditions and explain why they chose that particular order of preference.

4.2 Results

In the results, the Egocentric condition is referred to by the letter G, the Exocentric condition by X, the HUD condition by U, and the HMD condition by M. Four combinations of conditions were tested in the experiment:

MX: HMD interface with an Exocentric view.
MG: HMD interface with an Egocentric view.
UX: HUD interface with an Exocentric view.
UG: HUD interface with an Egocentric view.
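The Latin-square counterbalancing described in the design can be sketched as follows. This is a minimal Python sketch under the assumption of a standard balanced construction for an even number of conditions; it is not the authors' actual scheduling code, and the function name is illustrative.

```python
def balanced_latin_square(conditions):
    """One condition ordering per row such that each condition appears once
    per row and once per column; for an even number of conditions, each
    condition also directly precedes every other condition equally often."""
    n = len(conditions)
    # First-row index pattern 0, 1, n-1, 2, n-2, ...; later rows shift it mod n.
    pattern = [0] + [(k + 1) // 2 if k % 2 else n - k // 2
                     for k in range(1, n)]
    return [[conditions[(p + r) % n] for p in pattern] for r in range(n)]

orders = balanced_latin_square(["MX", "MG", "UX", "UG"])
# Participant i would run the four conditions in orders[i % 4].
```

With four orderings cycled over the 16 participants, each condition occupies each serial position equally often, which is what counters the learning effect mentioned above.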
There were 16 participants, 8 men and 8 women, with an average age of 23.4 years.

User Performance

From the quantitative performance data we found:
- There was no performance time difference between the AR display conditions (HUD vs. HMD).
- Users navigated significantly faster in the egocentric cue condition than in the exocentric condition.
- There was no significant difference in the number of boxes hit on the ground between the conditions.

Table 1 shows the task completion time in seconds for the subjects to navigate to all three boxes. A two-way repeated measures ANOVA showed no significant main effect of the HUD vs. HMD factor (F(1,15)=0.03, p=.87), but a significant main effect of the Egocentric vs. Exocentric factor (F(1,15)=4.98, p<.05). No significant interaction between the factors was found (F(1,15)=0.07, p=.80). Descriptive statistics show that participants took less time to complete the task in the Egocentric conditions (Mean= secs) than in the Exocentric conditions (Mean= secs).

Table 1: Task completion time.
Condition  Mean Time (s)  Std. Dev.
MX
MG
UX
UG
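For a 2x2 within-subjects design like this one, each one-degree-of-freedom F statistic is the square of a paired t statistic computed on per-subject contrast scores, which makes the analysis easy to sketch. The Python below is an illustrative stdlib-only sketch with made-up data, not the authors' analysis code; the function and factor names are assumptions.

```python
import math
import statistics as st

def f_from_contrast(scores):
    """F(1, n-1) for a one-df repeated-measures effect, via the identity
    F = t**2, where t is a one-sample t on per-subject contrast scores."""
    n = len(scores)
    t = st.mean(scores) / (st.stdev(scores) / math.sqrt(n))
    return t * t

def rm_anova_2x2(mx, mg, ux, ug):
    """Main effects (display: HMD vs HUD; cue: Exocentric vs Egocentric)
    and their interaction, from per-subject times in the four conditions."""
    display, cue, interaction = [], [], []
    for x_m, g_m, x_u, g_u in zip(mx, mg, ux, ug):
        display.append((x_m + g_m) / 2 - (x_u + g_u) / 2)  # HMD minus HUD
        cue.append((x_m + x_u) / 2 - (g_m + g_u) / 2)      # Exo minus Ego
        interaction.append((x_m - g_m) - (x_u - g_u))
    return {name: f_from_contrast(s) for name, s in
            [("display", display), ("cue", cue), ("interaction", interaction)]}
```

Applied to the real data (16 subjects per condition), this layout would reproduce the pattern reported above: a large F for the cue factor and near-zero F values for the display factor and the interaction.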

Table 2 shows the number of times that users collided with boxes on the ground, as an indication of how accurately they were driving. A two-way repeated measures ANOVA showed no significant main effect of the HUD vs. HMD factor (F(1,15)=1.15, p=.30) and no significant main effect of the Egocentric vs. Exocentric factor (F(1,15)=0.01, p=.93). No significant interaction between the factors was found (F(1,15)=0.32, p=.58).

Table 2: Number of hits on boxes on the ground.
Condition  Mean  Std. Dev.
MX
MG
UX
UG

Questionnaire: Rating

After each condition, participants answered a questionnaire with the following ten rating questions on a Likert scale from 1 (Strongly Disagree) to 7 (Strongly Agree):

Q1. It was easy to navigate to the target boxes
Q2. It was easy to learn how to use the virtual cues
Q3. The navigation cues were useful to the task
Q4. The navigation cue was intuitive
Q5. The navigation cue was natural
Q6. The navigation cue was effective
Q7. The navigation cue was mentally stressful
Q8. The navigation cue was physically stressful
Q9. The navigation cue diverted my attention from the boxes on the ground
Q10. I was able to drive the forklift well

Table 3 shows the average results of the rating questions across all the conditions. The first four questions, which concern ease of use, had mean values much higher than the scale midpoint (4.0), indicating that the subjects found that the navigation cues were actually helping them complete the experimental tasks. Questions 7 and 8 ask about the stress of using the interface; their means were much lower than the midpoint, showing that the participants experienced little stress while doing the experiment.

Table 3: Likert scale rating results with the Friedman test p-value.
Quest.  MX  MG  UX  UG  p-val

While the descriptive data suggest that the Egocentric conditions (UG and MG) were rated better on most questions (e.g.
the UG condition was rated best on 9 out of 10 questions), a nonparametric Friedman test conducted for each question found no significant difference.

Questionnaire: Ranking

After participants finished all four conditions, they were asked to rank the conditions in order of preference from 1 (best) to 4 (worst). From the ranking data, the following results were found:
- Users preferred the Egocentric over the Exocentric condition.
- There was no preference difference between the AR display conditions (HUD vs. HMD).

Table 4 shows the mean rank for each of the conditions. A Friedman test indicated a significant difference between the conditions, with a Chi-square value of and p=.006. Descriptive statistics show participants ranked the Egocentric conditions higher (Mean=1.91) than the Exocentric conditions (Mean=3.09).

Table 4: Ranking results.
Condition  Mean Rank
MX  3.00
MG  1.94
UX  3.19
UG  1.88

To compare each pair of conditions, post hoc Wilcoxon signed-rank tests were performed with a Bonferroni correction applied. Table 5 shows the post hoc test results and the associated Z values. The Ego vs. Exo test was significant, and the UX vs. UG comparison also showed a significant difference at the Bonferroni-corrected significance level of p < .008. The other post hoc tests were not significant.

Table 5: Post hoc tests for ranking results.
Test Condition  Z-Value  p-val.
Ego vs. Exo  *
HMD vs. HUD
MX vs. MG
MX vs. UX
MG vs. UG
MG vs. UX
UX vs. UG  *

4.3 Qualitative Feedback

Participants were asked to give the reasons for the rankings they made at the end of the experiment.
One of the main reasons the Egocentric conditions were ranked best was that the virtual arrows on the ground were bigger, clearer, and centered in front of the subjects' view. One subject summarized the predominant impression of the participants: "The arrows on the ground were the best due to being bigger and clearer so easy to follow. The map felt slightly disconnected from the task and took additional mental processing to read." The Exocentric conditions were ranked lower because they required additional mental processing to work out how to reach the highlighted boxes. Users had to mentally map their position and orientation in the virtual map to what they were seeing in front of them. The map view also blocked part of the user's view.

Participants tended to make more errors in the Exocentric conditions. Many times participants entered a wrong aisle in the warehouse because they were concentrating on not hitting boxes on the ground. When they tried to re-orient themselves, they had often already missed the best path and had to go another way. In the Egocentric conditions this did not happen, as they could see the virtual orientation arrows all the time. Between the two Exocentric conditions, the participants preferred the one with the head-mounted display because they could continue to see the map even if they turned their heads. One participant said: "The HMD map, which was moved along with the head was better than a fixed map because I was able to see it even when I was looking somewhere else."

5 DISCUSSION

Overall, the HUD interface with an Egocentric view (UG condition) had the best results. The HMD interface with an Egocentric view (MG condition) also performed well, but slightly below the UG condition. For most of the measures, the HUD interface with an Exocentric view (UX condition) had the worst results in both task completion time and questionnaire results. The HMD interface with an Exocentric view (MX condition) also had poor results, but slightly better than the UX condition. The results clearly show that the Egocentric conditions outperformed the Exocentric conditions in both the quantitative and qualitative data. Many participants said that the Egocentric scenarios were better because they could see the virtual cues clearly and they required less mental effort. Between the Egocentric combinations, the participants preferred the HUD condition. This may be because an AR cue that always stays in front of the subject's point of view, no matter which direction the user is looking, can be a little distracting.
There was no significant difference between the HUD and HMD conditions in either task performance or questionnaire results. This may be related to the fact that users spent most of their time looking straight ahead, making these conditions very similar.

6 CONCLUSIONS AND FUTURE WORK

In this project we developed a simulated AR interface for a forklift operator in a warehouse, performing stock picking and movement tasks. The main lessons learned are that users felt that AR cues helped their performance and that Egocentric cues were more useful than Exocentric cues. We did not see a significant difference between the HMD and HUD display conditions, but this was probably due to the nature of the tasks and is something that could be explored further.

There are a number of ways that this research could be extended in the future. First, adding moving forklifts to the scene would enable exploring how AR could be used to help with situational awareness of the driver's surroundings. Next, it would be good to test the AR simulator with real forklift drivers, and to investigate how AR technology can enhance existing skills or enable novice operators to acquire skills in a short time. Finally, we should build a real version of the AR interface that could be installed in an actual forklift. Although valuable lessons can be learned from the AR simulation, there is also important feedback that can only be collected from people in their real work environment.

REFERENCES

[1] Azuma, R. (1997). A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments, pp , August 1997.
[2] Bartholdi, J., and Hackmann, S. (2009). Warehouse and distribution science, release 0.89. Georgia Institute of Technology, Tech. Rep., January 2009. Available Online:
[3] Doshi, A., Cheng, S., and Trivedi, M. (2009). A novel active heads-up display for driver assistance. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 39(1).
[4] Guo, A., Raghu, S., Xie, X., Ismail, S., Luo, X., Simoneau, J., ... & Starner, T. (2014). A comparison of order picking assisted by head-up display (HUD), cart-mounted display (CMD), light, and paper pick list. In Proceedings of the 2014 ACM International Symposium on Wearable Computers (pp ). ACM.
[5] Iben, H., Baumann, H., Starner, T., Ruthenbeck, C., and Klug, T. (2009). Visual based picking supported by context awareness: Comparing picking performance using paper-based lists versus lists presented on a head mounted display with contextual support. In ICMI-MLMI, Cambridge, MA, USA, November 2009. ACM.
[6] Kim, S., & Dey, A. K. (2009). Simulated augmented reality windshield display as a cognitive mapping aid for elder driver navigation. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp ). ACM.
[7] Narzt, W., Pomberger, G., Ferscha, A., Kolb, D., Muller, R., Wieghardt, J., Hortner, H., and Lindinger, C. (2003). Pervasive information acquisition for mobile AR-navigation systems. In Fifth IEEE Workshop on Mobile Computing Systems and Applications.
[8] Navab, N., Feuerstein, M., & Bichlmeier, C. (2007). Laparoscopic virtual mirror: new interaction paradigm for monitor based augmented reality. In Virtual Reality Conference, VR'07. IEEE (pp ).
[9] Park, K. S., Cho, I. H., Hong, G. B., Nam, T. J., Park, J., Cho, S. I., & Joo, I. H. (2007). Disposition of information entities and adequate level of information presentation in an in-car augmented reality navigation system. In Human Interface and the Management of Information: Interacting in Information Environments (pp ). Springer Berlin Heidelberg.
[10] Reif, R., & Günthner, W. A. (2009). Pick-by-Vision: An Augmented Reality supported Picking System.
[11] Schwerdtfeger, B., & Klinker, G. (2008). Supporting order picking with augmented reality. In Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (pp ). IEEE Computer Society.
[12] Straube, F., & Pfohl, H.-C. (2008). Trends und Strategien in der Logistik 2008: Globale Netzwerke im Wandel. DVV, Bremen, 2008.
[13] Thomas, B., Piekarski, W., & Gunther, B. (1999). Using augmented reality to visualise architecture designs in an outdoor environment. International Journal of Design Computing: Special Issue on Design Computing on the Net (dcnet'99), 2.
[14] Tompkins, J., White, J., Bozer, Y., and Tanchoco, J. (2003). Facilities Planning.
[15] Tonnis, M., Lange, C., and Klinker, G. (2007). Visual longitudinal and lateral driving assistance in the head-up display of cars. In Proceedings of the 6th International Symposium on Mixed and Augmented Reality (ISMAR), pages 91-94.
[16] Weaver, K. A., Baumann, H., Starner, T., Iben, H., & Lawo, M. (2010). An empirical task analysis of warehouse order picking using head-mounted displays. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp ). ACM.
[17] Westerfield, G., Mitrovic, A., & Billinghurst, M. (2015). Intelligent Augmented Reality Training for Motherboard Assembly. International Journal of Artificial Intelligence in Education, 25(1).


Mixed Reality technology applied research on railway sector Mixed Reality technology applied research on railway sector Yong-Soo Song, Train Control Communication Lab, Korea Railroad Research Institute Uiwang si, Korea e-mail: adair@krri.re.kr Jong-Hyun Back, Train

More information

How Many Pixels Do We Need to See Things?

How Many Pixels Do We Need to See Things? How Many Pixels Do We Need to See Things? Yang Cai Human-Computer Interaction Institute, School of Computer Science, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA ycai@cmu.edu

More information

Motion Capturing Empowered Interaction with a Virtual Agent in an Augmented Reality Environment

Motion Capturing Empowered Interaction with a Virtual Agent in an Augmented Reality Environment Motion Capturing Empowered Interaction with a Virtual Agent in an Augmented Reality Environment Ionut Damian Human Centered Multimedia Augsburg University damian@hcm-lab.de Felix Kistler Human Centered

More information

Iowa Research Online. University of Iowa. Robert E. Llaneras Virginia Tech Transportation Institute, Blacksburg. Jul 11th, 12:00 AM

Iowa Research Online. University of Iowa. Robert E. Llaneras Virginia Tech Transportation Institute, Blacksburg. Jul 11th, 12:00 AM University of Iowa Iowa Research Online Driving Assessment Conference 2007 Driving Assessment Conference Jul 11th, 12:00 AM Safety Related Misconceptions and Self-Reported BehavioralAdaptations Associated

More information

Magnusson, Charlotte; Rassmus-Gröhn, Kirsten; Szymczak, Delphine

Magnusson, Charlotte; Rassmus-Gröhn, Kirsten; Szymczak, Delphine Show me the direction how accurate does it have to be? Magnusson, Charlotte; Rassmus-Gröhn, Kirsten; Szymczak, Delphine Published: 2010-01-01 Link to publication Citation for published version (APA): Magnusson,

More information

HandsIn3D: Supporting Remote Guidance with Immersive Virtual Environments

HandsIn3D: Supporting Remote Guidance with Immersive Virtual Environments HandsIn3D: Supporting Remote Guidance with Immersive Virtual Environments Weidong Huang 1, Leila Alem 1, and Franco Tecchia 2 1 CSIRO, Australia 2 PERCRO - Scuola Superiore Sant Anna, Italy {Tony.Huang,Leila.Alem}@csiro.au,

More information

Future Directions for Augmented Reality. Mark Billinghurst

Future Directions for Augmented Reality. Mark Billinghurst Future Directions for Augmented Reality Mark Billinghurst 1968 Sutherland/Sproull s HMD https://www.youtube.com/watch?v=ntwzxgprxag Star Wars - 1977 Augmented Reality Combines Real and Virtual Images Both

More information

Virtual Road Signs: Augmented Reality Driving Aid for Novice Drivers

Virtual Road Signs: Augmented Reality Driving Aid for Novice Drivers Proceedings of the Human Factors and Ergonomics Society 2016 Annual Meeting 1750 Virtual Road Signs: Augmented Reality Driving Aid for Novice Drivers Prerana Rane 1, Hyungil Kim 2, Juan Lopez Marcano 1,

More information

THE FUTURE OF AUTOMOTIVE - AUGMENTED REALITY VERSUS AUTONOMOUS VEHICLES

THE FUTURE OF AUTOMOTIVE - AUGMENTED REALITY VERSUS AUTONOMOUS VEHICLES The 14 International Conference RELIABILITY and STATISTICS in TRANSPORTATION and COMMUNICATION 2014 Proceedings of the 14th International Conference Reliability and Statistics in Transportation and Communication

More information

Interior Design using Augmented Reality Environment

Interior Design using Augmented Reality Environment Interior Design using Augmented Reality Environment Kalyani Pampattiwar 2, Akshay Adiyodi 1, Manasvini Agrahara 1, Pankaj Gamnani 1 Assistant Professor, Department of Computer Engineering, SIES Graduate

More information

Multi-sensory Tracking of Elders in Outdoor Environments on Ambient Assisted Living

Multi-sensory Tracking of Elders in Outdoor Environments on Ambient Assisted Living Multi-sensory Tracking of Elders in Outdoor Environments on Ambient Assisted Living Javier Jiménez Alemán Fluminense Federal University, Niterói, Brazil jjimenezaleman@ic.uff.br Abstract. Ambient Assisted

More information

Universidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática. Interaction in Virtual and Augmented Reality 3DUIs

Universidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática. Interaction in Virtual and Augmented Reality 3DUIs Universidade de Aveiro Departamento de Electrónica, Telecomunicações e Informática Interaction in Virtual and Augmented Reality 3DUIs Realidade Virtual e Aumentada 2017/2018 Beatriz Sousa Santos Interaction

More information

Multimodal Interaction Concepts for Mobile Augmented Reality Applications

Multimodal Interaction Concepts for Mobile Augmented Reality Applications Multimodal Interaction Concepts for Mobile Augmented Reality Applications Wolfgang Hürst and Casper van Wezel Utrecht University, PO Box 80.089, 3508 TB Utrecht, The Netherlands huerst@cs.uu.nl, cawezel@students.cs.uu.nl

More information

Remote Shoulder-to-shoulder Communication Enhancing Co-located Sensation

Remote Shoulder-to-shoulder Communication Enhancing Co-located Sensation Remote Shoulder-to-shoulder Communication Enhancing Co-located Sensation Minghao Cai and Jiro Tanaka Graduate School of Information, Production and Systems Waseda University Kitakyushu, Japan Email: mhcai@toki.waseda.jp,

More information

DAARIA: Driver Assistance by Augmented Reality for Intelligent Automobile

DAARIA: Driver Assistance by Augmented Reality for Intelligent Automobile DAARIA: Driver Assistance by Augmented Reality for Intelligent Automobile Paul George, Indira Thouvenin, Vincent Fremont, Véronique Cherfaoui To cite this version: Paul George, Indira Thouvenin, Vincent

More information

Early Take-Over Preparation in Stereoscopic 3D

Early Take-Over Preparation in Stereoscopic 3D Adjunct Proceedings of the 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI 18), September 23 25, 2018, Toronto, Canada. Early Take-Over

More information

AR 2 kanoid: Augmented Reality ARkanoid

AR 2 kanoid: Augmented Reality ARkanoid AR 2 kanoid: Augmented Reality ARkanoid B. Smith and R. Gosine C-CORE and Memorial University of Newfoundland Abstract AR 2 kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular

More information

COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES.

COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES. COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES. Mark Billinghurst a, Hirokazu Kato b, Ivan Poupyrev c a Human Interface Technology Laboratory, University of Washington, Box 352-142, Seattle,

More information

Comparison of Single-Wall Versus Multi-Wall Immersive Environments to Support a Virtual Shopping Experience

Comparison of Single-Wall Versus Multi-Wall Immersive Environments to Support a Virtual Shopping Experience Mechanical Engineering Conference Presentations, Papers, and Proceedings Mechanical Engineering 6-2011 Comparison of Single-Wall Versus Multi-Wall Immersive Environments to Support a Virtual Shopping Experience

More information

Admin. Today: Designing for Virtual Reality VR and 3D interfaces Interaction design for VR Prototyping for VR

Admin. Today: Designing for Virtual Reality VR and 3D interfaces Interaction design for VR Prototyping for VR HCI and Design Admin Reminder: Assignment 4 Due Thursday before class Questions? Today: Designing for Virtual Reality VR and 3D interfaces Interaction design for VR Prototyping for VR 3D Interfaces We

More information

Comparison of Wrap Around Screens and HMDs on a Driver s Response to an Unexpected Pedestrian Crossing Using Simulator Vehicle Parameters

Comparison of Wrap Around Screens and HMDs on a Driver s Response to an Unexpected Pedestrian Crossing Using Simulator Vehicle Parameters University of Iowa Iowa Research Online Driving Assessment Conference 2017 Driving Assessment Conference Jun 28th, 12:00 AM Comparison of Wrap Around Screens and HMDs on a Driver s Response to an Unexpected

More information

Simulated Augmented Reality Windshield Display as a Cognitive Mapping Aid for Elder Driver Navigation

Simulated Augmented Reality Windshield Display as a Cognitive Mapping Aid for Elder Driver Navigation Simulated Augmented Reality Windshield Display as a Cognitive Mapping Aid for Elder Driver Navigation SeungJun Kim Anind K. Dey Human-Computer Interaction Institute Carnegie Mellon University Pittsburgh

More information

immersive visualization workflow

immersive visualization workflow 5 essential benefits of a BIM to immersive visualization workflow EBOOK 1 Building Information Modeling (BIM) has transformed the way architects design buildings. Information-rich 3D models allow architects

More information

Effective Iconography....convey ideas without words; attract attention...

Effective Iconography....convey ideas without words; attract attention... Effective Iconography...convey ideas without words; attract attention... Visual Thinking and Icons An icon is an image, picture, or symbol representing a concept Icon-specific guidelines Represent the

More information

Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback

Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback Jung Wook Park HCI Institute Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA, USA, 15213 jungwoop@andrew.cmu.edu

More information

Building Spatial Experiences in the Automotive Industry

Building Spatial Experiences in the Automotive Industry Building Spatial Experiences in the Automotive Industry i-know Data-driven Business Conference Franz Weghofer franz.weghofer@magna.com Video Agenda Digital Factory - Data Backbone of all Virtual Representations

More information

CSE 190: 3D User Interaction. Lecture #17: 3D UI Evaluation Jürgen P. Schulze, Ph.D.

CSE 190: 3D User Interaction. Lecture #17: 3D UI Evaluation Jürgen P. Schulze, Ph.D. CSE 190: 3D User Interaction Lecture #17: 3D UI Evaluation Jürgen P. Schulze, Ph.D. 2 Announcements Final Exam Tuesday, March 19 th, 11:30am-2:30pm, CSE 2154 Sid s office hours in lab 260 this week CAPE

More information

Image Characteristics and Their Effect on Driving Simulator Validity

Image Characteristics and Their Effect on Driving Simulator Validity University of Iowa Iowa Research Online Driving Assessment Conference 2001 Driving Assessment Conference Aug 16th, 12:00 AM Image Characteristics and Their Effect on Driving Simulator Validity Hamish Jamson

More information

Multimodal Metric Study for Human-Robot Collaboration

Multimodal Metric Study for Human-Robot Collaboration Multimodal Metric Study for Human-Robot Collaboration Scott A. Green s.a.green@lmco.com Scott M. Richardson scott.m.richardson@lmco.com Randy J. Stiles randy.stiles@lmco.com Lockheed Martin Space Systems

More information

Chapter 1 - Introduction

Chapter 1 - Introduction 1 "We all agree that your theory is crazy, but is it crazy enough?" Niels Bohr (1885-1962) Chapter 1 - Introduction Augmented reality (AR) is the registration of projected computer-generated images over

More information

Virtual/Augmented Reality (VR/AR) 101

Virtual/Augmented Reality (VR/AR) 101 Virtual/Augmented Reality (VR/AR) 101 Dr. Judy M. Vance Virtual Reality Applications Center (VRAC) Mechanical Engineering Department Iowa State University Ames, IA Virtual Reality Virtual Reality Virtual

More information

Enhancing Shipboard Maintenance with Augmented Reality

Enhancing Shipboard Maintenance with Augmented Reality Enhancing Shipboard Maintenance with Augmented Reality CACI Oxnard, CA Dennis Giannoni dgiannoni@caci.com (805) 288-6630 INFORMATION DEPLOYED. SOLUTIONS ADVANCED. MISSIONS ACCOMPLISHED. Agenda Virtual

More information

Chapter 2 Understanding and Conceptualizing Interaction. Anna Loparev Intro HCI University of Rochester 01/29/2013. Problem space

Chapter 2 Understanding and Conceptualizing Interaction. Anna Loparev Intro HCI University of Rochester 01/29/2013. Problem space Chapter 2 Understanding and Conceptualizing Interaction Anna Loparev Intro HCI University of Rochester 01/29/2013 1 Problem space Concepts and facts relevant to the problem Users Current UX Technology

More information

Running an HCI Experiment in Multiple Parallel Universes

Running an HCI Experiment in Multiple Parallel Universes Author manuscript, published in "ACM CHI Conference on Human Factors in Computing Systems (alt.chi) (2014)" Running an HCI Experiment in Multiple Parallel Universes Univ. Paris Sud, CNRS, Univ. Paris Sud,

More information

New interface approaches for telemedicine

New interface approaches for telemedicine New interface approaches for telemedicine Associate Professor Mark Billinghurst PhD, Holger Regenbrecht Dipl.-Inf. Dr-Ing., Michael Haller PhD, Joerg Hauber MSc Correspondence to: mark.billinghurst@hitlabnz.org

More information

Differences in Fitts Law Task Performance Based on Environment Scaling

Differences in Fitts Law Task Performance Based on Environment Scaling Differences in Fitts Law Task Performance Based on Environment Scaling Gregory S. Lee and Bhavani Thuraisingham Department of Computer Science University of Texas at Dallas 800 West Campbell Road Richardson,

More information

Mobile Information Presentation Schemes for Supra-Adaptive Logistics Applications

Mobile Information Presentation Schemes for Supra-Adaptive Logistics Applications Mobile Information Presentation Schemes for Supra-Adaptive Logistics Applications Björn Schwerdtfeger, Troels Frimor, Daniel Pustka, and Gudrun Klinker Fachgebiet Augmented Reality, Department of Informatics,

More information

QS Spiral: Visualizing Periodic Quantified Self Data

QS Spiral: Visualizing Periodic Quantified Self Data Downloaded from orbit.dtu.dk on: May 12, 2018 QS Spiral: Visualizing Periodic Quantified Self Data Larsen, Jakob Eg; Cuttone, Andrea; Jørgensen, Sune Lehmann Published in: Proceedings of CHI 2013 Workshop

More information

LED NAVIGATION SYSTEM

LED NAVIGATION SYSTEM Zachary Cook Zrz3@unh.edu Adam Downey ata29@unh.edu LED NAVIGATION SYSTEM Aaron Lecomte Aaron.Lecomte@unh.edu Meredith Swanson maw234@unh.edu UNIVERSITY OF NEW HAMPSHIRE DURHAM, NH Tina Tomazewski tqq2@unh.edu

More information

Ubiquitous Home Simulation Using Augmented Reality

Ubiquitous Home Simulation Using Augmented Reality Proceedings of the 2007 WSEAS International Conference on Computer Engineering and Applications, Gold Coast, Australia, January 17-19, 2007 112 Ubiquitous Home Simulation Using Augmented Reality JAE YEOL

More information

Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment

Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment Helmut Schrom-Feiertag 1, Christoph Schinko 2, Volker Settgast 3, and Stefan Seer 1 1 Austrian

More information

Virtual Shadow: Making Cross Traffic Dynamics Visible through Augmented Reality Head Up Display

Virtual Shadow: Making Cross Traffic Dynamics Visible through Augmented Reality Head Up Display Proceedings of the Human Factors and Ergonomics Society 2016 Annual Meeting 2093 Virtual Shadow: Making Cross Traffic Dynamics Visible through Augmented Reality Head Up Display Hyungil Kim, Jessica D.

More information

EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1

EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1 EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1 Abstract Navigation is an essential part of many military and civilian

More information

VEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu

More information

Application of 3D Terrain Representation System for Highway Landscape Design

Application of 3D Terrain Representation System for Highway Landscape Design Application of 3D Terrain Representation System for Highway Landscape Design Koji Makanae Miyagi University, Japan Nashwan Dawood Teesside University, UK Abstract In recent years, mixed or/and augmented

More information

The application of Work Domain Analysis (WDA) for the development of vehicle control display

The application of Work Domain Analysis (WDA) for the development of vehicle control display Proceedings of the 7th WSEAS International Conference on Applied Informatics and Communications, Athens, Greece, August 24-26, 2007 160 The application of Work Domain Analysis (WDA) for the development

More information

Implementation of Image processing using augmented reality

Implementation of Image processing using augmented reality Implementation of Image processing using augmented reality Konjengbam Jackichand Singh 1, L.P.Saikia 2 1 MTech Computer Sc & Engg, Assam Downtown University, India 2 Professor, Computer Sc& Engg, Assam

More information

MOBILE AUGMENTED REALITY FOR SPATIAL INFORMATION EXPLORATION

MOBILE AUGMENTED REALITY FOR SPATIAL INFORMATION EXPLORATION MOBILE AUGMENTED REALITY FOR SPATIAL INFORMATION EXPLORATION CHYI-GANG KUO, HSUAN-CHENG LIN, YANG-TING SHEN, TAY-SHENG JENG Information Architecture Lab Department of Architecture National Cheng Kung University

More information

Augmented Reality as an Advanced Driver-Assistance System: A Cognitive Approach

Augmented Reality as an Advanced Driver-Assistance System: A Cognitive Approach Proceedings of the 6 th Humanist Conference, The Hague, Netherlands, 13-14 June 2018 Augmented Reality as an Advanced Driver-Assistance System: A Cognitive Approach Lucas Morillo Méndez, CTAG, Spain, l.morillo.lm@gmail.com,

More information

Study of the touchpad interface to manipulate AR objects

Study of the touchpad interface to manipulate AR objects Study of the touchpad interface to manipulate AR objects Ryohei Nagashima *1 Osaka University Nobuchika Sakata *2 Osaka University Shogo Nishida *3 Osaka University ABSTRACT A system for manipulating for

More information

VIRTUAL ASSISTIVE ROBOTS FOR PLAY, LEARNING, AND COGNITIVE DEVELOPMENT

VIRTUAL ASSISTIVE ROBOTS FOR PLAY, LEARNING, AND COGNITIVE DEVELOPMENT 3-59 Corbett Hall University of Alberta Edmonton, AB T6G 2G4 Ph: (780) 492-5422 Fx: (780) 492-1696 Email: atlab@ualberta.ca VIRTUAL ASSISTIVE ROBOTS FOR PLAY, LEARNING, AND COGNITIVE DEVELOPMENT Mengliao

More information

Einführung in die Erweiterte Realität. 5. Head-Mounted Displays

Einführung in die Erweiterte Realität. 5. Head-Mounted Displays Einführung in die Erweiterte Realität 5. Head-Mounted Displays Prof. Gudrun Klinker, Ph.D. Institut für Informatik,Technische Universität München klinker@in.tum.de Nov 30, 2004 Agenda 1. Technological

More information

REPORT ON THE CURRENT STATE OF FOR DESIGN. XL: Experiments in Landscape and Urbanism

REPORT ON THE CURRENT STATE OF FOR DESIGN. XL: Experiments in Landscape and Urbanism REPORT ON THE CURRENT STATE OF FOR DESIGN XL: Experiments in Landscape and Urbanism This report was produced by XL: Experiments in Landscape and Urbanism, SWA Group s innovation lab. It began as an internal

More information

Capability for Collision Avoidance of Different User Avatars in Virtual Reality

Capability for Collision Avoidance of Different User Avatars in Virtual Reality Capability for Collision Avoidance of Different User Avatars in Virtual Reality Adrian H. Hoppe, Roland Reeb, Florian van de Camp, and Rainer Stiefelhagen Karlsruhe Institute of Technology (KIT) {adrian.hoppe,rainer.stiefelhagen}@kit.edu,

More information

The Virtual Reality Brain-Computer Interface System for Ubiquitous Home Control

The Virtual Reality Brain-Computer Interface System for Ubiquitous Home Control The Virtual Reality Brain-Computer Interface System for Ubiquitous Home Control Hyun-sang Cho, Jayoung Goo, Dongjun Suh, Kyoung Shin Park, and Minsoo Hahn Digital Media Laboratory, Information and Communications

More information

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Joan De Boeck, Karin Coninx Expertise Center for Digital Media Limburgs Universitair Centrum Wetenschapspark 2, B-3590 Diepenbeek, Belgium

More information

Effects of Environmental Clutter and Motion on User Performance in Virtual Reality Games

Effects of Environmental Clutter and Motion on User Performance in Virtual Reality Games Effects of Environmental Clutter and Motion on User Performance in Virtual Reality Games Lal Bozgeyikli University of South Florida Tampa, FL 33620, USA gamze@mail.usf.edu Andrew Raij University of Central

More information

OCULUS VR, LLC. Oculus User Guide Runtime Version Rev. 1

OCULUS VR, LLC. Oculus User Guide Runtime Version Rev. 1 OCULUS VR, LLC Oculus User Guide Runtime Version 0.4.0 Rev. 1 Date: July 23, 2014 2014 Oculus VR, LLC All rights reserved. Oculus VR, LLC Irvine, CA Except as otherwise permitted by Oculus VR, LLC, this

More information

HAPTICS AND AUTOMOTIVE HMI

HAPTICS AND AUTOMOTIVE HMI HAPTICS AND AUTOMOTIVE HMI Technology and trends report January 2018 EXECUTIVE SUMMARY The automotive industry is on the cusp of a perfect storm of trends driving radical design change. Mary Barra (CEO

More information

Motion sickness issues in VR content

Motion sickness issues in VR content Motion sickness issues in VR content Beom-Ryeol LEE, Wookho SON CG/Vision Technology Research Group Electronics Telecommunications Research Institutes Compliance with IEEE Standards Policies and Procedures

More information

Journal of Physics: Conference Series PAPER OPEN ACCESS. To cite this article: Lijun Jiang et al 2018 J. Phys.: Conf. Ser.

Journal of Physics: Conference Series PAPER OPEN ACCESS. To cite this article: Lijun Jiang et al 2018 J. Phys.: Conf. Ser. Journal of Physics: Conference Series PAPER OPEN ACCESS The Development of A Potential Head-Up Display Interface Graphic Visual Design Framework for Driving Safety by Consuming Less Cognitive Resource

More information

Test of pan and zoom tools in visual and non-visual audio haptic environments. Magnusson, Charlotte; Gutierrez, Teresa; Rassmus-Gröhn, Kirsten

Test of pan and zoom tools in visual and non-visual audio haptic environments. Magnusson, Charlotte; Gutierrez, Teresa; Rassmus-Gröhn, Kirsten Test of pan and zoom tools in visual and non-visual audio haptic environments Magnusson, Charlotte; Gutierrez, Teresa; Rassmus-Gröhn, Kirsten Published in: ENACTIVE 07 2007 Link to publication Citation

More information

Virtual and Augmented Reality for Cabin Crew Training: Practical Applications

Virtual and Augmented Reality for Cabin Crew Training: Practical Applications EATS 2018: the 17th European Airline Training Symposium Virtual and Augmented Reality for Cabin Crew Training: Practical Applications Luca Chittaro Human-Computer Interaction Lab Department of Mathematics,

More information

Exploring Virtual Depth for Automotive Instrument Cluster Concepts

Exploring Virtual Depth for Automotive Instrument Cluster Concepts Exploring Virtual Depth for Automotive Instrument Cluster Concepts Nora Broy 1,2,3, Benedikt Zierer 2, Stefan Schneegass 3, Florian Alt 2 1 BMW Research and Technology Nora.NB.Broy@bmw.de 2 Group for Media

More information

3D and Sequential Representations of Spatial Relationships among Photos

3D and Sequential Representations of Spatial Relationships among Photos 3D and Sequential Representations of Spatial Relationships among Photos Mahoro Anabuki Canon Development Americas, Inc. E15-349, 20 Ames Street Cambridge, MA 02139 USA mahoro@media.mit.edu Hiroshi Ishii

More information

Objective Data Analysis for a PDA-Based Human-Robotic Interface*

Objective Data Analysis for a PDA-Based Human-Robotic Interface* Objective Data Analysis for a PDA-Based Human-Robotic Interface* Hande Kaymaz Keskinpala EECS Department Vanderbilt University Nashville, TN USA hande.kaymaz@vanderbilt.edu Abstract - This paper describes

More information

Bluetooth Low Energy Sensing Technology for Proximity Construction Applications

Bluetooth Low Energy Sensing Technology for Proximity Construction Applications Bluetooth Low Energy Sensing Technology for Proximity Construction Applications JeeWoong Park School of Civil and Environmental Engineering, Georgia Institute of Technology, 790 Atlantic Dr. N.W., Atlanta,

More information

How Representation of Game Information Affects Player Performance

How Representation of Game Information Affects Player Performance How Representation of Game Information Affects Player Performance Matthew Paul Bryan June 2018 Senior Project Computer Science Department California Polytechnic State University Table of Contents Abstract

More information

Air-filled type Immersive Projection Display

Air-filled type Immersive Projection Display Air-filled type Immersive Projection Display Wataru HASHIMOTO Faculty of Information Science and Technology, Osaka Institute of Technology, 1-79-1, Kitayama, Hirakata, Osaka 573-0196, Japan whashimo@is.oit.ac.jp

More information

Open Research Online The Open University s repository of research publications and other research outputs

Open Research Online The Open University s repository of research publications and other research outputs Open Research Online The Open University s repository of research publications and other research outputs MusicJacket: the efficacy of real-time vibrotactile feedback for learning to play the violin Conference

More information

Ubiquitous Computing Summer Episode 16: HCI. Hannes Frey and Peter Sturm University of Trier. Hannes Frey and Peter Sturm, University of Trier 1

Ubiquitous Computing Summer Episode 16: HCI. Hannes Frey and Peter Sturm University of Trier. Hannes Frey and Peter Sturm, University of Trier 1 Episode 16: HCI Hannes Frey and Peter Sturm University of Trier University of Trier 1 Shrinking User Interface Small devices Narrow user interface Only few pixels graphical output No keyboard Mobility

More information

VR/AR Concepts in Architecture And Available Tools

VR/AR Concepts in Architecture And Available Tools VR/AR Concepts in Architecture And Available Tools Peter Kán Interactive Media Systems Group Institute of Software Technology and Interactive Systems TU Wien Outline 1. What can you do with virtual reality

More information

Interaction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application

Interaction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application Interaction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application Doug A. Bowman Graphics, Visualization, and Usability Center College of Computing Georgia Institute of Technology

More information

Gaze informed View Management in Mobile Augmented Reality

Gaze informed View Management in Mobile Augmented Reality Gaze informed View Management in Mobile Augmented Reality Ann M. McNamara Department of Visualization Texas A&M University College Station, TX 77843 USA ann@viz.tamu.edu Abstract Augmented Reality (AR)

More information

3D Interaction Techniques

3D Interaction Techniques 3D Interaction Techniques Hannes Interactive Media Systems Group (IMS) Institute of Software Technology and Interactive Systems Based on material by Chris Shaw, derived from Doug Bowman s work Why 3D Interaction?

More information

A SURVEY OF MOBILE APPLICATION USING AUGMENTED REALITY
