Study of the touchpad interface to manipulate AR objects


Ryohei Nagashima (Osaka University, nagashima@nishilab.sys.es.osaka-u.ac.jp)
Nobuchika Sakata (Osaka University, sakata@nishilab.sys.es.osaka-u.ac.jp)
Shogo Nishida (Osaka University, nishida@nishilab.sys.es.osaka-u.ac.jp)

ABSTRACT

As a system for manipulating AR objects with a touchpad, we propose a relative input method in which the user grasps the touchpad while still looking at the real space in which the AR objects exist. Specifically, a pointer is displayed in the AR space, and the user manipulates an AR object by moving this pointer, treating the touch input as a relative displacement on the touchpad held in the hand. At the same time, we believe there are situations in which treating the touch input as an absolute position on the touchpad suits the AR environment better. With absolute input, however, it is hard to operate while looking at the real space in which the AR objects exist, so we overlay miniatures of the AR objects on the touchpad so that they can be operated directly. This allows the user to manipulate an out-of-reach AR object precisely. In this paper, we study the operability of operating while looking at the real space in which the AR objects exist versus operating while looking at the touchpad. As a result, in the character input (selection) task, the absolute method with looking at the touchpad is faster than the relative method with looking at the real space in which the AR objects exist. In the alignment task, there is no difference in completion time between the two methods.

Keywords: Augmented Reality, Remote Manipulation, Touchpad, HMD.

Index Terms: D.2.2 [Design Tools and Techniques]: User interfaces; H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities

1. INTRODUCTION

These days, AR is widely used in various fields. By seamlessly overlaying AR objects on the user's view, the user can obtain various information without excessive eye movement.

When interacting with AR objects superimposed on a real-world environment, working while watching the real space in which the AR objects exist reduces the amount of gaze shift and maintains high immersion. Thus, in previous research, looking at the AR space through devices such as smartphones and tablets [1, 2, 3] makes direct operation possible. Wearing an HMD makes it possible to associate the AR space with the real space seamlessly, and a glove-type device allows AR objects to be operated in the same manner as real-world objects. When AR objects are positioned out of reach, arm-extension techniques [4, 5] make operation possible, but they bring problems such as reduced flexibility and accuracy. We therefore propose a relative method that enables quick and accurate operation of an AR object while grasping the touchpad and still looking at the real space in which the AR objects exist: a pointer is displayed in the AR space, and the user manipulates an AR object by moving this pointer, treating the touch input as a relative displacement on the touchpad held in the hand. At the same time, we believe there are situations in which treating the input as an absolute position on the touchpad suits the AR environment better. With absolute input, it is hard to operate while looking at the real space in which the AR objects exist, so we overlay miniatures of the AR objects on the touchpad to allow direct operation; this lets the user manipulate an AR object that is out of reach. In this paper, we study the operability of operating while looking at the real space in which the AR objects exist versus operating while looking at the touchpad.

2. RELATED WORKS

In the areas of VR and AR, we review interfaces for manipulating objects in virtual space and studies of pointing methods.

VR/AR INTERFACES

We enumerate interfaces for manipulating virtual objects, not limited to AR. MagicCup [6] is a cup-type device, and Tinmith [7, 8, 9, 10, 11] uses a glove-type device. Used in AR, these devices allow an AR object within hand reach to be manipulated accurately, but when the AR object is out of reach, precision worsens remarkably. Schmalstieg et al. [12] and Szalavári et al. [13] manipulate a virtual object by defining a plane in the virtual space and using that plane like a touch panel. TouchMe [2] is an interface using a touch panel, and [1, 3] are interfaces using mobile phones; these interfaces have low flexibility because they do not use an HMD. Budhiraja et al. [14] discuss various methods for systems that combine a handheld touch panel with an HMD. A Touring Machine [15] pastes annotations onto the real landscape, but does not address three-dimensional manipulation. World in Miniature (WIM) [16, 17] is a method of manipulating objects through a miniature of the environment associated with the real space. In MRTent [18], physical limitations impose constraints such as being unable to float an AR object in the air.

INPUT METHODS

We enumerate interfaces concerning relative and absolute input. ARC-Pad [19] is a pointing interface that manipulates a PC mouse cursor with a smartphone. HybridPointing [20, 21] is a pointing interface for giant screens. Smith et al. [22] examined relative and absolute mappings for touch-panel interaction with virtual objects in VR space. None of these, however, has been applied to AR.

3. COLLABORATION OF TOUCHPAD AND HMD

We developed an interface that combines an HMD and a touchpad. Figure 1 (a) shows the appearance of the system. Video see-through AR is realized by attaching a monocular head-mounted camera (HMC) between the eyes of the HMD (Figure 1 (b)). AR objects are superimposed on the real-time images captured by the HMC. The field of view of the attached camera is much wider than that of the HMD, so objects displayed on the HMD become smaller when imaged by the HMC. Consequently, by trimming the captured image and using only its center part, we provide viewing through the HMD that is as natural as possible.

We use the ArUco library, which is based on OpenCV, to set up the coordinate systems on the touchpad and the landscape. By recognizing the visual markers surrounding the touchpad and the visual markers placed in the landscape, the system decides where the coordinate systems are placed. By attaching many markers to the frame of the touchpad (Figure 1 (c)), we realized robust tracking of the touchpad surface that accounts for occlusion by the hand when an AR object is manipulated while watching the touchpad. As shown in Figure 1 (d), displaying miniatures of the AR objects on the touchpad makes it possible to manipulate an AR object quickly and accurately even when it is far from the user. By displaying a pointer in the landscape, manipulation while watching the landscape is possible as well. The scope drawn in the landscape is the view area shown on the touchpad, and it lets the user understand in which direction to move the pointer. If the pointer overlaps an AR object, or the user touches an AR object while watching the touchpad, the touchpad vibrates and a sound is played.

Figure 1: Concept of the system. (a) Appearance of the system. (b) HMD and HMC: the HMD is a Sony HMZ-T2 (resolution 720p, horizontal angle of view 77 degrees); the HMC is a Sony PlayStation Eye (resolution 640x480, frame rate 60 fps, angle of view 75 degrees). (c) Touchpad: Asus Nexus 7 (7-inch / 180 mm diagonal IPS LCD, 10 multi-touch points), showing the "open the keyboard" button, the pointer, and the scope. (d) AR objects and miniatures of AR objects.

Figure 2: Input method. (a) Relative. (b) Absolute.
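The center-trimming step above can be sketched with a pinhole-camera model: the crop that matches the HMD's field of view spans 2 f tan(FOV_HMD / 2) camera pixels, where f is the focal length in pixels derived from the camera's own field of view. A minimal sketch, assuming a pinhole camera; the display FOV value used in the example is hypothetical, since the paper does not report the effective FOV after trimming:

```python
import math

def center_crop_width(cam_res_x, cam_fov_deg, hmd_fov_deg):
    """Width in camera pixels of the center crop whose angular extent
    matches the HMD field of view, assuming a pinhole camera."""
    # focal length in pixels, from the camera's horizontal FOV
    f = (cam_res_x / 2) / math.tan(math.radians(cam_fov_deg) / 2)
    # number of pixels spanned by the HMD FOV at that focal length
    return int(round(2 * f * math.tan(math.radians(hmd_fov_deg) / 2)))

# e.g. a 640-px-wide capture with a 75-degree FOV, trimmed for a
# hypothetical 45-degree display FOV
crop_w = center_crop_width(640, 75, 45)
```

The trimmed region would then be scaled up to the HMD's 720p panel; a narrower display FOV yields a smaller crop and therefore a lower effective resolution.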

4. INPUT METHODS FOR THE TOUCHPAD IN AN AR ENVIRONMENT

There are two input methods for the touchpad: the relative method and the absolute method. As shown in Figure 2 (a), in the relative method the input to the system is the displacement between the current and previous positions of the user's finger. When manipulating while watching the touchpad with the relative method, we assume the user feels uncomfortable because the position the user touches and the position of the pointer differ. In this interface, therefore, the relative method is the input method used while watching the real space in which the AR objects exist. Table 1 shows the procedure for manipulating an AR object with the relative method: the pointer must first be moved onto the target in order to manipulate it. To switch the movable direction of the pointer, the touchpad is tilted vertically (Figure 3 (a2, a3)). In one-finger operation, the gesture is interpreted as a drag if a move event occurs between the tap-down and tap-up events, and as a tap otherwise.

Table 1: Relative method
- One finger, select: move the pointer by touching and dragging, to superimpose it on the target AR object.
- One finger, decide: tap after selecting the target AR object.
- Two fingers, move: two-finger drag after selecting the target AR object.
- Two fingers, rotate: two-finger rotate after selecting the target AR object.
- Two fingers, scale: two-finger pinch after selecting the target AR object.

As shown in Figure 2 (b), in the absolute method the input to the system is the exact touch position of the user's finger. When manipulating while watching the real space in which the AR objects exist, the absolute method is very difficult to use because the user cannot see their hand or the touch position on the touchpad. In this interface, therefore, the absolute method is the input method used while watching the touchpad.

Table 2: Absolute method
- One finger, select and move: touch the target AR object, and drag it.
- Two fingers, select and move: touch and nip the target AR object, and drag it.
- Two fingers, select and rotate: touch and nip the target AR object, and rotate it.
- Two fingers, select and scale: touch and nip the target AR object, and pinch it.
- Three fingers, scope move/scale: touch nothing and drag/pinch.

Figure 3: Outline of the tasks. (a1)/(b1) character input, (a2)/(b2) horizontal alignment, (a3)/(b3) vertical alignment. Measured items: character input time, horizontal alignment time, and vertical alignment time.
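The two mappings and the one-finger tap/drag rule described in this section can be sketched as follows. This is a hypothetical sketch: the coordinate conventions, gain, and event names are assumptions, not details taken from the paper's implementation.

```python
class RelativePointer:
    """Relative method: the input is the displacement between the
    current and previous finger positions, applied to a pointer
    displayed in the AR space."""
    def __init__(self, x=0.0, y=0.0, gain=1.0):
        self.x, self.y, self.gain = x, y, gain
        self._last = None

    def tap_down(self, tx, ty):
        self._last = (tx, ty)

    def move(self, tx, ty):
        lx, ly = self._last
        self.x += (tx - lx) * self.gain   # apply the displacement
        self.y += (ty - ly) * self.gain
        self._last = (tx, ty)

    def tap_up(self):
        self._last = None


def absolute_position(tx, ty, pad_size, view_size):
    """Absolute method: the input is the exact touch position, mapped
    from touchpad pixels to the view area shown on the pad."""
    return (tx / pad_size[0] * view_size[0],
            ty / pad_size[1] * view_size[1])


def classify_one_finger(events):
    """A one-finger gesture is a drag if any move event occurs between
    tap-down and tap-up, and a tap otherwise."""
    return "drag" if "move" in events else "tap"
```

For example, classify_one_finger(["tap-down", "tap-up"]) yields "tap", while a sequence containing a move event yields "drag".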

In the absolute method, as Table 2 shows, selection and the subsequent operation (translation, rotation, or scaling) are performed sequentially. By tilting the touchpad, the movable direction can be changed in the same manner as in the relative method (Figure 3 (b2, b3)). We expect the user to switch between the methods depending on the task; specifically, when laying out an AR object, we believe the absolute method is preferable for horizontal alignment, while for vertical alignment there is no difference between the two methods.

5. USER STUDY

We studied the operability arising from the difference in input method and in where the user looks (the touchpad, or the real space in which the AR objects exist). Specifically, 15 subjects carried out a character input (selection) task and an annotation alignment task. After the user study, we asked about the usability of the interface with a questionnaire. The purpose of the selection task is to compare the operability of the select operation under absolute and relative input; the purpose of the alignment task is to compare the operability while looking at the touchpad with that while looking at the real space in which the AR objects exist.

Figure 5: Appearance of the user study.

SELECTION TASK (CHARACTER INPUT) AND ALIGNMENT TASK

Figure 5 shows the appearance of the user study, and Figure 3 shows the outline of the tasks. In the selection task, a word of five characters is displayed at random. The task finishes when the subject pushes the Enter key after the five characters have been input correctly; an annotation is then created and added to the AR space. The alignment task is to put the annotation created in the selection task onto an AR object placed at random beforehand, and it is divided into a horizontal alignment task and a vertical alignment task. The subjects always worked in the order of selection task, horizontal alignment task, and vertical alignment task. The alignment completion time is measured from the moment the annotation is created, and the selection completion time from the start of the trial. We evaluated the following two patterns:

- Operation with looking at the touchpad.
- Operation with looking at the real space in which AR objects exist.

Each subject carried out the task 15 times per method. To cancel out order effects, the order of the patterns was changed across subjects.

RESULT

Figure 4 and Table 3 show the task completion times, Table 4 shows the average number of typos, Figure 6 shows the probability of typos, and Figure 7 shows the result of the questionnaire about usability. In the selection task, there is a significant difference (p = 0.000 < 0.05) according to the t-test between the relative and absolute methods. In addition, the median number of typos is 0.00 in both methods, and for the number of typos there is likewise a significant difference (p = 0.000 < 0.05) according to the t-test between the two methods.

Table 3: Completion time (mean, S.D.) [s] for each input method on the selection, horizontal alignment, and vertical alignment tasks.

Figure 4: Completion times. Selection task: p = .000, difference of the means 2.1 sec; horizontal alignment task: p = .015, difference of the means 0.8 sec; vertical alignment task: p = .541, difference of the means 0.3 sec.
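The per-task comparisons above use a t-test over the 15 subjects' completion times. A self-contained sketch of a paired analysis with hypothetical data (the study's raw per-subject times are not given here); with 15 subjects, the two-sided 5% critical value of Student's t for 14 degrees of freedom is about 2.145:

```python
import statistics

T_CRIT_DF14 = 2.145  # two-sided 5% critical value of Student's t, df = 14

def paired_t(times_a, times_b):
    """Paired t statistic for two sets of per-subject completion times."""
    diffs = [a - b for a, b in zip(times_a, times_b)]
    mean = statistics.fmean(diffs)
    sd = statistics.stdev(diffs)          # sample S.D. of the differences
    return mean / (sd / len(diffs) ** 0.5)

# hypothetical per-subject selection-task times [s]
relative_times = [14.1, 13.2, 15.0, 12.8, 14.6, 13.9, 15.3, 12.5,
                  14.0, 13.7, 15.1, 12.9, 14.4, 13.5, 14.8]
absolute_times = [12.0, 11.1, 12.9, 10.8, 12.4, 11.7, 13.2, 10.6,
                  11.9, 11.5, 13.0, 10.9, 12.2, 11.4, 12.7]

t = paired_t(relative_times, absolute_times)
significant = abs(t) > T_CRIT_DF14  # reject H0 at the 5% level if True
```

With scipy available, scipy.stats.ttest_rel would give the exact p-value instead of a critical-value comparison.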

In the horizontal alignment task, there is a significant difference (p = 0.015 < 0.05) according to the t-test between the relative and absolute methods. In the vertical alignment task, there is no difference according to the t-test between the two methods. In the questionnaire, for the speed of the selection task there is a significant difference (p = 0.002 < 0.05) according to the Wilcoxon signed-rank test between the relative and absolute methods, and for the accuracy of the selection task there is a significant difference (p = 0.013 < 0.05). For the other items, there is no difference according to the Wilcoxon signed-rank test between the two methods.

DISCUSSION

SELECTION TASK (CHARACTER INPUT)

In terms of character input speed, the difference of the mean times is approximately 2.1 seconds. In other words, the time to input six characters with the absolute method equals the time to input five characters with the relative method. According to [23], the average English word is 4.5 characters long; for that reason, when inputting words, this time difference is not large. In the questionnaire, many subjects felt that the absolute method is faster than the relative method, since in the absolute method one character can be input in a single operation. Many subjects also felt that the relative method allows a key to be input accurately: in the relative method, the key can be entered while confirming the positions of the pointer and the key, whereas in the absolute method it is hard to input the key because it disappears under the user's finger. In summary, the absolute method with looking at the touchpad is superior to the relative method with looking at the real space in which the AR objects exist. If the touchpad becomes smaller, there may be no difference in usability. We recommend the absolute method for character input. However, we recommend the relative method for selecting an AR object, because it allows the user to keep seeing the real space.

Table 4: Number of typos (median, mean, S.D.) for each input method.

Figure 6: Probability of typo for each user.

ALIGNMENT TASK

In terms of the horizontal alignment task, the difference of the mean times is approximately 1.4 seconds. However, in the questionnaire about speed there is no difference between the relative and absolute methods. The reason for the time difference is that the user can grasp the AR space by overlooking it when manipulating while looking at the touchpad: the farther the user is from an AR object in the real space, the longer horizontal alignment takes. In terms of the vertical alignment task, the difference of the mean times is approximately 0.1 seconds; we consider that the vertical alignment time does not change even if the user moves away from an AR object in the real space. This task pastes an AR object (the annotation) onto another AR object. If we instead suppose that an annotation is put onto a real object, the problem arises that the AR object cannot be aligned, because the real object is not displayed on the touchpad. It is necessary to consider displaying the real environment on the touchpad. We believe a two-dimensional bird's-eye view of the real environment is suitable; there is also the idea of displaying a miniature of the real environment on the touchpad three-dimensionally, but the user would probably be confused because the amount of information increases excessively. In summary, there is no difference in usability between the absolute method with looking at the touchpad and the relative method with looking at the real space in which the AR objects exist.

Figure 7: Questionnaire results, rated from bad to good, for the selection task (p = .002 for speed, p = .013 for accuracy), the horizontal alignment task, and the vertical alignment task.
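The two-dimensional bird's-eye view suggested above could be realized by discarding the height component when projecting the real environment onto the touchpad. A minimal sketch, in which the coordinate convention (y as height), scale, and pad size are hypothetical assumptions:

```python
def birds_eye(points_3d, origin_xz, scale, pad_size):
    """Project 3D world points (x, y, z; y is height) to top-down
    touchpad pixels by discarding the height component."""
    ox, oz = origin_xz
    w, h = pad_size
    return [((x - ox) * scale + w / 2, (z - oz) * scale + h / 2)
            for x, y, z in points_3d]

# a point 1.5 m above the scene origin lands at the pad centre
pixels = birds_eye([(0.0, 1.5, 0.0)], (0.0, 0.0), 100.0, (320, 240))
```

Because height is discarded, objects at different elevations overlap in this view, which is exactly the trade-off against the three-dimensional miniature discussed above.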

In the case where it becomes difficult to grasp the AR space, operation with looking at the touchpad will become superior to operation with looking at the real space. We recommend the relative method for aligning an AR object, because it allows the user to keep seeing the real space.

COMBINATION OF HMD AND THE TOUCHPAD

Regarding the combination of the HMD and the touchpad: in this user study the user operated while stationary, but miniaturization of the HMD must be considered, because our proposed interface may be used while the user walks around an AR object. By rotating the scope, the interface becomes usable from various angles. We consider it desirable for the scope to rotate in 90-degree steps about the vertical axis: if the scope rotated continuously in response to slight movements of the user, it might rotate during the user's operation. In addition, by adding a function that moves the pointer to the center of the user's view when the user shakes the device, unwanted movement of the pointer can be suppressed. We should also consider how to overlay the AR objects when the physical touch surface is small; by expanding the display area with AR techniques, the problem of a small touch surface might be mitigated (Figure 8).

Figure 8: Expanding the display.

6. CONCLUSION

In this paper, we developed an interface for operating AR objects. Focusing on the two input methods on the touchpad, we studied each method through a user study. In the selection task, the absolute method is superior to the relative method; in the alignment task, there is no difference in ease of use. In the future, it will be necessary to study the difference by touchpad size, and to improve the interface for use while moving around.

REFERENCES

[1] S. Na, M. Billinghurst and W.
Woo, "TMAR: Extension of a Tabletop Interface Using Mobile Augmented Reality," in Transactions on Edutainment I, Lecture Notes in Computer Science Volume 5080, 2008.
[2] S. Hashimoto, A. Ishida, M. Inami and T. Igarashi, "TouchMe: An Augmented Reality Based Remote Robot Manipulation," in Proceedings of the 21st International Conference on Artificial Reality and Telexistence (ICAT 2011), 2011.
[3] A. Marzo, B. Bossavit and O. Ardaiz, "Interacting with multi-touch handheld devices in augmented reality spaces: effects of marker and screen size," in Proceedings of the 13th International Conference on Interacción Persona-Ordenador (INTERACCION '12), 2012.
[4] I. Poupyrev, M. Billinghurst, S. Weghorst and T. Ichikawa, "The go-go interaction technique: non-linear mapping for direct manipulation in VR," in Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology (UIST '96), 1996.
[5] J. D. Mulder, "Remote Object Translation Methods for Immersive Virtual Environments," in Virtual Environments '98, 1998.
[6] H. Kato, K. Tachibana, M. Tanabe, T. Nakajima and Y. Fukuda, "MagicCup: a tangible interface for virtual objects manipulation in table-top augmented reality," in Augmented Reality Toolkit Workshop, IEEE International, 2003.
[7] B. Thomas, W. Piekarski and B. Gunther, "Using Augmented Reality to Visualise Architecture Designs in an Outdoor Environment," in Proc. DCNet'99 (Design Computing on the Net, Nov 30 - Dec 3, University of Sydney, Australia), 1999.
[8] W. Piekarski and B. H. Thomas, "The Tinmith system: demonstrating new techniques for mobile augmented reality modelling," in Australian Computer Science Communications, Volume 24, Issue 4, January-February 2002.
[9] W. Piekarski and B. H. Thomas, "Tinmith-evo5 - An Architecture for Supporting Mobile Augmented Reality Environments," in Proceedings of the IEEE and ACM International Symposium on Augmented Reality (ISAR '01), 2001.
[10] W. Piekarski and B. H. Thomas, "Tinmith-Metro: New Outdoor Techniques for Creating City Models with an Augmented Reality Wearable Computer," in Proceedings of the 5th IEEE International Symposium on Wearable Computers (ISWC '01), 2001.
[11] T. Hoang, S. R. Porter and B. H. Thomas, "Augmenting image plane AR 3D interactions for wearable computers," in Proceedings of the Tenth Australasian Conference on User Interfaces (AUIC '09), 2009.
[12] D. Schmalstieg, L. M. Encarnação and Z. Szalavári, "Using transparent props for interaction with the virtual table," in Proceedings of the 1999 Symposium on Interactive 3D Graphics (I3D '99), 1999.
[13] Z. Szalavári and M. Gervautz, "The Personal Interaction Panel - A Two-Handed Interface for Augmented Reality," in Computer Graphics Forum, 16(3) (Proceedings of EUROGRAPHICS '97, Budapest, Hungary), 1997.
[14] R. Budhiraja, G. A. Lee and M. Billinghurst, "Using a HHD with a HMD for Mobile AR Interaction," in Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2013.
[15] S. Feiner, B. MacIntyre, T. Hollerer and A. Webster, "A Touring Machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment," in Proceedings of the 1st IEEE International Symposium on Wearable Computers (ISWC '97), 1997.

[16] R. Stoakley, M. J. Conway and R. Pausch, "Virtual reality on a WIM: interactive worlds in miniature," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '95), 1995.
[17] V. Maquil, T. Psik, I. Wagner and W. M., "Expressive interactions - supporting collaboration in urban design," in Proceedings of the 2007 International ACM Conference on Supporting Group Work (GROUP '07), 2007.
[18] V. Maquil, M. Sareika, D. Schmalstieg and I. Wagner, "MR Tent: a place for co-constructing mixed realities in urban planning," in Proceedings of Graphics Interface 2009 (GI '09), 2009.
[19] D. C. McCallum and P. Irani, "ARC-Pad: absolute+relative cursor positioning for large displays with a mobile touchscreen," in Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (UIST '09), 2009.
[20] C. Forlines, D. Vogel and R. Balakrishnan, "HybridPointing: fluid switching between absolute and relative pointing with a direct device," in Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology (UIST '06), 2006.
[21] C. Forlines, D. Vogel, N. Kong and R. Balakrishnan, "Absolute vs. Relative Direct Pen Input," Mitsubishi Electric Research Labs Tech Report.
[22] S. P. Smith, E. L. Burd, L. Ma, I. Alagha and A. Hatch, "Relative and Absolute Mappings for Rotating Remote 3D Objects on Multi-Touch Tabletops," in Proceedings of the 25th BCS Conference on Human-Computer Interaction (BCS-HCI '11), 2011.
[23] C. E. Shannon, "Prediction and Entropy of Printed English," reprinted in Claude Elwood Shannon: Collected Papers, N. Sloane and A. Wyner, Eds., 1993.


Augmented Reality- Effective Assistance for Interior Design

Augmented Reality- Effective Assistance for Interior Design Augmented Reality- Effective Assistance for Interior Design Focus on Tangible AR study Seung Yeon Choo 1, Kyu Souk Heo 2, Ji Hyo Seo 3, Min Soo Kang 4 1,2,3 School of Architecture & Civil engineering,

More information

Multimodal Feedback for Finger-Based Interaction in Mobile Augmented Reality

Multimodal Feedback for Finger-Based Interaction in Mobile Augmented Reality Multimodal Feedback for Finger-Based Interaction in Mobile Augmented Reality Wolfgang Hürst 1 1 Department of Information & Computing Sciences Utrecht University, Utrecht, The Netherlands huerst@uu.nl

More information

Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data

Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data Hrvoje Benko Microsoft Research One Microsoft Way Redmond, WA 98052 USA benko@microsoft.com Andrew D. Wilson Microsoft

More information

Occlusion based Interaction Methods for Tangible Augmented Reality Environments

Occlusion based Interaction Methods for Tangible Augmented Reality Environments Occlusion based Interaction Methods for Tangible Augmented Reality Environments Gun A. Lee α Mark Billinghurst β Gerard J. Kim α α Virtual Reality Laboratory, Pohang University of Science and Technology

More information

Implementation of Image processing using augmented reality

Implementation of Image processing using augmented reality Implementation of Image processing using augmented reality Konjengbam Jackichand Singh 1, L.P.Saikia 2 1 MTech Computer Sc & Engg, Assam Downtown University, India 2 Professor, Computer Sc& Engg, Assam

More information

Abstract. Keywords: Multi Touch, Collaboration, Gestures, Accelerometer, Virtual Prototyping. 1. Introduction

Abstract. Keywords: Multi Touch, Collaboration, Gestures, Accelerometer, Virtual Prototyping. 1. Introduction Creating a Collaborative Multi Touch Computer Aided Design Program Cole Anagnost, Thomas Niedzielski, Desirée Velázquez, Prasad Ramanahally, Stephen Gilbert Iowa State University { someguy tomn deveri

More information

Augmented and mixed reality (AR & MR)

Augmented and mixed reality (AR & MR) Augmented and mixed reality (AR & MR) Doug Bowman CS 5754 Based on original lecture notes by Ivan Poupyrev AR/MR example (C) 2008 Doug Bowman, Virginia Tech 2 Definitions Augmented reality: Refers to a

More information

3D and Sequential Representations of Spatial Relationships among Photos

3D and Sequential Representations of Spatial Relationships among Photos 3D and Sequential Representations of Spatial Relationships among Photos Mahoro Anabuki Canon Development Americas, Inc. E15-349, 20 Ames Street Cambridge, MA 02139 USA mahoro@media.mit.edu Hiroshi Ishii

More information

The Mixed Reality Book: A New Multimedia Reading Experience

The Mixed Reality Book: A New Multimedia Reading Experience The Mixed Reality Book: A New Multimedia Reading Experience Raphaël Grasset raphael.grasset@hitlabnz.org Andreas Dünser andreas.duenser@hitlabnz.org Mark Billinghurst mark.billinghurst@hitlabnz.org Hartmut

More information

Smart-Phone Augmented Reality for Public Participation in Urban Planning

Smart-Phone Augmented Reality for Public Participation in Urban Planning Smart-Phone Augmented Reality for Public Participation in Urban Planning M. Allen Information Science University of Otago Dunedin, New Zealand allma658@student.otago.ac.nz ABSTRACT We investigate smart-phone

More information

Occlusion based Interaction Methods for Tangible Augmented Reality Environments

Occlusion based Interaction Methods for Tangible Augmented Reality Environments Occlusion based Interaction Methods for Tangible Augmented Reality Environments Gun A. Lee α, Mark illinghurst β and Gerard Jounghyun Kim α α Virtual Reality Laboratory, Dept. of CSE, POSTECH, Pohang,

More information

Interaction Techniques using Head Mounted Displays and Handheld Devices for Outdoor Augmented Reality

Interaction Techniques using Head Mounted Displays and Handheld Devices for Outdoor Augmented Reality Interaction Techniques using Head Mounted Displays and Handheld Devices for Outdoor Augmented Reality by Rahul Budhiraja A thesis submitted in partial fulfillment of the requirements for the Degree of

More information

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops

Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Beyond Actuated Tangibles: Introducing Robots to Interactive Tabletops Sowmya Somanath Department of Computer Science, University of Calgary, Canada. ssomanat@ucalgary.ca Ehud Sharlin Department of Computer

More information

Eliminating Design and Execute Modes from Virtual Environment Authoring Systems

Eliminating Design and Execute Modes from Virtual Environment Authoring Systems Eliminating Design and Execute Modes from Virtual Environment Authoring Systems Gary Marsden & Shih-min Yang Department of Computer Science, University of Cape Town, Cape Town, South Africa Email: gaz@cs.uct.ac.za,

More information

Beyond: collapsible tools and gestures for computational design

Beyond: collapsible tools and gestures for computational design Beyond: collapsible tools and gestures for computational design The MIT Faculty has made this article openly available. Please share how this access benefits you. Your story matters. Citation As Published

More information

ARK: Augmented Reality Kiosk*

ARK: Augmented Reality Kiosk* ARK: Augmented Reality Kiosk* Nuno Matos, Pedro Pereira 1 Computer Graphics Centre Rua Teixeira Pascoais, 596 4800-073 Guimarães, Portugal {Nuno.Matos, Pedro.Pereira}@ccg.pt Adérito Marcos 1,2 2 University

More information

Application and Taxonomy of Through-The-Lens Techniques

Application and Taxonomy of Through-The-Lens Techniques Application and Taxonomy of Through-The-Lens Techniques Stanislav L. Stoev Egisys AG stanislav.stoev@egisys.de Dieter Schmalstieg Vienna University of Technology dieter@cg.tuwien.ac.at ASTRACT In this

More information

Information Layout and Interaction on Virtual and Real Rotary Tables

Information Layout and Interaction on Virtual and Real Rotary Tables Second Annual IEEE International Workshop on Horizontal Interactive Human-Computer System Information Layout and Interaction on Virtual and Real Rotary Tables Hideki Koike, Shintaro Kajiwara, Kentaro Fukuchi

More information

Augmented Board Games

Augmented Board Games Augmented Board Games Peter Oost Group for Human Media Interaction Faculty of Electrical Engineering, Mathematics and Computer Science University of Twente Enschede, The Netherlands h.b.oost@student.utwente.nl

More information

Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems

Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems Fumihisa Shibata, Takashi Hashimoto, Koki Furuno, Asako Kimura, and Hideyuki Tamura Graduate School of Science and

More information

ISCW 2001 Tutorial. An Introduction to Augmented Reality

ISCW 2001 Tutorial. An Introduction to Augmented Reality ISCW 2001 Tutorial An Introduction to Augmented Reality Mark Billinghurst Human Interface Technology Laboratory University of Washington, Seattle grof@hitl.washington.edu Dieter Schmalstieg Technical University

More information

Occlusion-Aware Menu Design for Digital Tabletops

Occlusion-Aware Menu Design for Digital Tabletops Occlusion-Aware Menu Design for Digital Tabletops Peter Brandl peter.brandl@fh-hagenberg.at Jakob Leitner jakob.leitner@fh-hagenberg.at Thomas Seifried thomas.seifried@fh-hagenberg.at Michael Haller michael.haller@fh-hagenberg.at

More information

Measuring Presence in Augmented Reality Environments: Design and a First Test of a Questionnaire. Introduction

Measuring Presence in Augmented Reality Environments: Design and a First Test of a Questionnaire. Introduction Measuring Presence in Augmented Reality Environments: Design and a First Test of a Questionnaire Holger Regenbrecht DaimlerChrysler Research and Technology Ulm, Germany regenbre@igroup.org Thomas Schubert

More information

Tiles: A Mixed Reality Authoring Interface

Tiles: A Mixed Reality Authoring Interface Tiles: A Mixed Reality Authoring Interface Ivan Poupyrev 1,i, Desney Tan 2,i, Mark Billinghurst 3, Hirokazu Kato 4, 6, Holger Regenbrecht 5 & Nobuji Tetsutani 6 1 Interaction Lab, Sony CSL 2 School of

More information

A Novel Human Computer Interaction Paradigm for Volume Visualization in Projection-Based. Environments

A Novel Human Computer Interaction Paradigm for Volume Visualization in Projection-Based. Environments Virtual Environments 1 A Novel Human Computer Interaction Paradigm for Volume Visualization in Projection-Based Virtual Environments Changming He, Andrew Lewis, and Jun Jo Griffith University, School of

More information

Glove Based User Interaction Techniques for Augmented Reality in an Outdoor Environment

Glove Based User Interaction Techniques for Augmented Reality in an Outdoor Environment Ownership and Copyright Springer-Verlag London Ltd Virtual Reality (2002) 6:167 180 Glove Based User Interaction Techniques for Augmented Reality in an Outdoor Environment Wearable Computer Laboratory,

More information

Future Directions for Augmented Reality. Mark Billinghurst

Future Directions for Augmented Reality. Mark Billinghurst Future Directions for Augmented Reality Mark Billinghurst 1968 Sutherland/Sproull s HMD https://www.youtube.com/watch?v=ntwzxgprxag Star Wars - 1977 Augmented Reality Combines Real and Virtual Images Both

More information

DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface

DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions On and Above the Interactive Surface Hrvoje Benko and Andrew D. Wilson Microsoft Research One Microsoft Way Redmond, WA 98052, USA

More information

Integrating CFD, VR, AR and BIM for Design Feedback in a Design Process An Experimental Study

Integrating CFD, VR, AR and BIM for Design Feedback in a Design Process An Experimental Study Integrating CFD, VR, AR and BIM for Design Feedback in a Design Process An Experimental Study Nov. 20, 2015 Tomohiro FUKUDA Osaka University, Japan Keisuke MORI Atelier DoN, Japan Jun IMAIZUMI Forum8 Co.,

More information

Perceptual Characters of Photorealistic See-through Vision in Handheld Augmented Reality

Perceptual Characters of Photorealistic See-through Vision in Handheld Augmented Reality Perceptual Characters of Photorealistic See-through Vision in Handheld Augmented Reality Arindam Dey PhD Student Magic Vision Lab University of South Australia Supervised by: Dr Christian Sandor and Prof.

More information

Immersive Authoring of Tangible Augmented Reality Applications

Immersive Authoring of Tangible Augmented Reality Applications International Symposium on Mixed and Augmented Reality 2004 Immersive Authoring of Tangible Augmented Reality Applications Gun A. Lee α Gerard J. Kim α Claudia Nelles β Mark Billinghurst β α Virtual Reality

More information

Augmented Reality Interface Toolkit

Augmented Reality Interface Toolkit Augmented Reality Interface Toolkit Fotis Liarokapis, Martin White, Paul Lister University of Sussex, Department of Informatics {F.Liarokapis, M.White, P.F.Lister}@sussex.ac.uk Abstract This paper proposes

More information

Augmented Reality And Ubiquitous Computing using HCI

Augmented Reality And Ubiquitous Computing using HCI Augmented Reality And Ubiquitous Computing using HCI Ashmit Kolli MS in Data Science Michigan Technological University CS5760 Topic Assignment 2 akolli@mtu.edu Abstract : Direct use of the hand as an input

More information

AUGMENTED REALITY: PRINCIPLES AND PRACTICE (USABILITY) BY DIETER SCHMALSTIEG, TOBIAS HOLLERER

AUGMENTED REALITY: PRINCIPLES AND PRACTICE (USABILITY) BY DIETER SCHMALSTIEG, TOBIAS HOLLERER AUGMENTED REALITY: PRINCIPLES AND PRACTICE (USABILITY) BY DIETER SCHMALSTIEG, TOBIAS HOLLERER DOWNLOAD EBOOK : AUGMENTED REALITY: PRINCIPLES AND PRACTICE (USABILITY) BY DIETER SCHMALSTIEG, TOBIAS HOLLERER

More information

Computer Graphics. Spring April Ghada Ahmed, PhD Dept. of Computer Science Helwan University

Computer Graphics. Spring April Ghada Ahmed, PhD Dept. of Computer Science Helwan University Spring 2018 10 April 2018, PhD ghada@fcih.net Agenda Augmented reality (AR) is a field of computer research which deals with the combination of real-world and computer-generated data. 2 Augmented reality

More information

Interaction, Collaboration and Authoring in Augmented Reality Environments

Interaction, Collaboration and Authoring in Augmented Reality Environments Interaction, Collaboration and Authoring in Augmented Reality Environments Claudio Kirner1, Rafael Santin2 1 Federal University of Ouro Preto 2Federal University of Jequitinhonha and Mucury Valeys {ckirner,

More information

VR/AR Concepts in Architecture And Available Tools

VR/AR Concepts in Architecture And Available Tools VR/AR Concepts in Architecture And Available Tools Peter Kán Interactive Media Systems Group Institute of Software Technology and Interactive Systems TU Wien Outline 1. What can you do with virtual reality

More information

Development of A Finger Mounted Type Haptic Device Using A Plane Approximated to Tangent Plane

Development of A Finger Mounted Type Haptic Device Using A Plane Approximated to Tangent Plane Development of A Finger Mounted Type Haptic Device Using A Plane Approximated to Tangent Plane Makoto Yoda Department of Information System Science Graduate School of Engineering Soka University, Soka

More information

Double-side Multi-touch Input for Mobile Devices

Double-side Multi-touch Input for Mobile Devices Double-side Multi-touch Input for Mobile Devices Double side multi-touch input enables more possible manipulation methods. Erh-li (Early) Shen Jane Yung-jen Hsu National Taiwan University National Taiwan

More information

Building a gesture based information display

Building a gesture based information display Chair for Com puter Aided Medical Procedures & cam par.in.tum.de Building a gesture based information display Diplomarbeit Kickoff Presentation by Nikolas Dörfler Feb 01, 2008 Chair for Computer Aided

More information

Mohammad Akram Khan 2 India

Mohammad Akram Khan 2 India ISSN: 2321-7782 (Online) Impact Factor: 6.047 Volume 4, Issue 8, August 2016 International Journal of Advance Research in Computer Science and Management Studies Research Article / Survey Paper / Case

More information

Virtual Environment Interaction Based on Gesture Recognition and Hand Cursor

Virtual Environment Interaction Based on Gesture Recognition and Hand Cursor Virtual Environment Interaction Based on Gesture Recognition and Hand Cursor Chan-Su Lee Kwang-Man Oh Chan-Jong Park VR Center, ETRI 161 Kajong-Dong, Yusong-Gu Taejon, 305-350, KOREA +82-42-860-{5319,

More information

Usability and Playability Issues for ARQuake

Usability and Playability Issues for ARQuake Usability and Playability Issues for ARQuake Bruce Thomas, Nicholas Krul, Benjamin Close and Wayne Piekarski University of South Australia Abstract: Key words: This paper presents a set of informal studies

More information

Augmented Reality: Its Applications and Use of Wireless Technologies

Augmented Reality: Its Applications and Use of Wireless Technologies International Journal of Information and Computation Technology. ISSN 0974-2239 Volume 4, Number 3 (2014), pp. 231-238 International Research Publications House http://www. irphouse.com /ijict.htm Augmented

More information

Interior Design using Augmented Reality Environment

Interior Design using Augmented Reality Environment Interior Design using Augmented Reality Environment Kalyani Pampattiwar 2, Akshay Adiyodi 1, Manasvini Agrahara 1, Pankaj Gamnani 1 Assistant Professor, Department of Computer Engineering, SIES Graduate

More information

A Kinect-based 3D hand-gesture interface for 3D databases

A Kinect-based 3D hand-gesture interface for 3D databases A Kinect-based 3D hand-gesture interface for 3D databases Abstract. The use of natural interfaces improves significantly aspects related to human-computer interaction and consequently the productivity

More information

Development of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane

Development of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane Journal of Communication and Computer 13 (2016) 329-337 doi:10.17265/1548-7709/2016.07.002 D DAVID PUBLISHING Development of a Finger Mounted Type Haptic Device Using a Plane Approximated to Tangent Plane

More information

Interactive intuitive mixed-reality interface for Virtual Architecture

Interactive intuitive mixed-reality interface for Virtual Architecture I 3 - EYE-CUBE Interactive intuitive mixed-reality interface for Virtual Architecture STEPHEN K. WITTKOPF, SZE LEE TEO National University of Singapore Department of Architecture and Fellow of Asia Research

More information

Geo-Located Content in Virtual and Augmented Reality

Geo-Located Content in Virtual and Augmented Reality Technical Disclosure Commons Defensive Publications Series October 02, 2017 Geo-Located Content in Virtual and Augmented Reality Thomas Anglaret Follow this and additional works at: http://www.tdcommons.org/dpubs_series

More information

REPORT ON THE CURRENT STATE OF FOR DESIGN. XL: Experiments in Landscape and Urbanism

REPORT ON THE CURRENT STATE OF FOR DESIGN. XL: Experiments in Landscape and Urbanism REPORT ON THE CURRENT STATE OF FOR DESIGN XL: Experiments in Landscape and Urbanism This report was produced by XL: Experiments in Landscape and Urbanism, SWA Group s innovation lab. It began as an internal

More information

Asymmetries in Collaborative Wearable Interfaces

Asymmetries in Collaborative Wearable Interfaces Asymmetries in Collaborative Wearable Interfaces M. Billinghurst α, S. Bee β, J. Bowskill β, H. Kato α α Human Interface Technology Laboratory β Advanced Communications Research University of Washington

More information

Using Transparent Props For Interaction With The Virtual Table

Using Transparent Props For Interaction With The Virtual Table Using Transparent Props For Interaction With The Virtual Table Dieter Schmalstieg 1, L. Miguel Encarnação 2, and Zsolt Szalavári 3 1 Vienna University of Technology, Austria 2 Fraunhofer CRCG, Inc., Providence,

More information

Industrial Use of Mixed Reality in VRVis Projects

Industrial Use of Mixed Reality in VRVis Projects Industrial Use of Mixed Reality in VRVis Projects Werner Purgathofer, Clemens Arth, Dieter Schmalstieg VRVis Zentrum für Virtual Reality und Visualisierung Forschungs-GmbH and TU Wien and TU Graz Some

More information

Vocabulary Game Using Augmented Reality Expressing Elements in Virtual World with Objects in Real World

Vocabulary Game Using Augmented Reality Expressing Elements in Virtual World with Objects in Real World Open Journal of Social Sciences, 2015, 3, 25-30 Published Online February 2015 in SciRes. http://www.scirp.org/journal/jss http://dx.doi.org/10.4236/jss.2015.32005 Vocabulary Game Using Augmented Reality

More information

Annotation Overlay with a Wearable Computer Using Augmented Reality

Annotation Overlay with a Wearable Computer Using Augmented Reality Annotation Overlay with a Wearable Computer Using Augmented Reality Ryuhei Tenmokuy, Masayuki Kanbara y, Naokazu Yokoya yand Haruo Takemura z 1 Graduate School of Information Science, Nara Institute of

More information

Presenting Past and Present of an Archaeological Site in the Virtual Showcase

Presenting Past and Present of an Archaeological Site in the Virtual Showcase 4th International Symposium on Virtual Reality, Archaeology and Intelligent Cultural Heritage (2003), pp. 1 6 D. Arnold, A. Chalmers, F. Niccolucci (Editors) Presenting Past and Present of an Archaeological

More information

A new user interface for human-computer interaction in virtual reality environments

A new user interface for human-computer interaction in virtual reality environments Original Article Proceedings of IDMME - Virtual Concept 2010 Bordeaux, France, October 20 22, 2010 HOME A new user interface for human-computer interaction in virtual reality environments Ingrassia Tommaso

More information

Towards Usable VR: An Empirical Study of User Interfaces for Immersive Virtual Environments

Towards Usable VR: An Empirical Study of User Interfaces for Immersive Virtual Environments Towards Usable VR: An Empirical Study of User Interfaces for Immersive Virtual Environments Robert W. Lindeman John L. Sibert James K. Hahn Institute for Computer Graphics The George Washington University

More information

preface Motivation Figure 1. Reality-virtuality continuum (Milgram & Kishino, 1994) Mixed.Reality Augmented. Virtuality Real...

preface Motivation Figure 1. Reality-virtuality continuum (Milgram & Kishino, 1994) Mixed.Reality Augmented. Virtuality Real... v preface Motivation Augmented reality (AR) research aims to develop technologies that allow the real-time fusion of computer-generated digital content with the real world. Unlike virtual reality (VR)

More information

Blended UI Controls For Situated Analytics

Blended UI Controls For Situated Analytics Blended UI Controls For Situated Analytics Neven A. M. ElSayed, Ross T. Smith, Kim Marriott and Bruce H. Thomas Wearable Computer Lab, University of South Australia Monash Adaptive Visualisation Lab, Monash

More information

Ubiquitous Home Simulation Using Augmented Reality

Ubiquitous Home Simulation Using Augmented Reality Proceedings of the 2007 WSEAS International Conference on Computer Engineering and Applications, Gold Coast, Australia, January 17-19, 2007 112 Ubiquitous Home Simulation Using Augmented Reality JAE YEOL

More information

COLLABORATION SUPPORT SYSTEM FOR CITY PLANS OR COMMUNITY DESIGNS BASED ON VR/CG TECHNOLOGY

COLLABORATION SUPPORT SYSTEM FOR CITY PLANS OR COMMUNITY DESIGNS BASED ON VR/CG TECHNOLOGY COLLABORATION SUPPORT SYSTEM FOR CITY PLANS OR COMMUNITY DESIGNS BASED ON VR/CG TECHNOLOGY TOMOHIRO FUKUDA*, RYUICHIRO NAGAHAMA*, ATSUKO KAGA**, TSUYOSHI SASADA** *Matsushita Electric Works, Ltd., 1048,

More information

Survey of User-Based Experimentation in Augmented Reality

Survey of User-Based Experimentation in Augmented Reality Survey of User-Based Experimentation in Augmented Reality J. Edward Swan II Department of Computer Science & Engineering Mississippi State University Box 9637 Mississippi State, MS, USA 39762 (662) 325-7507

More information

Collaborative Visualization in Augmented Reality

Collaborative Visualization in Augmented Reality Collaborative Visualization in Augmented Reality S TUDIERSTUBE is an augmented reality system that has several advantages over conventional desktop and other virtual reality environments, including true

More information

Object Impersonation: Towards Effective Interaction in Tablet- and HMD-Based Hybrid Virtual Environments

Object Impersonation: Towards Effective Interaction in Tablet- and HMD-Based Hybrid Virtual Environments Object Impersonation: Towards Effective Interaction in Tablet- and HMD-Based Hybrid Virtual Environments Jia Wang * Robert W. Lindeman HIVE Lab HIVE Lab Worcester Polytechnic Institute Worcester Polytechnic

More information

Analysing Different Approaches to Remote Interaction Applicable in Computer Assisted Education

Analysing Different Approaches to Remote Interaction Applicable in Computer Assisted Education 47 Analysing Different Approaches to Remote Interaction Applicable in Computer Assisted Education Alena Kovarova Abstract: Interaction takes an important role in education. When it is remote, it can bring

More information

Virtual and Augmented Reality for Cabin Crew Training: Practical Applications

Virtual and Augmented Reality for Cabin Crew Training: Practical Applications EATS 2018: the 17th European Airline Training Symposium Virtual and Augmented Reality for Cabin Crew Training: Practical Applications Luca Chittaro Human-Computer Interaction Lab Department of Mathematics,

More information

Bimanual Handheld Mixed Reality Interfaces for Urban Planning

Bimanual Handheld Mixed Reality Interfaces for Urban Planning Bimanual Handheld Mixed Reality Interfaces for Urban Planning Markus Sareika Graz University of Technology Inffeldgasse 16 A-8010 Graz +43 316 873 5076 markus@sareika.de Dieter Schmalstieg Graz University

More information

Mid-term report - Virtual reality and spatial mobility

Mid-term report - Virtual reality and spatial mobility Mid-term report - Virtual reality and spatial mobility Jarl Erik Cedergren & Stian Kongsvik October 10, 2017 The group members: - Jarl Erik Cedergren (jarlec@uio.no) - Stian Kongsvik (stiako@uio.no) 1

More information

A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems

A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems F. Steinicke, G. Bruder, H. Frenz 289 A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems Frank Steinicke 1, Gerd Bruder 1, Harald Frenz 2 1 Institute of Computer Science,

More information

Immersive Training. David Lafferty President of Scientific Technical Services And ARC Associate

Immersive Training. David Lafferty President of Scientific Technical Services And ARC Associate Immersive Training David Lafferty President of Scientific Technical Services And ARC Associate Current Situation Great Shift Change Drive The Need For Training Conventional Training Methods Are Expensive

More information

Application of 3D Terrain Representation System for Highway Landscape Design

Application of 3D Terrain Representation System for Highway Landscape Design Application of 3D Terrain Representation System for Highway Landscape Design Koji Makanae Miyagi University, Japan Nashwan Dawood Teesside University, UK Abstract In recent years, mixed or/and augmented

More information