Study of the touchpad interface to manipulate AR objects


Study of the touchpad interface to manipulate AR objects
Ryohei Nagashima *1, Nobuchika Sakata *2, Shogo Nishida *3 — Osaka University

ABSTRACT
As a system for manipulating AR objects with a touchpad, we propose a relative method in which the user grasps the touchpad while still looking at the real space in which the AR objects exist. Specifically, a pointer is displayed in the AR space, and an AR object is manipulated by moving the pointer according to the relative displacement of the finger on the touchpad held in the hand. We also believe there are situations in which an absolute mapping of the touchpad is better suited to an AR environment. With an absolute mapping, it is hard to operate while looking at the real space in which the AR objects exist, so miniatures of the AR objects are overlaid on the touchpad itself, making operation possible; this lets the user precisely manipulate even AR objects that are out of reach. In this paper, we study the operability of performing operations while looking at the real space in which the AR objects exist versus operations while looking at the touchpad. In a character-input selection task, the absolute method with the user looking at the touchpad was faster than the relative method with the user looking at the real space in which the AR objects exist. In an alignment task, there was no difference in completion time between the two methods.

Keywords: Augmented Reality, Remote Manipulation, Touchpad, HMD.

Index Terms: D.2.2 [Design Tools and Techniques]: User Interfaces; H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities

1. INTRODUCTION
These days, AR is widely used in various fields. By seamlessly overlaying AR objects on the user's view, the user can obtain various information without excessive eye movement.
When interacting with AR objects superimposed on a real-world environment, working while watching the real space in which the AR objects exist reduces the amount of gaze shifting and maintains high immersion. In previous research, AR space viewed through devices such as smartphones and tablets [1, 2, 3] can be operated directly. Wearing an HMD, the AR space and the real space can be associated seamlessly, and with a glove-type device, AR objects can be operated in the same manner as real-world objects. When AR objects are positioned out of reach, arm-extension techniques [4, 5] make operation possible; however, they have several problems, such as reduced flexibility and accuracy.

We therefore propose a relative method for quick and accurate operation of an AR object while grasping the touchpad and still looking at the real space in which the AR objects exist. Specifically, a pointer is displayed in the AR space, and an AR object is manipulated by moving the pointer according to the relative displacement of the finger on the touchpad held in the hand. We also believe there are situations in which an absolute mapping of the touchpad is better suited to AR. Since an absolute mapping is hard to operate while looking at the real space in which the AR objects exist, miniatures of the AR objects are overlaid on the touchpad, making operation possible; this lets the user manipulate AR objects that are out of reach. In this paper, we study the operability of operating while looking at the real space in which the AR objects exist versus operating while looking at the touchpad.

*1 nagashima@nishilab.sys.es.osaka-u.ac.jp
*2 sakata@nishilab.sys.es.osaka-u.ac.jp
*3 nishida@nishilab.sys.es.osaka-u.ac.jp

2. RELATED WORKS
For the regions of VR and AR, we review interfaces for manipulating objects in virtual space and methods for pointing.

2.1. VR/AR INTERFACES
We enumerate interfaces for manipulating virtual objects, not only in AR. MagicCup [6] is a cup-type device, and Tinmith [7, 8, 9, 10, 11] is a glove-type device. Using these devices in AR, an AR object within hand reach can be manipulated accurately, but when the AR object is out of hand reach, precision worsens remarkably. Schmalstieg et al. [12] and Szalavári et al. [13] manipulate a virtual object by defining a plane in the virtual space and using the plane like a touch panel. TouchMe [2] is an interface utilizing a touch panel, and [1, 3] are interfaces utilizing a mobile phone. These interfaces have low flexibility because they do not utilize an HMD. Budhiraja et al. [14] discussed various methods for a system combining a handheld touch panel and an HMD. The Touring Machine [15] pastes annotations onto the real landscape, but it does not address three-dimensional manipulation. World in Miniature (WIM) [16, 17] manipulates virtual objects through a handheld miniature associated with the full-scale scene. In MR Tent [18], physical limitations impose constraints such as being unable to float an AR object in the air.

2.2. INPUT METHOD
We enumerate interfaces concerning relative and absolute methods. ARC-Pad [19] is a pointing interface that manipulates a PC mouse cursor with a smartphone. HybridPointing [20, 21] is a pointing interface for giant screens. Smith et al. [22] discussed relative and absolute mappings for touch-panel interaction with virtual objects in VR space. There is no example adapting these methods to AR.

3. COMBINATION OF THE TOUCHPAD AND HMD
We developed an interface that combines an HMD and a touchpad. Figure 1(a) shows the appearance of the system. Video see-through AR is realized by attaching a monocular head-mounted camera (HMC) between the eyes of the HMD (Figure 1(b)). AR objects are superimposed on the real-time images captured by the HMC. The field of view of the attached camera is much wider than that of the HMD, so objects displayed on the HMD appear smaller than as seen by the HMC. Consequently, by trimming the captured image and using only its center part, we provide viewing through the HMD that is as natural as possible.

We use ArUco, which is based on OpenCV, to set coordinate systems on the touchpad and on the landscape. By recognizing the visual markers surrounding the touchpad and the visual markers on the landscape, the system decides where the coordinate systems are placed. By attaching many markers to the frame of the touchpad (Figure 1(c)), we realized robust tracking of the touchpad surface that tolerates occlusion by the hand while the user manipulates an AR object watching the touchpad.

An AR object can be manipulated quickly and accurately even when it is far from the user, as shown in Figure 1(d), by displaying miniatures of the AR objects on the touchpad. By displaying a pointer on the landscape, manipulation is also possible while watching the landscape. The scope drawn on the landscape indicates the view area of the touchpad, so the user can understand in which direction to move the pointer. When the pointer overlaps an AR object, or the user touches an AR object while watching the touchpad, the touchpad vibrates and a clapping sound is played.

[Figure 1: Concept of the system. (a) Appearance of the system. (b) HMD and HMC: Sony HMZ-T2 (720p resolution, 77-degree horizontal angle of view) with a Sony PlayStation Eye camera (640x480 resolution, 60 fps, 75-degree angle of view). (c) Touchpad: Asus Nexus 7, 7-inch (180 mm) diagonal IPS LCD, 10 multi-touch points; markers on the frame; keyboard-open button, pointer, and scope on screen. (d) AR objects and miniatures of AR objects.]

[Figure 2: Input method. (a) Relative method. (b) Absolute method.]
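The trimming and multi-marker tracking described above lend themselves to two small computations: how much of the camera image to keep so the crop matches the HMD's narrower field of view, and how to keep estimating the touchpad's position when some frame markers are occluded by the hand. The following is a minimal sketch; the function names, FOV values, and per-marker data layout are our assumptions, not the paper's implementation.

```python
import math

def center_crop_fraction(cam_hfov_deg, hmd_hfov_deg):
    """Fraction of the camera image width to keep so that the cropped
    center region spans the HMD's narrower horizontal field of view."""
    return (math.tan(math.radians(hmd_hfov_deg) / 2)
            / math.tan(math.radians(cam_hfov_deg) / 2))

def touchpad_origin(marker_positions):
    """Average the touchpad-origin estimates from all frame markers
    detected this frame; markers occluded by the hand report None."""
    visible = [p for p in marker_positions if p is not None]
    if not visible:
        return None  # touchpad not tracked this frame
    n = len(visible)
    return tuple(sum(axis) / n for axis in zip(*visible))
```

Averaging over whichever markers remain visible is one simple way to get the occlusion robustness the authors describe; a real implementation would average full poses (rotation included), for example via OpenCV's ArUco board facilities.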

4. INPUT METHODS FOR THE TOUCHPAD IN AN AR ENVIRONMENT
There are two touchpad input methods: the relative method and the absolute method. As shown in Figure 2(a), in the relative method the input to the system is the displacement between the current and previous positions of the user's finger. When manipulating while watching the touchpad, we assume the relative method feels wrong to the user, because the position the user touches differs from the position of the pointer. In this interface, therefore, when the user watches the real space in which the AR objects exist, only the relative method is used.

Table 1 shows the procedure for manipulating an AR object with the relative method. The user must first move the pointer onto the target in order to manipulate it. To switch the movable direction of the pointer, the user tilts the touchpad vertically (Figure 3(a2, a3)). For one-finger operation, the gesture is classified as a drag if a move event occurs between the tap-down and tap-up events; if no move event occurs between them, it is classified as a tap.

As shown in Figure 2(b), in the absolute method the input to the system follows the exact touch position of the user's finger. When manipulating while watching the real space in which the AR objects exist, the absolute method is very difficult to use, because the user cannot see the hand or the touch position on the touchpad. In this interface, therefore, when the user watches the touchpad, only the absolute method is used.

Table 2 shows the procedure for manipulating an AR object with the absolute method. Its characteristic feature is that selection and the other operations (e.g., translation, rotation, and scaling) are performed in a single sequence. Tilting the touchpad changes the movable direction of the pointer in the same manner as in the relative method (Figure 3(b2, b3)).

We expect the user to switch methods depending on the task. Specifically, when laying out AR objects, we believe the absolute method is preferable for horizontal alignment, while for vertical alignment there is no difference between the two methods.

Table 1: Relative method.
| Fingers | Operation | Procedure |
| One finger | Select | Move the pointer by touching and dragging until it overlaps the target AR object. |
| One finger | Decide | Tap after selecting the target AR object. |
| Two fingers | Move | Two-finger drag after selecting the target AR object. |
| Two fingers | Rotate | Two-finger rotate after selecting the target AR object. |
| Two fingers | Scale | Two-finger pinch after selecting the target AR object. |

Table 2: Absolute method.
| Fingers | Operation | Procedure |
| One finger | Select and move | Touch the target AR object, and drag it. |
| Two fingers | Select and move | Touch and nip the target AR object, and drag it. |
| Two fingers | Select and rotate | Touch and nip the target AR object, and rotate it. |
| Two fingers | Select and scale | Touch and nip the target AR object, and pinch it. |
| Three fingers | Scope move/scale | Touch nothing and drag/pinch. |

[Figure 3: Outline of the tasks. (a1)/(b1): character input; (a2)/(b2): horizontal alignment; (a3)/(b3): vertical alignment. Measurement items: character input time, horizontal alignment time, vertical alignment time.]
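The two mappings, together with the one-finger tap/drag discrimination described above, can be sketched as follows. The coordinate ranges, the gain parameter, and all names are illustrative assumptions, not the paper's code.

```python
def absolute_map(touch, pad_size, view_size):
    """Absolute method: the pointer lands exactly where the finger
    touches, scaled from touchpad coordinates to view coordinates."""
    return (touch[0] / pad_size[0] * view_size[0],
            touch[1] / pad_size[1] * view_size[1])

def relative_map(pointer, prev_touch, touch, gain=1.0):
    """Relative method: only the displacement between the previous
    and current finger positions moves the pointer."""
    dx = touch[0] - prev_touch[0]
    dy = touch[1] - prev_touch[1]
    return (pointer[0] + gain * dx, pointer[1] + gain * dy)

def classify(moved_between_down_and_up):
    """One-finger gesture: a move event between tap-down and tap-up
    makes the gesture a drag; otherwise it is a tap."""
    return "drag" if moved_between_down_and_up else "tap"
```

Note how the relative mapping never consults the absolute touch position, which is exactly why it remains usable when the user looks at the real space instead of the touchpad.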

5. USER STUDY
We studied the operability arising from the difference in mapping and in where the user looks (the touchpad or the real space in which the AR objects exist). Specifically, 15 subjects carried out a character-input selection task and an annotation-alignment task. After the user study, we asked about the usability of the interface with a questionnaire. The purpose of the selection task is to compare the operability of selection under the absolute and relative mappings. The purpose of the alignment task is to compare the operability of looking at the touchpad versus looking at the real space in which the AR objects exist.

5.1. SELECTION TASK (CHARACTER INPUT) AND ALIGNMENT TASK
Figure 5 shows the appearance of the user study, and Figure 3 shows the outline of the tasks. In the selection task, a word of five characters is displayed at random. The task finishes when the subject presses the Enter key after inputting the five characters correctly; an annotation is then created and added to the AR space. The alignment task is to put the annotation created in the selection task onto an AR object placed at random beforehand, and it is divided into a horizontal alignment task and a vertical alignment task. The subjects always proceeded in the order of the selection task, the horizontal alignment task, and the vertical alignment task. The alignment completion time is measured from the moment the annotation is created; each task's completion time is measured from the start of the trial. We evaluated two patterns:

- Operation while looking at the touchpad.
- Operation while looking at the real space in which the AR objects exist.

Each subject carried out the task 15 times per method. To cancel order effects, the order of the patterns was varied across subjects.

[Figure 5: Appearance of the user study.]

5.2. RESULTS
Figure 4 and Table 3 show the task completion times, Table 4 shows the numbers of typos, Figure 6 shows the probability of a typo, and Figure 7 shows the results of the questionnaire about usability. In the selection task, a t-test shows a significant difference between the relative and absolute methods (p = 0.000 < 0.05). In addition, the median number of typos is 0.00 for both methods, and a t-test also shows a significant difference in typos between the relative and absolute methods (p = 0.000 < 0.05).

Table 3: Completion time in seconds (mean ± S.D.).
| Task | Relative | Absolute |
| Selection | 9.8 ± 2.9 | 7.7 ± 3.5 |
| Horizontal alignment | 5.2 ± 2.5 | 4.6 ± 2.9 |
| Vertical alignment | 3.7 ± 2.0 | 3.6 ± 1.6 |

[Figure 4: Completion times. Selection task: p = .000, difference of the means 2.1 s; horizontal alignment task: p = .015, difference of the means 0.8 s; vertical alignment task: p = .541, difference of the means 0.3 s.]
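Paired comparisons like the t-tests reported here are computed over per-subject completion times. The sketch below implements the standard paired t statistic by hand and runs it on invented per-subject selection times, not the paper's data.

```python
import math

def paired_t(xs, ys):
    """Paired t statistic for two equal-length samples, e.g. one
    completion time per subject under each input method."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Illustrative per-subject selection times in seconds (made up).
relative = [9.1, 10.4, 8.7, 11.2, 9.9, 10.1]
absolute = [7.2, 8.1, 6.9, 8.8, 7.5, 7.9]
t = paired_t(relative, absolute)  # positive: absolute is faster
```

In practice one would use a library routine such as `scipy.stats.ttest_rel` (and `scipy.stats.wilcoxon` for the ordinal questionnaire items), which also returns the p-value.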

In the horizontal alignment task, a t-test shows a significant difference between the relative and absolute methods (p = 0.015 < 0.05). In the vertical alignment task, a t-test shows no difference between the two methods. In the questionnaire, a Wilcoxon signed-rank test shows significant differences between the relative and absolute methods for the speed of the selection task (p = 0.002 < 0.05) and for the accuracy of the selection task (p = 0.013 < 0.05). For the other items, the Wilcoxon signed-rank test shows no difference between the methods.

5.3. DISCUSSION
5.3.1. SELECTION TASK (CHARACTER INPUT)
In terms of character-input speed, the difference of the mean times is approximately 2.1 seconds. In other words, inputting six characters with the absolute method takes about as long as inputting five characters with the relative method. According to [23], the average English word is 4.5 characters long, so for inputting words this time difference is not large. In the questionnaire, many subjects felt that the absolute method was faster than the relative method; with the absolute method, one character can be input in a single operation. Many subjects also felt that the relative method let them hit a key accurately: with the relative method, a key can be entered while confirming the positions of the pointer and the key, whereas with the absolute method it is hard to hit the key because it disappears under the user's fingers.

In summary, for character input, the absolute method with the user looking at the touchpad is superior to the relative method with the user looking at the real space in which the AR objects exist. If the touchpad were smaller, there might be no difference in usability. We recommend the absolute method for character input. However, we recommend the relative method for selecting an AR object, because it allows the user to keep seeing the real space.

Table 4: Number of typos.
| Input method | Median | Mean | S.D. |
| Relative | 0 | 0.04 | 0.21 |
| Absolute | 0 | 0.22 | 0.41 |

[Figure 6: Probability of a typo for each of the 15 users.]

5.3.2. ALIGNMENT TASK
In the horizontal alignment task, the difference of the mean times is approximately 1.4 seconds; however, the questionnaire shows no difference in perceived speed between the methods. The time difference arises because the user can grasp the AR space at a glance when manipulating while looking at the touchpad: the farther the user is from an AR object in the real space, the longer horizontal alignment takes. In the vertical alignment task, the difference of the mean times is approximately 0.1 seconds, and we consider that vertical alignment time does not change even as the user moves away from the AR object in the real space.

This task pasted an AR object onto another AR object (the annotation). If instead an annotation were to be put on a real object, alignment would fail because the real object is not displayed on the touchpad; it is necessary to consider displaying the real environment on the touchpad. We believe a two-dimensional bird's-eye view of the real environment would be good. Displaying a three-dimensional miniature of the actual environment on the touchpad is also conceivable, but the user would probably be confused because the amount of information increases excessively.

In summary, there is no difference in usability between the absolute method with the user looking at the touchpad and the relative method with the user looking at the real space in which the AR objects exist. When grasping the AR space becomes difficult, operating while looking at the touchpad will become superior to operating while looking at the real space. We recommend the relative method for aligning an AR object, because it allows the user to keep seeing the real space.

[Figure 7: Questionnaire results (bad to good) for the selection, horizontal alignment, and vertical alignment tasks; selection-task speed p = .002, selection-task accuracy p = .013.]
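The two-dimensional bird's-eye view proposed above amounts to discarding the height axis and scaling the ground-plane footprint of the environment into the touchpad's pixel rectangle. A minimal sketch, assuming a y-up world coordinate system (all names and bounds are illustrative):

```python
def to_birdseye(world_pos, world_bounds, pad_size):
    """Project a 3D world position (x, y-up, z) onto a top-down 2D
    miniature drawn on the touchpad: height (y) is discarded and the
    ground-plane footprint is scaled into touchpad pixels."""
    (xmin, xmax), (zmin, zmax) = world_bounds
    x, _, z = world_pos
    u = (x - xmin) / (xmax - xmin) * pad_size[0]
    v = (z - zmin) / (zmax - zmin) * pad_size[1]
    return (u, v)
```

Real objects and AR objects would both pass through this same projection, so annotations could be aligned against real objects directly on the touchpad.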

5.4. COMBINATION OF THE HMD AND THE TOUCHPAD
In this user study the user operated the system while stationary, but since our proposed interface may be used while walking around an AR object, miniaturization of the HMD needs to be considered. By rotating the scope, the interface becomes usable from various angles; we consider it desirable to rotate the scope in 90-degree steps around the vertical axis, because if the scope rotated continuously in response to slight movements of the user, it might rotate during an operation. In addition, a function that recenters the pointer in the user's view when the user shakes the device can suppress unwanted pointer movement. We should also consider how to overlay the AR objects when the physical touch surface is small: by expanding the display area with AR techniques, the problem of a small touch surface might be mitigated (Figure 8).

[Figure 8: Expanding the display.]

6. CONCLUSION
In this paper, we developed an interface for operating AR objects. Focusing on the two touchpad input methods, we studied each method through a user study. In the selection task, the absolute method was superior to the relative method; in the alignment task, there was no difference in ease of use. In future work, we need to study the effect of touchpad size and improve the interface for use while moving around.

REFERENCES
[1] S. Na, M. Billinghurst and W. Woo, "TMAR: Extension of a Tabletop Interface Using Mobile Augmented Reality," in Transactions on Edutainment I, Lecture Notes in Computer Science, Volume 5080, 2008, pp. 96-106.
[2] S. Hashimoto, A. Ishida, M. Inami and T. Igarashi, "TouchMe: An Augmented Reality Based Remote Robot Manipulation," in Proceedings of the 21st International Conference on Artificial Reality and Telexistence (ICAT 2011), 2011.
[3] A. Marzo, B. Bossavit and O. Ardaiz, "Interacting with multi-touch handheld devices in augmented reality spaces: effects of marker and screen size," in Proceedings of the 13th International Conference on Interacción Persona-Ordenador (INTERACCION '12), 2012.
[4] I. Poupyrev, M. Billinghurst, S. Weghorst and T. Ichikawa, "The go-go interaction technique: non-linear mapping for direct manipulation in VR," in Proceedings of the 9th Annual ACM Symposium on User Interface Software and Technology (UIST '96), 1996, pp. 79-80.
[5] J. D. Mulder, "Remote Object Translation Methods for Immersive Virtual Environments," in Virtual Environments '98, 1998, pp. 80-89.
[6] H. Kato, K. Tachibana, M. Tanabe, T. Nakajima and Y. Fukuda, "MagicCup: a tangible interface for virtual objects manipulation in table-top augmented reality," in IEEE International Augmented Reality Toolkit Workshop, 2003, pp. 75-76.
[7] B. Thomas, W. Piekarski and B. Gunther, "Using Augmented Reality to Visualise Architecture Designs in an Outdoor Environment," in Proc. DCNet'99 (Design Computing on the Net, Nov 30 - Dec 3, University of Sydney, Australia), 1999.
[8] W. Piekarski and B. H. Thomas, "The Tinmith system: demonstrating new techniques for mobile augmented reality modelling," in Australian Computer Science Communications, Volume 24, Issue 4, January-February 2002, pp. 61-70.
[9] W. Piekarski and B. H. Thomas, "Tinmith-evo5: An Architecture for Supporting Mobile Augmented Reality Environments," in Proceedings of the IEEE and ACM International Symposium on Augmented Reality (ISAR '01), 2001, p. 177.
[10] W. Piekarski and B. H. Thomas, "Tinmith-Metro: New Outdoor Techniques for Creating City Models with an Augmented Reality Wearable Computer," in Proceedings of the 5th IEEE International Symposium on Wearable Computers (ISWC '01), 2001.
[11] T. Hoang, S. R. Porter and B. H. Thomas, "Augmenting image plane AR 3D interactions for wearable computers," in Proceedings of the Tenth Australasian Conference on User Interfaces (AUIC '09), Volume 93, 2009, pp. 9-15.
[12] D. Schmalstieg, L. M. Encarnação and Z. Szalavári, "Using transparent props for interaction with the virtual table," in Proceedings of the 1999 Symposium on Interactive 3D Graphics (I3D '99), 1999, pp. 147-157.
[13] Z. Szalavári and M. Gervautz, "The Personal Interaction Panel: A Two-Handed Interface for Augmented Reality," in Computer Graphics Forum, 16(3) (Proceedings of EUROGRAPHICS '97, Budapest, Hungary), 1997, pp. 335-346.
[14] R. Budhiraja, G. A. Lee and M. Billinghurst, "Using a HHD with a HMD for Mobile AR Interaction," in Proceedings of the 12th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2013), 2013.
[15] S. Feiner, B. MacIntyre, T. Höllerer and A. Webster, "A Touring Machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment," in Proceedings of the 1st IEEE International Symposium on Wearable Computers (ISWC '97), 1997.

[16] R. Stoakley, M. J. Conway and R. Pausch, "Virtual reality on a WIM: interactive worlds in miniature," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '95), 1995, pp. 265-272.
[17] V. Maquil, T. Psik, I. Wagner and W. M., "Expressive interactions: supporting collaboration in urban design," in Proceedings of the 2007 International ACM Conference on Supporting Group Work (GROUP '07), 2007, pp. 69-78.
[18] V. Maquil, M. Sareika, D. Schmalstieg and I. Wagner, "MR Tent: a place for co-constructing mixed realities in urban planning," in Proceedings of Graphics Interface 2009 (GI '09), 2009, pp. 211-214.
[19] D. C. McCallum and P. Irani, "ARC-Pad: absolute+relative cursor positioning for large displays with a mobile touchscreen," in Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (UIST '09), 2009, pp. 153-156.
[20] C. Forlines, D. Vogel and R. Balakrishnan, "HybridPointing: fluid switching between absolute and relative pointing with a direct device," in Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology (UIST '06), 2006, pp. 211-220.
[21] C. Forlines, D. Vogel, N. Kong and R. Balakrishnan, " vs. Direct Pen Input," Mitsubishi Electric Research Labs Tech Report, TR2006-066, 2006.
[22] S. P. Smith, E. L. Burd, L. Ma, I. Alagha and A. Hatch, "Relative and Absolute Mappings for Rotating Remote 3D Objects on Multi-Touch Tabletops," in 25th BCS Conference on Human-Computer Interaction (BCS-HCI '11), 2011.
[23] C. E. Shannon, "Prediction and Entropy of Printed English," reprinted in N. J. A. Sloane and A. D. Wyner (eds.), Claude Elwood Shannon: Collected Papers, IEEE Press, 1993.