Virtual Object Manipulation using a Mobile Phone

Anders Henrysson 1, Mark Billinghurst 2 and Mark Ollila 1
1 NVIS, Linköping University, Sweden {andhe,marol}@itn.liu.se
2 HIT Lab NZ, University of Canterbury, New Zealand mark.billinghurst@hitlabnz.org

Abstract

Augmented Reality (AR) on mobile phones has reached a level of maturity where it can be used as a tool for 3D object manipulation. In this paper we look at user interface issues where an AR-enabled mobile phone acts as an interaction device. We discuss how traditional 3D manipulation techniques apply to this new platform. The high tangibility of the device and its button interface make it interesting to compare manipulation techniques. We describe AR manipulation techniques we have implemented on a mobile phone and present a small pilot study evaluating these methods.

Key words: Augmented Reality, Mobile Phone, Manipulation

1. Introduction

Augmented Reality (AR) is a technology that allows a user to see virtual imagery overlaid on and registered with the real world. Traditionally AR content was viewed through a head mounted display (HMD). Wearing an HMD leaves the user's hands free to interact with the virtual content, either directly or using an input device such as a mouse or digital glove. In recent years AR applications have migrated to other platforms, including Tablet PCs [27], PDAs [28] and mobile phones [18].

The mobile phone is an ideal platform for augmented reality (AR). The current generation of phones has full colour displays, integrated cameras, fast processors and even dedicated 3D graphics chips. Henrysson [10] and Moehring [18] have shown how mobile phones can be used for simple single-user AR applications. In their work they create custom computer vision libraries that allow developers to build video see-through AR applications that run on a mobile phone.

For handheld and mobile phone based AR, the user looks through the screen of the device to view the AR scene and needs at least one hand to hold the device. The user interface for these applications is therefore very different from that of HMD based AR applications. Thus there is a need to conduct research on interaction techniques for handheld AR displays, and to produce formal user studies to evaluate these techniques. This is important because the widespread adoption of mobile phones means that this platform could be one of the dominant platforms for AR applications in the near future.

In this paper we present one of the first mobile phone AR applications in which the user can manipulate the virtual objects being shown. We explore several possible manipulation techniques and conduct a user study to identify which of these techniques is the most usable. The work we present here is not specific to mobile phone AR applications. It is also useful for other AR applications that run in a handheld form factor, and for general 3D graphics applications on phones and handheld devices. The need for effective manipulation techniques is common across a wide variety of application areas.

In the next section we review related work in the area of virtual object manipulation, especially on handheld platforms. We then describe several approaches that we have implemented, and a user study comparing them. Finally we provide some design recommendations and directions for future research.

2. Related Work

The need for methods to select and manipulate virtual objects is basic to many types of graphics applications. Bowman et al. [2] identify three basic object manipulation tasks: selection, positioning, and rotation. They also provide a taxonomy for classifying manipulation techniques. On a desktop user interface, selection is commonly performed with mouse input, and once selected, objects are also manipulated with the mouse. For three dimensional manipulation, the challenge is using 2D mouse input to control 3D virtual object translation and orientation. In a conventional interface, slider controls can be used, with each slider controlling one degree of positional or rotational freedom [4]. Alternatively, 3D widgets can place virtual handles on objects, which can then be translated or rotated by clicking and dragging on the handles [12]. The most challenging interaction on the desktop is setting the orientation of a virtual object. Virtual Sphere [4] and Virtual Trackball [13] techniques can be used to map 2D mouse motions onto the surface of a virtual sphere that surrounds the object being rotated.

Three dimensional object manipulation is more natural in Virtual Reality (VR) environments. In this case users can reach out and grab objects with their hands [23] or employ a variety of ray-casting [3] and interaction-at-a-distance techniques. Some of these techniques can also be employed in AR environments, particularly in HMD based systems. In AR interfaces there is also a close relationship between the virtual imagery and the real world, and so some metaphors, such as Tangible AR interaction methods [14], use real object manipulation to interact with virtual content. Researchers have begun conducting formal user studies with HMD based AR systems, but there has been far less research on handheld interfaces.

There are several examples of handheld AR interfaces where the user interacts with the content rather than just viewing it. For example, in Rekimoto's Transvision interface [25] two users sit across a table and see shared AR content shown on handheld LCD panels. They can select objects by ray casting; once selected, objects are fixed relative to the LCD and can be moved. The ARPAD interface [17] is similar, but it adds a handheld controller to the LCD panel. Selection is performed by positioning virtual cross hairs over the object and hitting a controller button. Once selected, the object is fixed in space relative to the LCD panel and so can be moved by moving the panel. The object can also be rotated using a trackball input device; thus ARPAD decouples translation and rotation. More recently, the Invisible Train [28] uses a PDA to view AR content, and users can select virtual models directly by clicking on the model with a stylus. Similar stylus based selection has been implemented in AR interfaces that run on tablet PCs [27]. Despite these examples of selection and manipulation techniques on handheld displays, there have been no formal usability studies to establish the best methods to use on a handheld device. As we show in the next section, handheld devices are different enough from desktop and immersive VR interfaces that research results gathered from user studies in non-handheld environments may not be applicable.

On the mobile phone there are very few examples of AR applications, and none of them support more than simple object selection and manipulation. For example, Moehring [18] has developed a simple AR viewing application, but no selection or manipulation of virtual objects is possible. Our early work with AR tennis [10] is a mobile phone based collaborative AR application that allows users to hit a virtual ball over a net, but this also does not support more complex interaction. There are also simpler examples of mobile phone games that feature graphics overlaid on video of the real world, although without the 3D registration of graphics that is normal in an AR application. The Siemens Mosquito game [19] shows virtual mosquitos that can be killed with a simple point-and-shoot metaphor. The virtual soccer game KickReal [15] allows people to see a virtual ball superimposed over video of the real world and kick it with their feet, but again there is no 3D object manipulation. Like AR Tennis, the SymBall application [8] allows users to hit balls at each other, although with limited 3D tracking. In none of these cases has there been a formal evaluation of usability.

There have been some efforts to implement non-AR 3D graphics applications on mobile phones. There is a range of games that provide joystick-type control of vehicles and objects in 3D environments; most of the control techniques are adopted from console interaction metaphors. Larsen et al. [16] describe one of the first 3D applications for the mobile phone with more complex object manipulation. This is a brick modeling program where the user selects and moves virtual bricks using the arrow keys on the phone. Once again, there is no evaluation of the usability of the technique.

In contrast, in our work we have developed an AR application that runs on the mobile phone and supports selection, translation and rotation of 3D virtual objects using a variety of techniques adopted from desktop and AR user interfaces. We also evaluate these interaction techniques in a formal user study.
3. Interaction Methods

In order to explore methods for manipulation in AR applications on a mobile phone, we need to consider the appropriate interaction metaphor. There are a number of important differences between using a mobile phone AR interface and a traditional desktop interface, including:

- limited input options (no mouse/keyboard)
- limited screen resolution
- little graphics support
- reduced processing power

Similarly, compared to a traditional HMD based AR system, in an AR application on a phone the display is handheld rather than headworn, and the display and input device are connected. Finally, compared to a PDA, the mobile phone is operated with a one-handed button interface rather than two-handed stylus interaction.

These differences mean that interface metaphors developed for desktop and HMD based systems may not be appropriate for handheld phone based systems. For example, applications developed with a Tangible AR metaphor [14] often assume that the user has both hands free to manipulate physical input devices, which will not be the case with mobile phones. We need to develop input techniques that can be used one handed and rely only on joypad and keypad input.

Since the phone is handheld, we can use the motion of the phone itself to interact with the virtual object. For example, as in ARPAD, we can fix the virtual object relative to the phone and then position objects by moving the phone relative to the real world. Two handed interaction techniques [11] can also be explored: one hand holding the phone and the second a real object on which AR graphics are overlaid. This approach assumes that the phone is like a handheld lens giving a small view into the AR scene. In this case the user may be more likely to move the phone-display than to change their viewpoint relative to the phone. The small form factor of the phone lets us explore more object-based interaction techniques based around motion of the phone itself.

Given these requirements, there are several possible manipulation methods that could be tried. Table 1 shows the techniques we have implemented.

Positioning
A/ Tangible 1: The object is fixed relative to the phone and moves when the user moves the phone. When released, the object position is set to the final translated position while its orientation is reset to its original orientation.
B/ Keypad/Joypad: The selected object is continuously translated in the X, Y or Z directions depending on the buttons currently held down.
C/ Tangible 2: The same as Tangible 1, but the user can use bimanual input, moving both the phone and the object that the phone is tracked relative to.

Rotation
A/ ArcBall [4]: When the phone moves, the relative motion of the phone is used as input to the arcball technique to rotate the currently selected object.
B/ Keypad/Joypad: The object rotates about its own axes according to joypad and keypad input. Left and right joypad input causes rotation left and right about the vertical axis, etc.
C/ Tangible 1: The object is fixed relative to the phone and moves when the user moves the phone. When released, the object orientation is set to the final phone orientation while its position is reset to its original position.
D/ Tangible 2: The same as Tangible 1, but the user can use bimanual input, moving both the phone and the object that the phone is being tracked relative to.

Table 1: Methods for Translation and Rotation

In the Tangible Input cases, the translation and rotation techniques are applied to objects that are selected by positioning virtual cross hairs over them and clicking and holding down the joypad controller. Objects are deselected by releasing the joypad controller. For the Keypad and ArcBall methods the user just has to click on the keypad to start the motion.

In our initial study we wanted to consider positioning and rotation separately, so all of the techniques except the Tangible Input cases separate positioning and rotation. In the Tangible Input cases the virtual model is fixed in space relative to the phone and so can be positioned and rotated at the same time. However, once the person deselects the model, either its rotation or its position is reset to the original value, depending on whether we are conducting a positioning or a rotation user study. A sketch of this attach-and-release transform logic is given below.

In the keypad/joypad method, the objects continuously rotate or translate a fixed amount each fraction of a second while the buttons are pressed. In contrast, when the virtual object is fixed relative to the phone (Tangible Input), the user can move the object as fast as they can move the phone. So the user should be able to translate or rotate the objects faster with the tangible input techniques than with keypad input.
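To make the Tangible Input behaviour concrete, the following is a minimal sketch of the attach-and-release logic. It is an illustration of the technique under stated assumptions, not our actual Symbian code: the Pose type and names such as onSelect and cameraFromWorld are ours, and the tracker is assumed to deliver the camera pose once per frame.

    // Rigid transform: x_out = R * x_in + t
    struct Pose {
        float R[3][3];
        float t[3];
    };

    // Compose two rigid transforms: c = a after b.
    Pose mul(const Pose& a, const Pose& b) {
        Pose c;
        for (int i = 0; i < 3; ++i) {
            for (int j = 0; j < 3; ++j) {
                c.R[i][j] = 0.0f;
                for (int k = 0; k < 3; ++k) c.R[i][j] += a.R[i][k] * b.R[k][j];
            }
            c.t[i] = a.t[i];
            for (int k = 0; k < 3; ++k) c.t[i] += a.R[i][k] * b.t[k];
        }
        return c;
    }

    // Invert a rigid transform: R' = R^T, t' = -R^T t.
    Pose inv(const Pose& a) {
        Pose c;
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j) c.R[i][j] = a.R[j][i];
        for (int i = 0; i < 3; ++i) {
            c.t[i] = 0.0f;
            for (int k = 0; k < 3; ++k) c.t[i] -= c.R[i][k] * a.t[k];
        }
        return c;
    }

    Pose cameraFromWorld;   // marker-relative camera pose, updated by the tracker
    Pose worldFromObject;   // current object pose in the marker (world) frame
    Pose poseAtSelect;      // object pose remembered at selection time
    Pose cameraFromObject;  // object pose frozen in phone (camera) coordinates
    bool selected = false;

    void onSelect() {
        poseAtSelect = worldFromObject;
        // Freeze the object relative to the phone.
        cameraFromObject = mul(cameraFromWorld, worldFromObject);
        selected = true;
    }

    void onFrame() {
        // While selected, the object follows every motion of the phone.
        if (selected)
            worldFromObject = mul(inv(cameraFromWorld), cameraFromObject);
    }

    void onRelease(bool positioningTask) {
        if (positioningTask)
            // Tangible positioning: keep the new translation, restore the rotation.
            for (int i = 0; i < 3; ++i)
                for (int j = 0; j < 3; ++j) worldFromObject.R[i][j] = poseAtSelect.R[i][j];
        else
            // Tangible rotation: keep the new orientation, restore the position.
            for (int i = 0; i < 3; ++i) worldFromObject.t[i] = poseAtSelect.t[i];
        selected = false;
    }

The key design point is that the object pose is cached in camera coordinates at selection time, so no incremental motion tracking is needed: re-expressing that cached pose in world coordinates each frame automatically transfers all six degrees of freedom of phone motion to the object.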
4. Platform

In order to implement an AR application and these various manipulation techniques on the phone, it was necessary to develop a custom low level computer vision library for Symbian based mobile phones. This is described in complete detail in an earlier paper [9]. In this section we provide a brief overview of the computer vision work we have done and then focus on the new code we have developed for the manipulation techniques.

Our mobile phone AR platform is based on our earlier custom port of the ARToolKit computer vision tracking library [1] to the Symbian operating system [10]. ARToolKit can be used to calculate the 3D pose of a camera relative to a single square tracking marker. Although designed for the PC platform, our Symbian port of ARToolKit is able to run on current mobile phones at 6-7 frames per second. Creating the port involved writing an optimized fixed point library for image processing on the phone.

Figure 1 shows our AR application running on the mobile phone. When the square marker is in view, a virtual image appears overlaid on it in the camera view.

Figure 1: Our AR Application on the Mobile Phone
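For readers unfamiliar with ARToolKit, the per-frame tracking step looks roughly like the sketch below. It is written against the desktop ARToolKit C API purely for illustration; the pattern id, threshold and 80 mm marker width are example values, not parameters taken from our port.

    #include <AR/ar.h>

    // One tracking step: find the marker in the camera image and recover
    // the camera pose relative to it as a 3x4 transform (patt_trans).
    void trackFrame(ARUint8* image, int patt_id, double patt_trans[3][4]) {
        ARMarkerInfo* marker_info;
        int marker_num;
        const int thresh = 100;  // binarization threshold (example value)

        // Detect all square markers in the video frame.
        if (arDetectMarker(image, thresh, &marker_info, &marker_num) < 0)
            return;

        // Keep the highest-confidence match for our pattern.
        int best = -1;
        for (int i = 0; i < marker_num; i++) {
            if (marker_info[i].id == patt_id &&
                (best < 0 || marker_info[i].cf > marker_info[best].cf))
                best = i;
        }
        if (best < 0) return;  // marker not visible this frame

        // Compute the marker-relative camera transform; the renderer loads
        // this matrix so virtual content stays registered with the marker.
        double patt_center[2] = {0.0, 0.0};
        arGetTransMat(&marker_info[best], patt_center, 80.0, patt_trans);
    }

On the phone, the main cost of this loop is the image processing inside arDetectMarker, which is why the port required a fixed point reimplementation of those routines.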

In addition to the tracking software we needed graphics application code. The OpenGL library is a powerful graphics API and was the natural starting point for the development of a graphics API for mobile devices. Our graphics application was developed using OpenGL ES, which is a reduced subset of OpenGL 1.3 suitable for low-power, embedded devices. The phone we were developing for, the Nokia 6630, ships with a software implementation of OpenGL ES [21]. All of the manipulation techniques described in section 3 were coded in OpenGL ES and can run on Nokia Series 60 phones with an integrated camera.

For our experiment we used two Nokia 6630 phones. The Nokia 6630 has a 220 MHz processor and an integrated 1.3 megapixel camera. The screen size is 176 x 208 pixels. On these phones the manipulation applications typically ran at 6-7 frames per second with a video capture resolution of 160 x 120 pixels. The object being manipulated is a virtual box with dimensions of 80 x 64 x 32 units.

We implemented two different techniques for translating the object. In the first, the object remains at a fixed transformation relative to the camera while selected. Selection is performed by pressing the joypad. Each object is rendered with a unique alpha value, and selection is accomplished by sampling the alpha value of the central pixel, indicated by a crosshair. When the object is released, a new transform is calculated and its rotation component is set to the identity matrix.

In the second technique the box is translated by pressing keys, two for each dimension. To translate the object in the x-y plane we use the four directions of the joypad, complemented by the 2 and 5 keys for translation along the z-axis. The translation speed is 4 units per frame, yielding a speed of about 30 units per second. At each update an error vector is calculated by subtracting the goal position from the current position. The block is regarded as correctly placed if the length of the error vector is less than 8 units.

For the rotation task we added a second block, since a single block is rotation invariant. We implemented the two most important rotation techniques found in 3D applications, the arcball and rotation around the object axes, along with the isomorphic case where the object is fixed relative to the phone. The arcball allows the user to perform large 3-DOF rotations using small movements. Our arcball was implemented using the code provided by NeHe Productions [20]. In its original use, the mouse pointer manipulates an invisible sphere that contains the object to be rotated. The resulting rotation depends on where on the sphere the user clicked and in which direction the pointer was dragged. In our case the center of the bottom block is projected into screen coordinates, and the crosshair acts as a mouse pointer around which the screen is centered. (A sketch of the underlying screen-to-sphere mapping is given at the end of this section.) Our implementation has some limitations. In particular, the arcball is assumed to be manipulated from roughly the same viewpoint at all times, since its internal rotation is not updated when the camera moves unless clicked. There is also a jumping artifact at the very first click.

For rotation using the keypad, we use the joypad to rotate around the x and z axes, while the 2 and 5 buttons rotate the object around the y-axis. The speed of rotation is 4 degrees per update, i.e. around 30 degrees per second.

The Tangible Input condition attaches the virtual object to the camera just as in the translation case, with the difference that when the object is released it is the translation component that is reset.
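As an illustration of the arcball technique: a 2D point is projected onto a virtual sphere, and the rotation is derived from the vectors to two such points. A minimal sketch, assuming input coordinates already normalized so the sphere is centered at the origin with unit radius (the actual code follows NeHe's tutorial [20]):

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // Map a 2D point onto the arcball sphere. Points outside the sphere
    // are mapped to its silhouette edge.
    Vec3 screenToSphere(float x, float y) {
        Vec3 p = { x, y, 0.0f };
        float d2 = x * x + y * y;
        if (d2 <= 1.0f) {
            p.z = std::sqrt(1.0f - d2);   // point lies on the sphere
        } else {
            float d = std::sqrt(d2);      // outside: project to the edge
            p.x /= d;
            p.y /= d;
        }
        return p;
    }

    // Rotation between the press point and the current drag point:
    // axis from the cross product, angle from the dot product.
    void arcballRotation(Vec3 from, Vec3 to, Vec3& axis, float& angle) {
        axis.x = from.y * to.z - from.z * to.y;
        axis.y = from.z * to.x - from.x * to.z;
        axis.z = from.x * to.y - from.y * to.x;
        float dot = from.x * to.x + from.y * to.y + from.z * to.z;
        if (dot > 1.0f) dot = 1.0f; else if (dot < -1.0f) dot = -1.0f;
        angle = std::acos(dot);  // radians; build a quaternion or rotation
                                 // matrix from this axis/angle pair
    }

In our interface the "drag" comes from the motion of the crosshair relative to the projected block center as the phone moves, rather than from a mouse pointer.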
The error metric for the rotation conditions is the sum of the absolute values of the differences between the elements of the current rotation matrix and the goal matrix. The rotation is regarded as correct if this sum falls below a fixed threshold.

5. User Study

In order to test the usability of the manipulation techniques described earlier, we conducted a study in which users tried to position and orient blocks. The subject sits at a table on which lies a piece of paper with a number of ARToolKit tracking markers printed on it. When the user looks through the phone display at the tracking marker, they see a virtual ground plane with a virtual block on it and a wireframe image of the block.

The study was done in two parts. In the first we tested the following three positioning conditions:

A: Object fixed to the phone (one handed)
B: Button and keypad input
C: Object fixed to the phone (bimanual)

In each case the goal was to select and move the block until it was inside the target wireframe block (see Figure 2). In the bimanual case the user was able to move both the phone and the piece of paper that the virtual model appears attached to. In all other cases the subject was not allowed to move the tracking marker, although they could stand and walk around the table.

Fig. 2: A virtual block and translation target
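The trial completion test combines the two error metrics described above; a minimal sketch follows. The 8 unit position tolerance is the value given in the text; ROT_EPS is a placeholder, since the exact rotation threshold is not stated here.

    #include <cmath>

    const float POS_EPS = 8.0f;  // position tolerance in object units (from the text)
    const float ROT_EPS = 0.5f;  // placeholder; the paper's exact threshold differs

    // Positioning trial: done when the block is within POS_EPS of the goal.
    bool positionedCorrectly(const float cur[3], const float goal[3]) {
        float dx = cur[0] - goal[0];
        float dy = cur[1] - goal[1];
        float dz = cur[2] - goal[2];
        return std::sqrt(dx * dx + dy * dy + dz * dz) < POS_EPS;
    }

    // Rotation trial: done when the element-wise absolute difference between
    // the current and goal rotation matrices is small enough.
    bool rotatedCorrectly(const float curR[3][3], const float goalR[3][3]) {
        float sum = 0.0f;
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j)
                sum += std::fabs(curR[i][j] - goalR[i][j]);
        return sum < ROT_EPS;
    }

In the experiment this check runs every frame, and the block turns yellow as soon as it succeeds, ending the trial.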

In the second part of the experiment we tested the following rotation techniques:

A: Arcball
B: Keypad input for rotation about the object axes
C: Object fixed to the phone (one handed)
D: Object fixed to the phone (bimanual)

For each condition the virtual block was shown inside a wireframe copy, and the goal was to rotate the block until it matched the orientation of the wireframe copy (see Figure 3). In the bimanual case the user was able to rotate the tracking paper in one hand while moving the phone in the other, while in the other conditions the user was not able to move the tracking marker.

Fig. 3: A virtual block and rotation target

When the block was positioned or rotated correctly inside the target wireframe, it changed color to yellow, showing the subject that the trial was over. This was determined by measuring the error in position or orientation and stopping the trial once this error value dropped below a certain threshold.

In both the translation and rotation cases the user was able to practice each condition before trying the experimental task. Once they felt comfortable with the technique, they performed the task three times for each condition, with virtual blocks at different positions and orientations. The order of the conditions was counterbalanced to remove order effects. For each trial we measured the amount of time it took the user to complete the trial and also continuously logged the position or rotation of the block relative to the target. After three trials in one condition we asked the subject to subjectively rate their performance and how easy it was for them to use the manipulation technique. Finally, after all the positioning or orientation conditions were completed, we asked the users to rank them all in order of ease of use and asked them some interview questions.

5.1 Results

We recruited a total of 9 subjects for the user studies, 7 male and 2 female, aged between 22 and 32. None of the subjects had experience with 3D object manipulation on mobile phones, but all of them had used mobile phones before and some of them had played games on their phone.

Positioning

There was a significant difference in the time it took users to position objects depending on the positioning technique they used. Figure 4 shows the average time it took the users to position the virtual block in the wireframe target.

Fig. 4: Average Positioning Times (seconds, for conditions A, B and C)

As can be seen, conditions A and C take less time than the keypad condition (condition B). A one factor ANOVA (F(2,24) = 3.65, P < 0.05) finds a significant difference in task completion times. The users also subjectively preferred condition A.

Subjects were asked to answer the following questions:

Q1: How easy was it for you to position the object?
Q2: How accurately did you think you placed the block?
Q3: How quickly did you think you placed the block?
Q4: How enjoyable was the experience?

using a scale of 1 to 7, where 1 = very easy, 7 = not very easy, etc. Table 2 shows the average results.

Table 2: Subjective Results (average ratings for Q1-Q4 under conditions A, B and C)

As can be seen, the users thought that when the object was fixed to the phone (conditions A and C) it was easier to position the object correctly (Q1), but that they could position the model more accurately (Q2) with the keypad input. A one factor ANOVA finds a near significant difference in the results for Q1 (F(2,24) = 2.88, P = 0.076) and Q2 (F(2,24) = 3.32, P = 0.053).
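The F values reported throughout this section come from standard one factor ANOVAs. As a reference for how such an F statistic is computed, here is a generic sketch (not the analysis code used in the study); with 3 conditions and 27 observations it yields the F(2,24) degrees of freedom reported above.

    #include <vector>

    // One factor ANOVA: F = between-group mean square / within-group mean square.
    // groups[i] holds the measurements (e.g. completion times) for condition i.
    double oneWayAnovaF(const std::vector<std::vector<double> >& groups) {
        int k = (int)groups.size();
        int n = 0;
        double grand = 0.0;
        for (int i = 0; i < k; ++i)
            for (size_t j = 0; j < groups[i].size(); ++j) {
                grand += groups[i][j];
                ++n;
            }
        grand /= n;

        double ssBetween = 0.0, ssWithin = 0.0;
        for (int i = 0; i < k; ++i) {
            double mean = 0.0;
            for (size_t j = 0; j < groups[i].size(); ++j) mean += groups[i][j];
            mean /= groups[i].size();
            ssBetween += groups[i].size() * (mean - grand) * (mean - grand);
            for (size_t j = 0; j < groups[i].size(); ++j)
                ssWithin += (groups[i][j] - mean) * (groups[i][j] - mean);
        }
        double msBetween = ssBetween / (k - 1);  // df1 = k - 1
        double msWithin  = ssWithin / (n - k);   // df2 = n - k
        return msBetween / msWithin;             // compare against F(df1, df2)
    }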

There is, however, a significant difference for the other questions. The users thought they could place the objects more quickly when they were attached to the phone (Q3), and the tangible interfaces were more enjoyable (Q4). A one factor ANOVA finds a significant difference in the results for Q3 (F(2,24) = 5.13, P < 0.05) and Q4 (F(2,24) = 3.47, P < 0.05).

The users were asked to rank the conditions in order of ease of use (1 = easiest, 3 = most difficult). Table 3 shows the average ranking. Conditions A and C are the best ranked conditions. A one factor ANOVA gives a significant difference between conditions (F(2,24) = 5.36, P < 0.05).

Table 3: Average Positioning Rank (1 = highest)

Orientation

There was also a significant difference in the time it took users to orient objects depending on the technique they used. Figure 5 shows the average time it took the users to rotate the virtual block to match the wireframe target.

Fig. 5: Average Rotation Times (seconds, for conditions A, B, C and D)

As can be seen, conditions A (arcball) and B (keypad input) are on average twice as fast as the Tangible Input rotation conditions (C and D). A one factor ANOVA finds a significant difference between these times (F(3,32) = 4.60, P < 0.01).

Subjects were also asked to answer the same survey questions as in the translation task, except that Q1 was changed to: "How easy was it for you to rotate the virtual object?" Table 4 shows the average results.

Table 4: Subjective Rotation Results

There were no significant differences between these survey responses. The subjects thought that the conditions were equally easy to use and enjoyable.

The users were asked to rank the conditions in order of ease of use (1 = easiest, 4 = most difficult). Table 5 shows the average ranking. There is no significant difference between the results (one factor ANOVA, F(3,32) = 0.82, P = 0.49).

Table 5: Ranking of Orientation Results

5.2 Observations

Although the interfaces were designed to be used one handed, it was interesting to observe how they were actually used. When subjects used the keypad or arcball conditions, they would typically hold or steady the phone with their non-dominant hand and push the keys with their dominant hand (see Figure 6), to provide support when pressing the keys. They also typically remained seated, since they did not need to move the phone much to translate or rotate the model. In contrast, in the conditions where the virtual object appeared attached to the phone, the user would hold the phone in one hand (clicking the joypad) and stand up and move around the tracking pattern to get a better view. This was especially true in the rotation case, where they often had to rotate the phone to extreme angles to get the object rotation they wanted. In the bimanual case, users would typically sit and use their non-dominant hand to rotate the target tracking pattern while moving the phone with the other hand (see Figure 7). About half of the subjects in the bimanual translation case chose not to move the paper with their free hand, but almost all did so in the rotation case.

Figure 6: Holding the phone with both hands
Figure 7: A subject using bimanual input

User Feedback

In addition to the survey responses, many users gave additional comments about the experience. Several commented that when the virtual object was attached to the phone they felt like they were holding it, compared to the keypad case where they felt that they were looking at a screen. One user said that when the object was attached to the phone, the phone felt more like a tool; they felt more in control and could use their spatial abilities when manipulating the virtual object. In contrast, those who preferred the keypad liked how it could be used for precise movements, and how they did not need to physically move themselves to rotate the object about its axes.

Some users also commented on a lack of visual feedback about the rotation axis. The block changed color when it was released inside the target, but subjects thought it would have been good for it to change before it was released. They also felt visual cues showing the axis of rotation would be helpful, especially in the case of the arcball. Several of the users also had trouble with the computer vision tracking failing: when part of the marker was covered up, the virtual objects would disappear. However, this was very temporary, and most subjects adapted their behavior to prevent it. Those subjects that did use two handed input said that they felt they had more control, because they could make gross movements with the camera and then fine tune the block position with small movements of the marker.

5.3 Discussion

In this pilot study we have explored a variety of methods for rotation and translation of virtual objects. The results show that using a tangible interface metaphor provides a fast way to position AR objects in a mobile phone interface, because the user just has to move the real phone to where the block should go. The subjects also felt that it was more enjoyable. However, there seems to be little advantage in using our implementation of a tangible interface metaphor for virtual object rotation. When the virtual object is fixed to the phone, the user often has to move both themselves and the phone to rotate the object to the desired orientation, which takes time. Even when the person can use a second hand to rotate the tracking marker, this is still more time consuming than using the arcball or keypad input. One of the main advantages of the keypad is that it rotates the object around only one axis at a time, which makes it easy for the user to understand what the rotation axis is and how to undo any mistakes.

There is also a compromise between speed and accuracy that may affect performance. Tangible input techniques may be fast, but because they provide full six degree of freedom input, they may not be the best methods for precise input. This was shown in the rotation study, where more precise input was needed to correctly align the models.

6. Conclusion

In this paper we have reviewed some of the issues that must be considered when designing AR interfaces for mobile phones, and have presented a pilot study evaluating different types of virtual object manipulation techniques. Our results suggest that virtual object positioning based on physical phone motion can be a valuable technique, but rotation may be better performed through keypad input about constrained axes.

However, this is just a pilot study. In the future we will need to conduct more rigorous studies with different tasks. In particular, we need to explore manipulation when object position is not decoupled from rotation, such as 3D path following.
In this case the need for rapid six degree of freedom input may mean that a tangible interface metaphor for both position and rotation has a significant advantage over other techniques.

7. Acknowledgements

The first author is funded by the Department of Science and Technology at Linköping University, as well as receiving supervisory support from the Swedish National Graduate School in Computer Science.

References

[1] ARToolKit website.
[2] Bowman, D., Kruijff, E., LaViola, J., Poupyrev, I. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2004.
[3] Bowman, D., Hodges, L. An Evaluation of Techniques for Grabbing and Manipulating Remote Objects in Immersive Virtual Environments. In Proceedings of the 1997 ACM Symposium on Interactive 3D Graphics (I3D 97), ACM Press, 1997.
[4] Chen, M., Mountford, S., Sellen, A. A Study in Interactive 3-D Rotation Using 2-D Control Devices. Computer Graphics 22(4), 1988.
[5] Cutting, D., Assad, M., Hudson, A. AR phone: Accessible augmented reality in the intelligent environment. In OZCHI 2003, Brisbane, 2003.
[6] Feiner, S., MacIntyre, B., Höllerer, T., Webster, A. A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment. In Proc. ISWC 97 (First IEEE Int. Symp. on Wearable Computers), Cambridge, MA, 1997.
[7] Geiger, C., Kleinjohann, B., Reimann, C., Stichling, D. Mobile AR4ALL. In ISAR 2001, The Second IEEE and ACM International Symposium on Augmented Reality, New York, 2001.
[8] Hakkarainen, M., Woodward, C. SymBall - Camera driven table tennis for mobile phones. Submitted to ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (ACE 2005), Valencia, Spain, June 2005.
[9] Henrysson, A., Billinghurst, M., Ollila, M. Face to Face Collaborative AR on Mobile Phones. In Proceedings of ISMAR 2005, Vienna, Austria, 2005 (to appear).
[10] Henrysson, A., Ollila, M. UMAR - Ubiquitous Mobile Augmented Reality. In Proc. Third International Conference on Mobile and Ubiquitous Multimedia (MUM 2004), College Park, Maryland, USA, October 27-29, 2004.
[11] Hinckley, K., Pausch, R., Proffitt, D., Patten, J., Kassell, N. Cooperative Bimanual Action. In ACM CHI'97 Conference on Human Factors in Computing Systems, 1997.
[12] Houde, S. Iterative Design of an Interface for Easy 3-D Direct Manipulation. In Proceedings of the 1992 ACM Conference on Human Factors in Computing Systems (CHI 92), ACM Press, 1992.
[13] Hultquist, J. A Virtual Trackball. In Graphics Gems I, Academic Press, 1990.
[14] Kato, H., Billinghurst, M., Poupyrev, I., Tetsutani, N., Tachibana, K. Tangible Augmented Reality for Human Computer Interaction. In Proc. of Nicograph, Nagoya, Japan, 2001.
[15] KickReal website.
[16] Larsen, B., Bærentzen, J., Christensen, N. Using cellular phones to interact with virtual environments. In ACM SIGGRAPH 2002 Conference Abstracts and Applications, 2002.
[17] Mogilev, D., Kiyokawa, K., Billinghurst, M., Pair, J. AR Pad: An Interface for Face-to-face AR Collaboration. In Proc. of the ACM Conference on Human Factors in Computing Systems 2002 (CHI '02), Minneapolis, 2002.
[18] Moehring, M., Lessig, C., Bimber, O. Video See-Through AR on Consumer Cell Phones. In Proc. of the International Symposium on Mixed and Augmented Reality (ISMAR'04), 2004.
[19] Mosquito Hunt. newsdesk_archive/2003/foe03111.html
[20] NeHe Productions: lesson.asp?lesson=48
[21] OpenGL ES website.
[22] Piekarski, W., Thomas, B. H. Tinmith-Hand: Unified User Interface Technology for Mobile Outdoor Augmented Reality and Indoor Virtual Reality. In IEEE Virtual Reality Conference, Orlando, FL, March 2002.
[23] Poupyrev, I., Billinghurst, M., Weghorst, S., Ichikawa, T. The Go-Go Interaction Technique: Non-Linear Mapping for Direct Manipulation in VR. In Proceedings of the 1996 ACM Symposium on User Interface Software and Technology (UIST 96), ACM Press, 1996.
[24] Reitmayr, G., Schmalstieg, D. Mobile Collaborative Augmented Reality. In Proc. ISAR 2001, New York, USA, October 2001.
[25] Rekimoto, J. TransVision: A Hand-held Augmented Reality System for Collaborative Design. In Virtual Systems and Multi-Media (VSMM) '96, 1996.
[26] Thomas, B., Close, B., Donoghue, J., Squires, J., De Bondi, P., Morris, M., Piekarski, W. ARQuake: An Outdoor/Indoor Augmented Reality First Person Application. In Proc. 4th Int'l Symposium on Wearable Computers, Atlanta, GA, USA, October 2000.
[27] Träskbäck, M., Haller, M. Mixed reality training application for an oil refinery: user requirements. In ACM SIGGRAPH International Conference on Virtual Reality Continuum and its Applications in Industry (VRCAI 2004), Singapore, 2004.
[28] Wagner, D., Schmalstieg, D. First steps towards handheld augmented reality. In Proc. of the 7th International Symposium on Wearable Computers (ISWC 2003), White Plains, NY, USA, IEEE Computer Society, 2003.


Testbed Evaluation of Virtual Environment Interaction Techniques Testbed Evaluation of Virtual Environment Interaction Techniques Doug A. Bowman Department of Computer Science (0106) Virginia Polytechnic & State University Blacksburg, VA 24061 USA (540) 231-7537 bowman@vt.edu

More information

Enhancing Shipboard Maintenance with Augmented Reality

Enhancing Shipboard Maintenance with Augmented Reality Enhancing Shipboard Maintenance with Augmented Reality CACI Oxnard, CA Dennis Giannoni dgiannoni@caci.com (805) 288-6630 INFORMATION DEPLOYED. SOLUTIONS ADVANCED. MISSIONS ACCOMPLISHED. Agenda Virtual

More information

Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems

Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems Scalable Architecture and Content Description Language for Mobile Mixed Reality Systems Fumihisa Shibata, Takashi Hashimoto, Koki Furuno, Asako Kimura, and Hideyuki Tamura Graduate School of Science and

More information

ExTouch: Spatially-aware embodied manipulation of actuated objects mediated by augmented reality

ExTouch: Spatially-aware embodied manipulation of actuated objects mediated by augmented reality ExTouch: Spatially-aware embodied manipulation of actuated objects mediated by augmented reality The MIT Faculty has made this article openly available. Please share how this access benefits you. Your

More information

Using Pinch Gloves for both Natural and Abstract Interaction Techniques in Virtual Environments

Using Pinch Gloves for both Natural and Abstract Interaction Techniques in Virtual Environments Using Pinch Gloves for both Natural and Abstract Interaction Techniques in Virtual Environments Doug A. Bowman, Chadwick A. Wingrave, Joshua M. Campbell, and Vinh Q. Ly Department of Computer Science (0106)

More information

Interaction Techniques using Head Mounted Displays and Handheld Devices for Outdoor Augmented Reality

Interaction Techniques using Head Mounted Displays and Handheld Devices for Outdoor Augmented Reality Interaction Techniques using Head Mounted Displays and Handheld Devices for Outdoor Augmented Reality by Rahul Budhiraja A thesis submitted in partial fulfillment of the requirements for the Degree of

More information

Collaborative Visualization in Augmented Reality

Collaborative Visualization in Augmented Reality Collaborative Visualization in Augmented Reality S TUDIERSTUBE is an augmented reality system that has several advantages over conventional desktop and other virtual reality environments, including true

More information

A Novel Human Computer Interaction Paradigm for Volume Visualization in Projection-Based. Environments

A Novel Human Computer Interaction Paradigm for Volume Visualization in Projection-Based. Environments Virtual Environments 1 A Novel Human Computer Interaction Paradigm for Volume Visualization in Projection-Based Virtual Environments Changming He, Andrew Lewis, and Jun Jo Griffith University, School of

More information

Enhanced Virtual Transparency in Handheld AR: Digital Magnifying Glass

Enhanced Virtual Transparency in Handheld AR: Digital Magnifying Glass Enhanced Virtual Transparency in Handheld AR: Digital Magnifying Glass Klen Čopič Pucihar School of Computing and Communications Lancaster University Lancaster, UK LA1 4YW k.copicpuc@lancaster.ac.uk Paul

More information

Are Existing Metaphors in Virtual Environments Suitable for Haptic Interaction

Are Existing Metaphors in Virtual Environments Suitable for Haptic Interaction Are Existing Metaphors in Virtual Environments Suitable for Haptic Interaction Joan De Boeck Chris Raymaekers Karin Coninx Limburgs Universitair Centrum Expertise centre for Digital Media (EDM) Universitaire

More information

Vision-Based Interaction A First Glance at Playing MR Games in the Real-World Around Us

Vision-Based Interaction A First Glance at Playing MR Games in the Real-World Around Us Vision-Based Interaction A First Glance at Playing MR Games in the Real-World Around Us Volker Paelke University of Hannover, IKG Appelstraße 9a 30167 Hannover +49 511 762 2472 Volker.Paelke@ikg.uni-hannover.de

More information

A Survey of Mobile Augmentation for Mobile Augmented Reality System

A Survey of Mobile Augmentation for Mobile Augmented Reality System A Survey of Mobile Augmentation for Mobile Augmented Reality System Mr.A.T.Vasaya 1, Mr.A.S.Gohil 2 1 PG Student, C.U.Shah College of Engineering and Technology, Gujarat, India 2 Asst.Proffesor, Sir Bhavsinhji

More information

Development of A Finger Mounted Type Haptic Device Using A Plane Approximated to Tangent Plane

Development of A Finger Mounted Type Haptic Device Using A Plane Approximated to Tangent Plane Development of A Finger Mounted Type Haptic Device Using A Plane Approximated to Tangent Plane Makoto Yoda Department of Information System Science Graduate School of Engineering Soka University, Soka

More information

Immersive Well-Path Editing: Investigating the Added Value of Immersion

Immersive Well-Path Editing: Investigating the Added Value of Immersion Immersive Well-Path Editing: Investigating the Added Value of Immersion Kenny Gruchalla BP Center for Visualization Computer Science Department University of Colorado at Boulder gruchall@colorado.edu Abstract

More information

3D and Sequential Representations of Spatial Relationships among Photos

3D and Sequential Representations of Spatial Relationships among Photos 3D and Sequential Representations of Spatial Relationships among Photos Mahoro Anabuki Canon Development Americas, Inc. E15-349, 20 Ames Street Cambridge, MA 02139 USA mahoro@media.mit.edu Hiroshi Ishii

More information

An Implementation Review of Occlusion-Based Interaction in Augmented Reality Environment

An Implementation Review of Occlusion-Based Interaction in Augmented Reality Environment An Implementation Review of Occlusion-Based Interaction in Augmented Reality Environment Mohamad Shahrul Shahidan, Nazrita Ibrahim, Mohd Hazli Mohamed Zabil, Azlan Yusof College of Information Technology,

More information

The architectural walkthrough one of the earliest

The architectural walkthrough one of the earliest Editors: Michael R. Macedonia and Lawrence J. Rosenblum Designing Animal Habitats within an Immersive VE The architectural walkthrough one of the earliest virtual environment (VE) applications is still

More information

CSE 165: 3D User Interaction. Lecture #14: 3D UI Design

CSE 165: 3D User Interaction. Lecture #14: 3D UI Design CSE 165: 3D User Interaction Lecture #14: 3D UI Design 2 Announcements Homework 3 due tomorrow 2pm Monday: midterm discussion Next Thursday: midterm exam 3D UI Design Strategies 3 4 Thus far 3DUI hardware

More information

Classifying 3D Input Devices

Classifying 3D Input Devices IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu But First Who are you? Name Interests

More information

Combining Multi-touch Input and Device Movement for 3D Manipulations in Mobile Augmented Reality Environments

Combining Multi-touch Input and Device Movement for 3D Manipulations in Mobile Augmented Reality Environments Combining Multi-touch Input and Movement for 3D Manipulations in Mobile Augmented Reality Environments Asier Marzo, Benoît Bossavit, Martin Hachet To cite this version: Asier Marzo, Benoît Bossavit, Martin

More information

CSE 165: 3D User Interaction. Lecture #11: Travel

CSE 165: 3D User Interaction. Lecture #11: Travel CSE 165: 3D User Interaction Lecture #11: Travel 2 Announcements Homework 3 is on-line, due next Friday Media Teaching Lab has Merge VR viewers to borrow for cell phone based VR http://acms.ucsd.edu/students/medialab/equipment

More information

UMI3D Unified Model for Interaction in 3D. White Paper

UMI3D Unified Model for Interaction in 3D. White Paper UMI3D Unified Model for Interaction in 3D White Paper 30/04/2018 Introduction 2 The objectives of the UMI3D project are to simplify the collaboration between multiple and potentially asymmetrical devices

More information

Understanding OpenGL

Understanding OpenGL This document provides an overview of the OpenGL implementation in Boris Red. About OpenGL OpenGL is a cross-platform standard for 3D acceleration. GL stands for graphics library. Open refers to the ongoing,

More information

AUGMENTED REALITY FOR COLLABORATIVE EXPLORATION OF UNFAMILIAR ENVIRONMENTS

AUGMENTED REALITY FOR COLLABORATIVE EXPLORATION OF UNFAMILIAR ENVIRONMENTS NSF Lake Tahoe Workshop on Collaborative Virtual Reality and Visualization (CVRV 2003), October 26 28, 2003 AUGMENTED REALITY FOR COLLABORATIVE EXPLORATION OF UNFAMILIAR ENVIRONMENTS B. Bell and S. Feiner

More information

3D Data Navigation via Natural User Interfaces

3D Data Navigation via Natural User Interfaces 3D Data Navigation via Natural User Interfaces Francisco R. Ortega PhD Candidate and GAANN Fellow Co-Advisors: Dr. Rishe and Dr. Barreto Committee Members: Dr. Raju, Dr. Clarke and Dr. Zeng GAANN Fellowship

More information

Collaborative Interaction through Spatially Aware Moving Displays

Collaborative Interaction through Spatially Aware Moving Displays Collaborative Interaction through Spatially Aware Moving Displays Anderson Maciel Universidade de Caxias do Sul Rod RS 122, km 69 sn 91501-970 Caxias do Sul, Brazil +55 54 3289.9009 amaciel5@ucs.br Marcelo

More information

Virtual Environment Interaction Based on Gesture Recognition and Hand Cursor

Virtual Environment Interaction Based on Gesture Recognition and Hand Cursor Virtual Environment Interaction Based on Gesture Recognition and Hand Cursor Chan-Su Lee Kwang-Man Oh Chan-Jong Park VR Center, ETRI 161 Kajong-Dong, Yusong-Gu Taejon, 305-350, KOREA +82-42-860-{5319,

More information