Occlusion based Interaction Methods for Tangible Augmented Reality Environments


Gun A. Lee α, Mark Billinghurst β and Gerard Jounghyun Kim α

α Virtual Reality Laboratory, Dept. of CSE, POSTECH, Pohang, Republic of Korea
β HIT Lab NZ, University of Canterbury, Private Bag 4800, Christchurch, New Zealand

Abstract

Traditional Tangible Augmented Reality (Tangible AR) interfaces combine tangible user interface and augmented reality technology, each complementing the other to provide novel interaction methods and real world anchored visualization. However, well known conventional one and two dimensional interaction methods, such as pressing buttons, changing slider values, or selecting menu items, are often quite difficult to apply to Tangible AR interfaces. In this paper we suggest a new approach, occlusion based interaction, in which visual occlusion of physical markers is used to provide intuitive two dimensional interaction in Tangible AR environments. We describe how to implement occlusion based interfaces for Tangible AR environments, give several example applications, and describe results from informal user studies.

Keywords: tangible augmented reality, user interface, occlusion, augmented reality, computer human interaction

CCS Categories: H.5.2 [Information Interfaces and Presentation]: User Interfaces - Interaction styles; I.3.6 [Computer Graphics]: Methodology and Techniques - Interaction techniques

1 Introduction

Augmented Reality (AR) interfaces involve the overlay of virtual imagery on the real world. Over the past decade there has been an evolution in the types of AR interfaces being developed. The earliest systems were used to view virtual models in a variety of application domains, such as medicine and machine maintenance. These interfaces provided a very intuitive method for viewing three-dimensional virtual information, but little support for creating or modifying the AR content. More recently, researchers have begun to address this deficiency.
The AR modeler of Kiyokawa et al. [1999] uses a magnetic tracker to allow people to create AR content, while the Studierstube [Szalavári and Gervautz 1997] project uses a pen and tablet for selecting and modifying AR objects. However, there is still a need for more intuitive interaction techniques. We have been developing a new approach to designing AR interfaces that we refer to as Tangible Augmented Reality [Kato et al. 2001] (Tangible AR). Tangible AR interfaces are those in which 1) each virtual object is registered to a physical object, and 2) the user interacts with virtual objects by manipulating the corresponding physical objects. The physical objects and interactions are just as important as the virtual imagery and provide a very intuitive way to interact with the AR interface. For example, in our Shared Space [Billinghurst et al. 2000] collaborative AR interface, three-dimensional virtual objects appear attached to real playing cards, and several users can manipulate the cards at the same time. When they put related virtual objects next to each other, a simple animation is shown.

Copyright 2004 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) or permissions@acm.org.
The interface was tested by thousands of users, who reported that interaction with the virtual models was very natural and intuitive, and that they could easily collaborate with each other. In a later interface, VOMAR, Kato et al. [2001] showed how more complicated physical interaction techniques could enable a person to arrange virtual furniture in a 3D scene assembly program. Once again, the use of Tangible AR techniques made interaction with the virtual content natural and intuitive.

In these interfaces a computer vision library, ARToolKit [ARToolKit], is used to track the pose of a head-worn camera relative to physical markers. Real objects can be tagged with these markers and used as interaction widgets in AR interfaces. This allows the development of a wide range of interface objects, such as books that have virtual imagery appearing from the real pages [Billinghurst et al. 2001], maps that appear overlaid with virtual terrains [Hedley et al. 2002], or tiles that support rapid prototyping of aircraft cockpits [Poupyrev et al. 2002]. These interfaces provide very natural 3D interaction techniques based on six degree of freedom manipulation of real objects. However, there are times when 1D or 2D interaction techniques are required, such as pushing buttons, moving sliders, or making menu and icon selections. This type of interaction has not been well studied in Tangible AR environments.

In this paper we suggest a new approach for one and two dimensional interaction in Tangible AR interfaces. Our approach is based on camera-based detection of occlusion of physical markers. Occlusion based interaction is a low cost, easy to implement method for 1D and 2D interaction in Tangible AR environments. In the remainder of this paper we first discuss other related methods for 1D and 2D interaction. We then describe our approach to occlusion based interaction and present several examples of occlusion based interfaces.
Finally, we describe feedback from informal user studies and outline directions for future work.

2 Related Work

Although most AR interfaces are concerned with viewing and interacting with 3D virtual imagery, there are a number of AR interfaces that incorporate 2D interface elements and interaction techniques. In wearable computer interfaces such as Piekarski's Tinmith [Piekarski and Thomas 2002] system, 2D interface elements are commonly aligned with the user's viewpoint. In this case, glove based gestures were used to select 2D menu items displayed in screen space. Similarly, Dias et al. suggested the use of 3D pointers for 2D interaction in their MagicRing [Dias et al. 2003] system. Visual markers attached to rings or bracelets were used to recognize 3D gestures of the user. Besides this 3D use, they also proposed displaying 2D interfaces, e.g. a menu system, in a screen stabilized manner and using the 2D projected points of the markers as 2D pointers.

In contrast, Feiner's Windows on the World interface [Feiner et al. 1993] overlays 2D X-windows over the real world in a world-stabilized manner. The windows appeared to float in space relative to magnetic trackers, while the user was able to interact with them using traditional mouse and keyboard input. More recently, the ARGUI [Geiger et al. 2003] system allowed AR applications to be built on a real surface. Virtual 2D windows were attached to tracking markers, and the interaction was with a normal mouse and keyboard. Mouse motions over the AR view were translated to 2D input in the plane of the real surface. In this way traditional 2D applications could be viewed and interacted with in an AR setting.

In these interfaces, mouse, keyboard or other indirect input methods were used to interact with the 2D interface elements. However, in Tangible AR interfaces, seamless interaction between real and virtual elements is a key design principle.
In our work we want to enable the user to interact with the virtual interface elements through direct touch rather than indirectly through mouse and keyboard input. Sensing touch is a well known method for interacting with 2D surfaces, and there are special devices for this purpose, such as touch screens or touch pads. Although some interfaces use large scale touch sensors [Rekimoto 2002], they are still expensive (compared to their working area) and not widely available. Furthermore, it is even harder to apply them in mobile AR environments, which are among the most promising application areas for Tangible AR interfaces.

As an alternative approach, we suggest using occlusion as an interaction tool. In non-AR interfaces, occlusion is commonly used to detect touch input. For example, the Canesta virtual keyboard [Canesta] is a keyboard projected over the real world. A depth sensor integrated into the projector detects when a user occludes the virtual keys. Para Para Paradise [Konami], a commercial arcade dance game from KONAMI Computer Entertainment, also uses occlusion as an interaction method. In order to detect whether the participants' limbs are in the proper position, the dance game machine radiates a number of infrared beams and checks whether they are occluded by the participant's body.

There were also early attempts to use occlusion as an interaction method within augmented reality environments [Billinghurst et al. 2000; Dias et al. 2003; Poupyrev et al. 2000]. In these cases, occlusion was used as a simple way of finishing interactions by hiding the formal markers being tracked. In contrast, in our work we implement more complicated occlusion based interactions. McDonald and Roth [2003] used occlusion to acquire a blob of the hand image, in order to combine traditional hand gesture recognition with 2D augmented interfaces.
They subtracted the known 2D tracking pattern from the camera image to acquire a blob of fingers. Although they used a relatively robust marker tracking method to tolerate partial occlusion of the marker, it still was not sufficient to handle a large amount of occlusion (at least half of the corner points on the tracking pattern were needed for successful tracking). This limited the interaction to a small number of finger gestures: pointing with a single finger blob and selecting when the blob split into two.

In the next section we show how occlusion based interaction can be implemented in AR interfaces. Occlusion based input is an easy way to support one and two dimensional interaction methods, especially within Tangible AR environments. We review these methods and then describe several occlusion based interfaces.

3 Occlusion based Interaction

Two dimensional interaction usually involves a pointer on an interaction surface, and users are provided with an interface tool, such as a mouse or a tablet pen, to move this pointer. When a user moves the pointer on the interaction surface, the object or place over which the pointer lies is determined as the interaction point. There are two approaches to 2D interaction: a pointer centered view and an interaction object centered view. In the pointer centered view, the system tracks the movement of a single pointer and checks whether there is an interaction object beneath it. This approach works well in a traditional desktop graphical user interface. However, it is not easy to apply the same method to Tangible AR environments, where natural interaction methods are vital. In the real world, humans are able to use a variety of objects, or even their bare fingers, as a pointer. In addition, in some situations, such as having multiple participants or using bi-manual interaction, interaction can even involve multiple pointers. Tangible AR interfaces should support these types of input.
User interaction can also be thought of in an object centered manner. Interaction points are usually predefined and their regions are known, so the pointer doesn't have to be tracked all the time; detecting whenever it is over an interaction object (or point) is sufficient. This approach is especially useful for Tangible AR environments, where real world objects are used as pointers and additional visualization of pointers is not necessary. Since the pointers are not continuously tracked, multiple pointers can be treated in the same way as a single pointer. Detecting pointers over an interaction object can be achieved in numerous ways. Active sensors, such as infrared sensors, ultrasound range sensors, or even a camera, can be placed at every interaction point. In comparison, detecting occlusion of tracked objects is a passive way to detect pointing actions. Occlusion of interaction objects can easily be utilized as an interaction method in Tangible AR environments, in which a camera is already available for providing real world views to the users and for tracking the objects of interest with passive formal markers.

In the remaining part of this section we describe how to detect occlusions in Tangible AR environments, and how to utilize these occlusions for interaction.

3.1 Occlusion Detection

Predefined formal markers are widely used for tracking real objects in Tangible AR environments. Although current vision technologies provide robust marker tracking, they can occasionally fail due to bad lighting conditions, motion blur, marker deformation or occlusion. To avoid this problem, vision based tracking systems usually use multiple markers for tracking one object: a number of markers are attached to a single object in a pre-configured spatial relationship. In this way the object can be tracked successfully even if some of the markers in the marker set are not visible. In addition, because the spatial relationships of all the markers are known, the poses of markers that are not visible can be estimated from the markers that are recognized.

By knowing the poses of undetected markers, we can infer why the tracking failed. If the tracking markers are placed on a rigid body surface, tracking failures of some of the markers are mainly due to the marker being out of the view or occluded by other objects, rather than due to poor lighting conditions or motion blur. Lighting conditions and motion blur affect the image globally and cause tracking failure of all markers in the marker set, not just a portion of it. Deformation of markers can also cause partial tracking failure, so we assume that markers are attached to a rigid body surface that prevents deformation. Marker tracking failure is thus reduced to two cases (out of view and occlusion), so to detect occlusions we only need to check whether an undetected marker is within the view volume.
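This two-case reduction is easy to express in code. The following is our own illustrative sketch, not code from the paper; `in_view` stands in for either of the two visibility tests described next:

```python
# Sketch of the two-case reduction (our own naming, not the paper's code).
# Markers sit on a rigid surface, so a non-global tracking miss must be
# either occlusion or the marker leaving the view volume; `in_view` is a
# placeholder for the boundary-marker or marker-projection test.

def classify_marker(marker_id, detected_ids, in_view):
    """Return 'visible', 'occluded' or 'out_of_view' for one marker."""
    if marker_id in detected_ids:
        return "visible"
    return "occluded" if in_view(marker_id) else "out_of_view"
```

For example, with markers 1 and 2 detected, `classify_marker(3, {1, 2}, in_view)` reports "occluded" whenever the visibility test confirms marker 3 should be on screen.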
We use two methods for checking the view dependent visibility of a marker: the boundary marker method and the marker projection method.

3.1.1 The Boundary Marker Method

A simple way to guarantee that a marker is within the view volume is to check the visibility of its neighboring markers. We refer to these neighbor markers as boundary markers, and to a marker being checked for occlusion as an interaction marker. To guarantee that an interaction marker is within the view volume, boundary markers must be carefully placed: the convex hull of the boundary markers must include the interaction marker. For instance, for a single interaction marker, we need at least two boundary markers surrounding it (see Figure 1). By checking that these boundary markers are visible, the interaction marker can be guaranteed to be within the view volume; hence, it is occluded if it is not detected.

Figure 1: Boundary markers around interaction markers.

When multiple interaction markers are placed in a line, neighbors of the interaction marker being tested can also be treated as boundary markers. We refer to these markers as hybrid markers (see Figure 2). The tested marker is within the view volume whenever there is at least one visible boundary (or hybrid) marker on each side. Hybrid markers act both as boundaries and as interaction points themselves. In this way, occlusion of multiple consecutive markers can be detected, while also allowing the boundary markers to be out of the view.

Figure 2: Hybrid markers: the center hybrid marker plays the role of boundary marker for the left hybrid marker.

A set of markers can also be arranged in a grid, forming a 2D matrix of markers. In this case, we can address visibility in the same way as with the linear marker configuration by grouping markers into linear forms in four directions: horizontal, vertical and the two diagonal directions (see Figure 3). Thus, the corner markers act as the boundary markers and the rest are hybrid markers.
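For a linear strip, the boundary and hybrid marker logic amounts to requiring at least one detected marker on each side of the tested one. A minimal sketch, with hypothetical names and marker ids listed in their physical order along the strip:

```python
# Hypothetical sketch of the boundary/hybrid marker test for a linear strip.
# strip_ids lists marker ids in their physical left-to-right order. A marker
# is known to be inside the view volume when at least one marker on each
# side of it is detected (the 1D convex-hull condition); an in-view marker
# that is nevertheless undetected is reported as occluded.

def occluded_markers(strip_ids, detected_ids):
    detected = set(detected_ids)
    occluded = []
    for i, mid in enumerate(strip_ids):
        if mid in detected:
            continue
        left_ok = any(m in detected for m in strip_ids[:i])
        right_ok = any(m in detected for m in strip_ids[i + 1:])
        if left_ok and right_ok:
            occluded.append(mid)
    return occluded
```

For example, with strip `[0, 1, 2, 3, 4]` and only markers 0 and 4 detected, the three middle markers are reported occluded, while an undetected marker at the end of the strip is treated as possibly out of view instead.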
Figure 3: A 2D set of markers.

Although the boundary marker method is simple to implement and works reliably, marker wastage is unavoidable, since additional non-interactable boundary markers are required. Furthermore, interaction is a little difficult because the user has to make sure that enough boundary markers are within the current view. To overcome these problems, we introduce a second method for checking the visibility of markers.

3.1.2 The Estimated Marker Projection Method

The spatial relationships of markers within a marker set are known, so as the marker set is being tracked, the 3D position and orientation of invisible markers relative to the camera can be estimated. Once the 3D pose of an invisible marker is estimated, its 2D projection on the screen can also be predicted (assuming that the camera is carefully calibrated and its internal parameters are known). We can conclude that the marker is within the view volume by simply checking whether the entire projected image of the estimated invisible marker lies within the viewport (see Figure 4).
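A sketch of this visibility test under a simple pinhole camera model (function and parameter names are ours; fx, fy, cx, cy are the calibrated intrinsics). The `scale` factor anticipates the overestimation discussed below: the marker is treated as slightly larger than it really is, so small projection errors do not cause false occlusion reports:

```python
# Illustrative sketch (not the paper's code) of the estimated marker
# projection test. corners_cam holds the 3D corner positions of the
# undetected marker in camera coordinates, estimated from a visible marker
# of the same set. Each corner is inflated about the marker centre
# (scale > 1 overestimates the marker region) and projected with a pinhole
# model; the marker counts as in view only if every corner projects inside
# the viewport.

def marker_in_view(corners_cam, fx, fy, cx, cy, width, height, scale=1.2):
    n = len(corners_cam)
    mx = sum(p[0] for p in corners_cam) / n   # marker centre
    my = sum(p[1] for p in corners_cam) / n
    mz = sum(p[2] for p in corners_cam) / n
    for x, y, z in corners_cam:
        x = mx + scale * (x - mx)             # inflate about the centre
        y = my + scale * (y - my)
        z = mz + scale * (z - mz)
        if z <= 0:                            # behind the camera
            return False
        u = fx * x / z + cx                   # pinhole projection
        v = fy * y / z + cy
        if not (0 <= u < width and 0 <= v < height):
            return False
    return True
```

A marker square 10cm wide, one metre straight ahead of a 640x480 camera, passes the test; shifting it a metre to the side pushes its projection off screen and the test fails, marking the marker out of view rather than occluded.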

Figure 4: The estimated marker projection method: with the visible green marker, projected images of the red and blue markers can be predicted and used for the visibility test.

Compared to the boundary marker method, the estimated marker projection method needs just one visible marker from the marker set to check the visibility and occlusion of undetected markers. As a result, users have to keep only one more marker within view in addition to the marker they are interacting with, so interaction is easier. In addition, since all the markers in the marker set can act as interaction points, no markers are wasted.

Incorrect estimation is one of the problems of the estimated marker projection method. There are two main causes: incorrect tracking of the visible markers, and incorrect projection of the estimated marker due to camera lens distortion. The incorrect projection problem can be compensated for by overestimating the region of the marker being projected. Even after careful camera calibration, lens distortion cannot be fully eliminated, so the projected image of the estimated marker can cover a different region than the actual camera image. By treating a marker as if it were bigger than its actual size, the projected region of the estimated marker can cover the actual region and prevent invisible markers from being falsely identified as occluded.

3.2 Interaction Design

Once we have a method for reliably sensing marker occlusion, we can use it to build novel interaction methods. Occlusion gives sparse, low granularity position sensing, but by applying additional processing we can achieve more useful interactions. In this section, we introduce additional techniques which make occlusion based interactions more useful.

3.2.1 Time-out Constraints

A time-out constraint is a simple but useful method that can be added to occlusion based interactions. The main use of the time-out constraint is to support explicit selection.
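A dwell-timer sketch of this idea (class and parameter names are our own assumptions, not the paper's API): a widget fires once after its marker stays occluded for a dwell period, and, when a repeat interval is given, keeps firing for the auto-repeat uses described below:

```python
# Hypothetical sketch of a time-out constraint. update() is fed once per
# frame with the widget's occlusion state and the current time in seconds;
# it returns True when a selection event should fire. With repeat=None the
# widget fires once per continuous occlusion; with a repeat interval it
# auto-repeats, e.g. for scroll arrows or up-down value buttons.

class DwellWidget:
    def __init__(self, dwell=0.5, repeat=None):
        self.dwell, self.repeat = dwell, repeat
        self.start = None    # time at which the current occlusion began
        self.fired = 0       # events emitted during this occlusion

    def update(self, occluded, now):
        if not occluded:
            self.start, self.fired = None, 0
            return False
        if self.start is None:
            self.start = now
        due = self.dwell + (self.fired * self.repeat if self.repeat else 0)
        if now - self.start >= due and (self.repeat or self.fired == 0):
            self.fired += 1
            return True
        return False
```

Removing the pointer resets the timer, which is what makes accidental brushes over a widget harmless.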
Although pointing can be used to select objects, it is easy to accidentally point to incorrect objects, since objects are scattered over the interaction surface. By using a time-out constraint, users can reliably select objects by staying above an object for a specific time period. Time-out constraints can also be used for repeating discrete input events. Repeating the same input is useful for some interfaces, such as up-down arrows for changing numerical values, or scroll wheels. For these interfaces, instead of making users point repeatedly, input events can be repeated automatically at a certain time interval.

3.2.2 Sub-marker Level Measurement

The measuring granularity of occlusion is basically on a per marker basis, but under certain circumstances, sub-marker level measurements are also possible. For example, when a set of markers is aligned in a line and two consecutive markers are occluded, we can presume that the user is pointing at a point between the two markers. This is especially useful for interfaces dealing with continuous input values, such as sliders and scroll wheels, since users are able to express finer values by placing the pointer between multiple markers. Sub-marker level measurement can also reduce the number of markers needed: since the measurement is made at a finer granularity, the number of markers needed for a certain number of input levels can be cut almost in half.

3.2.3 Tip Point Marker Detection

When markers are aligned in a 2D grid, unintended occlusions may happen in addition to the marker the user is actually pointing at (see Figure 5). To avoid interpreting these unintended occlusions as interactions, we must find the marker the user is actually pointing at (the tip point marker).

Figure 5: Tip point detection problem: from the blob of occluded markers, it is difficult to tell where the user is actually pointing.
To solve this problem, a heuristic method was applied: selecting the top-left marker from among the occluded marker set. Since the user's arm usually enters from the bottom right of the view and approaches toward the top left (for right-handed users), selecting the top-left marker worked well in most situations, especially when users wore a head mounted display with an attached camera. Other traditional vision techniques, such as calculating the principal axis of the occlusion blob, could also be adopted to find the tip point. However, selecting the top-left marker was sufficient for the current implementation.

4 Implementation

The development and testing of occlusion based AR interfaces was carried out on a consumer level PC running Windows XP on a 1.5GHz Athlon processor with 512MB of main memory and a 3D graphics board with an NVIDIA GeForce4 MX chipset. We also ported the system to a Macintosh iBook laptop for future use in mobile AR or wearable computing environments. The iBook was running Mac OS X on a 900MHz G3 processor with 640MB of main memory and an ATI Radeon graphics board.

There are various computer vision methods for detecting and tracking the 3D positions and orientations of square markers. In this study, we used the ARToolKit [ARToolKit] library, a well known

computer vision library for detecting and tracking the 3D pose of square markers relative to the camera. For the image capture device, a Logitech USB web camera was used, capturing 30 frames per second. The capture resolution was set to 320 by 240 pixels and the image was stretched to fit the full screen; OpenGL graphics, however, were drawn at full resolution. The camera was mounted on an i-glasses i-visor head-mounted display with 800x600 pixel resolution. In the head-mounted display the user sees video of the real world with computer graphics overlaid on it, commonly called video see-through Augmented Reality.

The PC platform achieved frame rates between 19 and 30 frames per second (fps), depending on the number of markers being tracked. The iBook averaged 20 fps, except for the 2D grid configuration, where the frame rate dropped to 5 fps when all 35 markers were within the view. Using ARToolKit there were occasional tracking failures. To make sure that these did not affect the system, time filtering was used: only markers that were undetected for several consecutive frames were identified as occluded.

5 Applications

In order to test whether occlusion based interaction techniques worked well, we developed various test applications. The sequence of images in Figure 6 shows how users could interact with a Tangible AR button. The top left image shows a marker set printed on a sheet of paper, making it possible to track the paper and overlay a virtual button on it (top right). Whenever the user touches the virtual button, occluding the center marker, the button changes its color from blue to yellow. To prevent false input, a time-out constraint was applied, and the button was colored red while it was occluded. We also applied the same technique to a 3D mouse (bottom right), which gives button pressing feedback as well as measuring the three-dimensional pose of the mouse.
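The time filtering mentioned above can be sketched as a small per-marker debounce counter (our own illustrative code; the frame threshold is a tuning choice). In a real system its output would still be combined with the in-view tests of Section 3.1:

```python
# Sketch of temporal filtering for occlusion detection: a marker is
# reported occluded only after it has gone undetected for `frames`
# consecutive frames, so single-frame ARToolKit tracking failures are
# ignored rather than misread as touches.

class OcclusionFilter:
    def __init__(self, frames=3):
        self.frames = frames
        self.missed = {}      # marker id -> consecutive undetected frames

    def update(self, all_ids, detected_ids):
        """Feed one frame; return the set of marker ids considered occluded."""
        occluded = set()
        for mid in all_ids:
            if mid in detected_ids:
                self.missed[mid] = 0
            else:
                self.missed[mid] = self.missed.get(mid, 0) + 1
                if self.missed[mid] >= self.frames:
                    occluded.add(mid)
        return occluded
```

With a threshold of three frames at 20-30 fps, the added latency is roughly 100-150ms, which the dwell-based widgets absorb without noticeable effect.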
Users are able to hold the mouse in one hand and press the buttons on it by swiveling their thumbs. The pressed virtual mouse buttons also change their colors from blue to red, giving the user visual feedback of clicking. The 3D pose of the mouse could be used to rotate or translate a separate virtual object.

With a marker set configured in a 1D linear form, we implemented a Tangible AR slider. Figure 7 shows the marker set used for the Tangible AR slider and the virtual slider overlaid on it. In this case the slider is implemented with the marker projection method for occlusion detection, allowing users to interact with the full set of markers (bottom left). The left image in the middle row shows sub-marker level measurement, and the right image shows a user interacting with the slider using a pen instead of a hand. The last image shows how the slider is used for input in a Tangible AR kaleidoscope application. Here users can watch the changing patterns, formed by reflections of tiny colored pieces that fall as the user rotates the kaleidoscope marker. The Tangible AR slider was used for changing the number of mirrors inside the kaleidoscope, from 3 to 9. Six markers, with the boundary marker occlusion detection method and sub-marker level measurement, were used in this case.

Figure 6: Tangible AR button and 3D mouse.

Figure 7: Tangible AR slider and kaleidoscope.

Figure 8 shows another application of a 1D linear marker configuration: a Tangible AR menu bar. The menu bar showed a number of virtual objects with various shapes and colors, and the object chosen by the user was displayed on a separate marker plate. To show more virtual items than the number of markers used, the menu items were scrollable (bottom left) by selecting the scroll arrows. As with the Tangible AR button, a time-out constraint was used for menu item selection. When items are occluded, their size is increased and they are rotated to provide

visual feedback (middle row). The last image shows a user operating the menu bar with the marker plate instead of their fingers.

Figure 8: Tangible AR menu bar.

Using a 2D grid of markers, a Tangible AR board game application (see Figure 9) was built in which users can drag and drop virtual objects over the board. The tip point of the blob of occluded markers (colored green) was found with the simple heuristic of looking for the top-left occluded marker. Objects were picked up and dropped using time-out constraints. Occluded markers were drawn in a semi-transparent fashion to allow users to see their hands together with the virtual objects on the board.

Figure 9: Tangible AR board game.

In addition to these applications, occlusion based interaction techniques could be applied to a wide variety of useful interfaces. For example, sliders could be modified to represent wheels for scrolling tasks and for controlling single axis rotations. Menu bars could be extended into hierarchical menus by grouping a number of menu bars. Through other modifications of the Tangible AR board, we can also build an alphanumeric keypad or a simple calculator (see Figure 10).

Figure 10: Tangible AR calculator and tic-tac-toe game.

A 2D grid of markers can also be used without tip point detection. Figure 11 shows a simple game in which users can push virtual balls with their bare hands. A vector field flowing from the occluded region to the non-occluded region is calculated (drawn as red lines) and the ball movements are accelerated according to this vector field.

Figure 11: Pushing the virtual balls with occlusion based interaction.

6 Discussion

Informal user studies with various applications revealed that occlusion based interaction methods were simple, intuitive and natural to use within the Tangible AR environment. We gathered comments from 6 subjects on using the Tangible AR menu bar application.
On a 7-point (0-6) scale questionnaire about how easy the interface was to learn and use, users gave high scores: an average of 5.6 (σ=0.55) for ease of learning and 5.0 (σ=0.7) for ease of use. The ball pushing game, the tic-tac-toe game (see Figure 10) and the Tangible AR calculator were demonstrated in public. More than 100 people tried them and gave positive feedback on their ease of learning and use. Some early users reported feeling odd about virtual objects being drawn over their hands, especially on the 2D grid of markers. To reduce this problem, the virtual objects on occluded markers were drawn in a semi-transparent style.

The most notable strength of occlusion based interaction is naturalness. Users in Tangible AR environments can interact with virtual objects using real world physical objects, touching, holding, and manipulating them directly with their hands [Kato et al. 2001]. Occlusion based interaction provides a natural and intuitive way for 2D interaction by building on the touching and pointing actions that we naturally use in our everyday lives. The naturalness of the interaction is enhanced by allowing the use of bare hands, in comparison to other approaches [Dias et al. 2003; Feiner et al. 1993; Geiger et al. 2003]. The users of an occlusion based interface are not required to wear or hold

anything, allowing them to act in the same way as they do in the real world. Direct manipulation is another aspect that adds naturalness to occlusion based interaction. Traditional Tangible AR interfaces use tightly coupled yet indirect ways of manipulating virtual objects: users have to manipulate physical objects in order to interact with the virtual objects attached to them. In contrast, occlusion based interaction allows users to directly point at the virtual objects they interact with.

Gesture recognition with skin color detection or background subtraction is another way of supporting bare hand interaction with virtual content. However, occlusion based interfaces have advantages in terms of lower development cost and lower computation requirements. In addition, occlusion based interfaces are robust to the use of different pointers, e.g. pens or other objects instead of bare hands, and users need not learn specific gestures to use the interface. Passive haptic and tactile feedback is another advantage of occlusion based interfaces over the skin color detection approach. By touching and rubbing a physical surface, users can provide more precise input, since the physical surface gives a frame of reference for 2D manipulation.

Since our proposed approach only needs passive visual markers, which can easily be printed on paper, occlusion based interfaces can be installed cheaply in various places. Because of this ubiquitous nature, the approach has strong potential for use in mobile augmented reality systems and wearable computing environments. Various attempts to apply vision based tracking in wearable computing systems [Piekarski and Thomas 2002; Pintaric 2003] could accommodate occlusion based interfaces. Occlusion based interfaces do have the limitation of view dependent interaction, i.e. the interface works only when it is within the view.
However, in practice this limitation does not cause severe problems, since users usually look at the interface surface before they interact with it. Additional cameras could be used to allow interaction with occlusion based interfaces that are out of the user's view.

7 Conclusion and Future Works

In spite of its coarse granularity of interaction, occlusion based interaction is useful for 2D interactions in tangible augmented reality environments, providing simple, easy to use and natural interaction methods with low development cost and computing power requirements. Although the ARToolKit markers were sufficient for this study, finer occlusion sensing would require other types of visual markers. In the future we intend to explore natural texture feature tracking methods [Kato et al. 2003] that allow any image to be used for vision based tracking. We also intend to conduct formal usability studies comparing these techniques with traditional AR input methods, and to investigate how advanced haptic and tactile feedback could be added. By solving other common problems of vision based AR systems, such as occasional failures in marker detection and tracking, occlusion based interfaces could become one of the more promising 2D interaction methods for augmented reality environments.

Acknowledgments

We appreciate the HIT Lab NZ, which provided the research environment for this work. We also acknowledge the Korean Ministry of Education's Brain Korea 21 program and the Korean Ministry of Information and Communication's ITRC program for their support of this joint research between the HIT Lab NZ, University of Canterbury, and the VR Lab at POSTECH.

References

ARTOOLKIT.

BILLINGHURST, M., POUPYREV, I., KATO, H. and MAY, R. 2000. Mixing Realities in Shared Space: An Augmented Reality Interface for Collaborative Computing. In Proceedings of ICME 2000, IEEE.

BILLINGHURST, M., KATO, H. and POUPYREV, I. 2001. The MagicBook - Moving Seamlessly between Reality and Virtuality.
IEEE Computer Graphics and Applications 21, 3, 6-8.

CANESTA. Canesta keyboard.

DIAS, J. M. S., SANTOS, P. and NANDE, P. 2003. In Your Hand Computing: Tangible Interfaces for Mixed Reality. In Proceedings CD of 2nd IEEE International Augmented Reality Toolkit Workshop, Waseda Univ., Tokyo, Japan.

FEINER, S., MACINTYRE, B., HAUPT, M. and SOLOMON, E. 1993. Windows on the world: 2D windows for 3D augmented reality. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST), Atlanta, Georgia, U.S.A., ACM.

GEIGER, C., OPPERMANN, L. and REIMANN, C. 2003. 3D-Registered Interaction-Surfaces in Augmented Reality Space. In Proceedings CD of 2nd IEEE International Augmented Reality Toolkit Workshop, Waseda Univ., Tokyo, Japan.

HEDLEY, N., BILLINGHURST, M., POSTNER, L., MAY, R. and KATO, H. 2002. Explorations in the use of Augmented Reality for Geographic Visualization. Presence 11, 2, MIT Press.

KATO, H., BILLINGHURST, M., POUPYREV, I., IMAMOTO, K. and TACHIBANA, K. 2000. Virtual Object Manipulation on a Table-Top AR Environment. In Proceedings of the International Symposium on Augmented Reality (ISAR 2000), Munich, Germany.

KATO, H., BILLINGHURST, M., POUPYREV, I., TETSUTANI, N. and TACHIBANA, K. 2001. Tangible Augmented Reality for Human Computer Interaction. In Proceedings of Nicograph 2001, Nagoya, Japan.

KATO, H., TACHIBANA, K., BILLINGHURST, M. and GRAFE, M. 2003. A Registration Method based on Texture Tracking using ARToolKit. In Proceedings CD of 2nd IEEE International Augmented Reality Toolkit Workshop, Waseda Univ., Tokyo, Japan.

KIYOKAWA, K., TAKEMURA, H. and YOKOYA, N. 1999. A Collaboration Support Technique by Integrating a Shared Virtual Reality and a Shared Augmented Reality. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (SMC 99), 6, Tokyo, Japan, IEEE.

KONAMI. Para-Para-Paradise game machine.

MCDONALD, C. and ROTH, G. 2003. Replacing a Mouse with Hand Gesture in a Plane-Based Augmented Reality System.
In Proceedings of the 16th International Conference on Vision Interface, Halifax, Canada.

PIEKARSKI, W. and THOMAS, B. H. 2002. Using ARToolKit for 3D Hand Position Tracking in Mobile Outdoor Environments. In Proceedings CD of 1st International Augmented Reality Toolkit Workshop, Darmstadt, Germany.

PINTARIC, T. 2003. An Adaptive Thresholding Algorithm for the Augmented Reality Toolkit. In Proceedings CD of 2nd IEEE International Augmented Reality Toolkit Workshop, Waseda Univ., Tokyo, Japan.

POUPYREV, I., BERRY, R., KURUMISAWA, J., NAKAO, K. and BILLINGHURST, M. 2000. Augmented Groove: Collaborative Jamming in Augmented Reality. In Proceedings of ACM SIGGRAPH 2000 Conference Abstracts and Applications, ACM Press / ACM SIGGRAPH, 77.

POUPYREV, I., TAN, D. S., BILLINGHURST, M., KATO, H., REGENBRECHT, H. and TETSUTANI, N. 2002. Developing a Generic Augmented Reality Interface. IEEE Computer 35, 3.

REKIMOTO, J. 2002. SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces. In Proceedings of CHI 02, Minneapolis, Minnesota, U.S.A., ACM.

SZALAVÁRI, Z. and GERVAUTZ, M. 1997. The Personal Interaction Panel - A Two-Handed Interface for Augmented Reality. In Proceedings of EUROGRAPHICS 97, Computer Graphics Forum, 16, 3.


More information

Geo-Located Content in Virtual and Augmented Reality

Geo-Located Content in Virtual and Augmented Reality Technical Disclosure Commons Defensive Publications Series October 02, 2017 Geo-Located Content in Virtual and Augmented Reality Thomas Anglaret Follow this and additional works at: http://www.tdcommons.org/dpubs_series

More information

Augmented Keyboard: a Virtual Keyboard Interface for Smart glasses

Augmented Keyboard: a Virtual Keyboard Interface for Smart glasses Augmented Keyboard: a Virtual Keyboard Interface for Smart glasses Jinki Jung Jinwoo Jeon Hyeopwoo Lee jk@paradise.kaist.ac.kr zkrkwlek@paradise.kaist.ac.kr leehyeopwoo@paradise.kaist.ac.kr Kichan Kwon

More information

AUGMENTED REALITY APPLICATIONS USING VISUAL TRACKING

AUGMENTED REALITY APPLICATIONS USING VISUAL TRACKING AUGMENTED REALITY APPLICATIONS USING VISUAL TRACKING ABSTRACT Chutisant Kerdvibulvech Department of Information and Communication Technology, Rangsit University, Thailand Email: chutisant.k@rsu.ac.th In

More information

Haptic control in a virtual environment

Haptic control in a virtual environment Haptic control in a virtual environment Gerard de Ruig (0555781) Lourens Visscher (0554498) Lydia van Well (0566644) September 10, 2010 Introduction With modern technological advancements it is entirely

More information

Augmented Reality Lecture notes 01 1

Augmented Reality Lecture notes 01 1 IntroductiontoAugmentedReality Lecture notes 01 1 Definition Augmented reality (AR) is a live, direct or indirect, view of a physical, real-world environment whose elements are augmented by computer-generated

More information

SolidWorks Tutorial 1. Axis

SolidWorks Tutorial 1. Axis SolidWorks Tutorial 1 Axis Axis This first exercise provides an introduction to SolidWorks software. First, we will design and draw a simple part: an axis with different diameters. You will learn how to

More information

Annotation Overlay with a Wearable Computer Using Augmented Reality

Annotation Overlay with a Wearable Computer Using Augmented Reality Annotation Overlay with a Wearable Computer Using Augmented Reality Ryuhei Tenmokuy, Masayuki Kanbara y, Naokazu Yokoya yand Haruo Takemura z 1 Graduate School of Information Science, Nara Institute of

More information

VEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu

More information

Classifying 3D Input Devices

Classifying 3D Input Devices IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu Motivation The mouse and keyboard

More information

Tangible User Interface for CAVE TM based on Augmented Reality Technique

Tangible User Interface for CAVE TM based on Augmented Reality Technique Tangible User Interface for CAVE TM based on Augmented Reality Technique JI-SUN KIM Thesis submitted to the Faculty of the Virginia Polytechnic Institute and State University in partial fulfillment of

More information

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright

E90 Project Proposal. 6 December 2006 Paul Azunre Thomas Murray David Wright E90 Project Proposal 6 December 2006 Paul Azunre Thomas Murray David Wright Table of Contents Abstract 3 Introduction..4 Technical Discussion...4 Tracking Input..4 Haptic Feedack.6 Project Implementation....7

More information

Microsoft Scrolling Strip Prototype: Technical Description

Microsoft Scrolling Strip Prototype: Technical Description Microsoft Scrolling Strip Prototype: Technical Description Primary features implemented in prototype Ken Hinckley 7/24/00 We have done at least some preliminary usability testing on all of the features

More information

Natural Gesture Based Interaction for Handheld Augmented Reality

Natural Gesture Based Interaction for Handheld Augmented Reality Natural Gesture Based Interaction for Handheld Augmented Reality A thesis submitted in partial fulfilment of the requirements for the Degree of Master of Science in Computer Science By Lei Gao Supervisors:

More information

A TELE-INSTRUCTION SYSTEM FOR ULTRASOUND PROBE OPERATION BASED ON SHARED AR TECHNOLOGY

A TELE-INSTRUCTION SYSTEM FOR ULTRASOUND PROBE OPERATION BASED ON SHARED AR TECHNOLOGY A TELE-INSTRUCTION SYSTEM FOR ULTRASOUND PROBE OPERATION BASED ON SHARED AR TECHNOLOGY T. Suenaga 1, M. Nambu 1, T. Kuroda 2, O. Oshiro 2, T. Tamura 1, K. Chihara 2 1 National Institute for Longevity Sciences,

More information

INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY. Augmented Reality-An Emerging Technology

INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY. Augmented Reality-An Emerging Technology [Lotlikar, 2(3): March, 2013] ISSN: 2277-9655 IJESRT INTERNATIONAL JOURNAL OF ENGINEERING SCIENCES & RESEARCH TECHNOLOGY Augmented Reality-An Emerging Technology Trupti Lotlikar *1, Divya Mahajan 2, Javid

More information

Usability and Playability Issues for ARQuake

Usability and Playability Issues for ARQuake Usability and Playability Issues for ARQuake Bruce Thomas, Nicholas Krul, Benjamin Close and Wayne Piekarski University of South Australia Abstract: Key words: This paper presents a set of informal studies

More information

Interior Design with Augmented Reality

Interior Design with Augmented Reality Interior Design with Augmented Reality Ananda Poudel and Omar Al-Azzam Department of Computer Science and Information Technology Saint Cloud State University Saint Cloud, MN, 56301 {apoudel, oalazzam}@stcloudstate.edu

More information

Exploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity

Exploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity Exploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity Adiyan Mujibiya The University of Tokyo adiyan@acm.org http://lab.rekimoto.org/projects/mirage-exploring-interactionmodalities-using-off-body-static-electric-field-sensing/

More information