Avatar: a virtual reality based tool for collaborative production of theater shows

Christian Dompierre and Denis Laurendeau
Computer Vision and System Lab., Laval University, Quebec City, QC, Canada, G1K 7P4
cdompier@gel.ulaval.ca, denis.laurendeau@gel.ulaval.ca

Abstract

One of the more important limitations of current tools for performing arts production and design is that collaboration between designers is hard to achieve: designers must be co-located to collaborate in the design of a show, something that is not always possible. While teleconference tools could partially solve this problem, they offer no direct interactivity and no synchronization between designers, and they suffer from inherent limitations such as perspective effects and a single viewpoint constrained by the camera. Specialized software for performing arts design (e.g. "Life Forms") generally does not provide real-time collaboration and is not really convenient for collaborative work; such systems are also often expensive and complex to operate. A solution combining concepts from virtual reality, network technology, and computer vision has therefore been developed specifically for collaborative work by performing arts designers. This paper presents the resulting virtual reality application for supporting distributed collaborative production of theater shows. Among other constraints, this application has to ensure that the virtual scene shared by multiple designers is always kept in sync (by use of computer vision) with a real counterpart and that this synchronization is achieved in real-time. Also, system cost must be kept as low as possible, platform independence must be achieved whenever possible and, since it is to be used by people who are not computer experts, the application has to be user-friendly.

Introduction

In March 2004, the LANTISS laboratory (Laboratoire des nouvelles technologies de l'image, du son et de la scène) was created at Laval University in Quebec. The mission of this laboratory is to foster the use of information technology in performing arts design. In this context, a research project called The Virtual Mockup has been developed. The main objective of this project is to provide performing arts directors with tools for distributed interactive design of live performances. To reach this goal, an application called Avatar, combining virtual reality, network technology, and computer vision, was built. This application provides a versatile yet simple to use environment for collaborative performance design. Directors usually exploit a miniature model of a scene, called a mockup ("castelet" in French), in order to design performances. Our application does not aim at replacing this popular traditional tool, which directors appreciate for various reasons; rather, Avatar augments it with a virtual component, called the virtual mockup, that can be shared by several directors in a collaborative performance design. Designers thus still use the real mockup, but now use it as an input/output peripheral for the virtual mockup. With Avatar, directors design a performance as they usually do using a real mockup, but can now also count on a virtual replica of the scene that is synchronized with it and that can be shared by directors located at different geographical locations.
Using Avatar, directors (local and remote) can visualize in 3D the scene that is being designed on the real mockup. In addition, directors at remote locations can modify the virtual mockup to collaborate in the design of the show. These changes are then transposed to the real mockup to keep all representations of the scene in sync. An important constraint in this project is that this synchronization between the virtual mockup and the real mockup be implemented in real-time in order to be efficient. Other constraints are that system cost be kept low (e.g. by the use of open source support tools) and that platform independence be as wide as possible, since different designers could use different operating systems. This paper describes the different components of Avatar that allow interactive collaborative design of performing arts performances using both a real mockup and a shared virtual mockup of a scene. The different tools that were available for implementing Avatar are described first, followed by the solution that was adopted.

The paper then describes the collaborative aspects of Avatar that allow distributed collaborative work for design in performing arts. Although this approach is not unique to Avatar, the paper describes the constraints to which Avatar is subjected and the solutions that are proposed to face the challenges specific to performing arts design. The validity of the solution is supported by experimental results. Finally, the improvements that could be brought to Avatar to increase its performance are presented.

Avatar as a Virtual Reality Application

Before being a performing arts support tool, Avatar is first a Virtual Reality application. Consequently, it is very important to design Avatar using tools that are adapted to VR in order to avoid reinventing the wheel. With respect to graphics rendering of the virtual mockup, several rendering engines are currently available. The Visualization Toolkit (VTK) [1] has been chosen among existing graphics rendering tools, first because it supports stereoscopic visualization and, secondly, because it allows the development of platform independent applications. VTK is also an open source package that is readily available. Since Avatar is to be used by people who are not computer specialists, its Graphical User Interface (GUI) has to be simple and easy to use. Qt [2] has thus been chosen to implement Avatar since it eases the development of GUIs and is platform independent. Qt is also available as an open source library for most operating systems. The VTK_Qt API developed by Carsten Kübler [3] has been used in order to allow VTK and Qt to be combined in a single application. This also allows maintaining the open source flavor of Avatar, which is entirely developed in C++ and currently runs on Windows, Linux, and Mac OS X. In order to offer directors the opportunity to visualize the virtual mockup in 3D through stereoscopy, Avatar supports two different stereo visualization modes: active visualization with stereo goggles (CrystalEyes) and cheaper anaglyphs when cost is an issue. Avatar is an open source application that is available for download on SourceForge [4]. Now that the development environment on which Avatar is based has been described, the next section describes how a real mockup can be used as an input device for collaborative performing arts design using a virtual mockup.

Using a real mockup as an input device for interactive performing arts design

A major objective of Avatar is to allow directors to use a traditional mockup as a Virtual Reality input device for the virtual mockup that is being shared by remote directors participating in the design. Consequently, Avatar must allow the use of figurines of actors in the real mockup as VR peripheral devices for editing a scene. This implies that figurines in the real mockup must be associated with their virtual counterparts in the virtual mockup. This association between real and virtual figurines of actors is achieved by using computer vision. In Avatar, real figurines manipulated by a director are tracked by a cheap webcam and their pose is computed in real-time. The poses are then applied to the corresponding virtual figurines in order to maintain the coherence between the real and virtual mockups. Several approaches have been proposed in the literature for pose estimation in the context of Augmented Reality [6] [19].
Some approaches [7] [8] are based on global pose estimation techniques: they use object features for estimating pose and require a training phase of the object recognition / pose estimation system. Other techniques adopt a local approach for pose estimation and again use feature points that are matched to a priori models of the objects to be tracked. The challenge is to find reliable feature points that can be used for robust pose estimation. Shi and Tomasi [9] describe a criterion for choosing reliable feature points. Local techniques are generally more robust than global techniques since they tolerate partial occlusion of the objects for which pose must be estimated, especially when 3D pose must be estimated from 2D images [10] [11]. The major problem with natural object features is that image noise makes matching features with the stored model more difficult [12] [11]. Several approaches have been proposed for solving the feature-matching problem. For instance, State et al. [13] combine feature-based pose estimation with magnetic pose tracking. Park, You and Neumann [14] [15] use artificial targets for pose tracking initialization and update. The use of artificial markers improves the robustness of pose estimation significantly. Some techniques even rely on artificial markers only, since such markers are easier to segment. Typical markers are color-coded targets [16] [17] or other types of easily identifiable targets [5] [18] [20]. In Avatar, ARToolkit [5] has been used for estimating the pose of real figurines using 2D images provided by a webcam observing the figurines manipulated in the real mockup by the director.
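ARToolkit and the way its output drives the virtual figurines are described in more detail below. As a concrete illustration, the following sketch shows what a per-frame marker-tracking loop of this kind looks like with the classic ARToolkit C API (arDetectMarker, arGetTransMat). The Figurine structure, the applyPoseToVirtualFigurine helper and the omitted camera/video initialization (arInitCparam, arVideoOpen, arLoadPatt) are assumptions made for this example and are not taken from the Avatar source code.

/* Sketch of a per-frame marker-tracking loop with the classic ARToolkit C API.
   Camera and pattern initialization are assumed to have been done elsewhere. */
#include <AR/ar.h>
#include <AR/video.h>

struct Figurine {
    int    pattId;            /* ARToolkit pattern id glued on the real figurine */
    double width;             /* printed marker width, in millimetres            */
    double trans[3][4];       /* last estimated pose (rotation + translation)    */
};

/* Hypothetical hook: in Avatar this would update the corresponding virtual
   figurine in the rendered scene (and feed the networking layer).           */
void applyPoseToVirtualFigurine(const Figurine& f) { (void)f; }

void trackFigurines(Figurine* figurines, int nbFigurines, int threshold)
{
    ARUint8* image = arVideoGetImage();             /* grab a webcam frame   */
    if (image == NULL) return;                      /* no new frame yet      */

    ARMarkerInfo* markers;
    int           nbMarkers;
    if (arDetectMarker(image, threshold, &markers, &nbMarkers) < 0) return;

    for (int i = 0; i < nbFigurines; ++i) {
        /* keep the most confident detection of this figurine's pattern */
        int best = -1;
        for (int j = 0; j < nbMarkers; ++j)
            if (markers[j].id == figurines[i].pattId &&
                (best < 0 || markers[j].cf > markers[best].cf))
                best = j;
        if (best < 0) continue;                     /* marker occluded       */

        double center[2] = { 0.0, 0.0 };
        arGetTransMat(&markers[best], center, figurines[i].width,
                      figurines[i].trans);          /* 3D pose from 2D image */
        applyPoseToVirtualFigurine(figurines[i]);
    }
}

In Avatar, the resulting transformation would be applied to the corresponding actor in the VTK scene and, as described in the next section, forwarded to the other participants.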

ARToolkit consists of a library of C functions for the development of augmented reality applications. It allows real-time pose tracking of objects to be achieved using artificial targets. Since it relies on very simple computer vision principles, ARToolkit allows objects to be tracked in real-time, a prerequisite in Avatar. In Avatar, artificial markers (i.e. targets) are glued to the real figurines that are manipulated by the director. By observing the targets with a webcam, the pose of the figurines is computed in real-time and applied to the corresponding virtual figurines, as illustrated in Figure 1. The figurines in the real mockup are thus used as input devices for maintaining the pose of their virtual counterparts in the virtual mockup. In addition, it is not required that the virtual figurines look exactly the same as the real figurines. Consequently, figurines in the real mockup can be simple blocks while the virtual figurines may be complex actors that are rendered in stereo in the GUI of remote directors.

Figure 1: A scene in the real mockup can be recreated in a virtual mockup in Avatar by tracking the pose of actors using a low cost camera.

Support of Collaborative Work in Avatar

As described above, the figurines of the real mockup are used, through the input interface provided by ARToolkit, to change the state of the virtual mockup. Since Avatar allows several users to share the same virtual mockup, a client-server architecture has been adopted to allow the virtual mockup to be shared. In Avatar, an instance of the virtual mockup is maintained in perfect synchronization with the real mockup on the server node. Clients have their own copy of the instance stored on the server, and any change brought to the virtual mockup on the server is automatically sent to the copies on the clients, which are updated to reflect these changes. In addition, changes brought by remote directors to the copies on the clients are sent to the server and broadcast to the other clients. These changes can be the displacement of virtual figurines, etc. While this solution may at first seem sufficient, it in fact raises serious problems with the use of real figurines, since the manipulation of a figurine by a director does not impose the displacement of its counterpart in other real mockups. The virtual mockup would then have to hold multiple configurations, one per real mockup, at the same time. Enabling the use of several real mockups to control the state of the shared virtual mockup in a collaborative design thus raises some problems. In Avatar, the solution is that only the director using the server of the application has access to a real mockup of the scene. Otherwise, it would be very complex to attempt to register more than one real mockup and to keep these multiple instances in sync with each other. Imposing the constraint of having one single real mockup is not really a problem since: i) the shared virtual mockup is in sync with the real mockup and thus allows remote directors to have a replica of the real mockup in their own virtual world, and ii) real mockups occupy a lot of space and must be stored in a storeroom when not in use, so maintaining more than one real mockup is more expensive and not efficient.
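The client-server exchange described above reduces to a small protocol: the server holds the reference copy of the virtual mockup and pushes pose updates to every connected client, while clients send their own modifications to the server for re-broadcast. The sketch below illustrates this idea with a plain pose-update message and an in-memory broadcast loop; the message layout, the MockupServer class and the std::function transport are assumptions made for illustration, not the actual Avatar networking code.

// Sketch of the pose-update broadcast performed by the server
// (message layout and transport are illustrative assumptions).
#include <cstdint>
#include <functional>
#include <utility>
#include <vector>

struct PoseUpdate {
    std::uint32_t figurineId;   // which virtual figurine changed
    double        position[3];  // translation in mockup coordinates
    double        rotation[3];  // orientation (e.g. Euler angles, in degrees)
};

class MockupServer {
public:
    using SendFn = std::function<void(const PoseUpdate&)>;

    // Each connected client registers a callback wrapping its connection.
    void addClient(SendFn send) { clients_.push_back(std::move(send)); }

    // Called whenever the server's copy of the virtual mockup changes,
    // either from the tracked real mockup or from a client request.
    void broadcast(const PoseUpdate& update, int excludeClient = -1) {
        for (std::size_t i = 0; i < clients_.size(); ++i)
            if (static_cast<int>(i) != excludeClient)
                clients_[i](update);   // push the new pose to this client
    }

private:
    std::vector<SendFn> clients_;
};

A client that moves a figurine would send the same PoseUpdate to the server, which applies it to its own copy and re-broadcasts it to the other clients, excluding the sender.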
However, even with the single real mockup solution, Avatar still faces the problem of keeping the real mockup in sync with the virtual mockup when remote directors modify the state of the virtual mockup. Some way of maintaining the synchronization between the shared virtual mockup and the (single) real mockup must therefore be found. In addition, since the virtual figurines always correspond to real figurines whose poses are tracked by the camera observing the real mockup, a virtual figurine is instantly brought back to the pose of its real figurine even if its position has just been changed in the virtual mockup and momentarily updated in all instances of the virtual mockup. Clients directly moving virtual figurines in their instance of the virtual mockup therefore only induce a very short-lived change, since the state of the scene is instantly reset to the state of the real mockup (and this reset is instantly broadcast to the copies of the virtual mockup stored on the client nodes). The solution proposed to solve these problems is to attach two identical virtual figurines (in the server's copy of the virtual mockup) to each real figurine in the real mockup. One of these two virtual figurines is called the pseudo-virtual figurine and the second is called the purely virtual figurine.

The pose of the pseudo-virtual figurine is always tied to the pose of the real figurine in the real mockup, while the pose of the purely virtual figurine in the virtual mockup is shared with the copies of the same figurine on the client nodes (which only have one copy of each figurine). When the pose of a figurine is changed on a client node, the new pose is sent to the server, which updates the pose of its copy of the purely virtual figurine and sends this update to the other client nodes. Then, the director on the server node moves the real figurine in the real mockup until the pose of the pseudo-virtual figurine, computed by ARToolkit, matches the pose of the purely virtual figurine (shared with the clients). Once both poses match within an acceptable error, the pose of the purely virtual figurine on the server is snapped to the pose of the pseudo-virtual figurine and the final pose is broadcast to the client nodes, which are then in perfect sync with the server. Figure 2 illustrates the solution implemented in Avatar to allow collaborative work in conjunction with the use of a real mockup.

Figure 2: Avatar allows multiple users to share a virtual world even with the use of real figurines as input peripherals. In this example, a client first moves a virtual figurine in the virtual mockup. This change is automatically sent to the server (and then broadcast to the other clients) and is applied to the purely virtual actor in the server's instance of the virtual mockup. After moving the real figurine (and thus the pseudo-virtual actor) until it reaches the right pose, the purely virtual actor is snapped to the pseudo-virtual one and its pose is corrected. This modification is then broadcast to all clients to ensure that every instance of the virtual mockup is in sync with the real mockup.
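The snapping step described above boils down to comparing the tracked pose of the pseudo-virtual figurine with the shared pose of the purely virtual figurine and, once they agree within a tolerance, adopting the tracked pose as the final one. The sketch below gives a minimal version of this test; the pose representation, the tolerance values and the function names are assumptions made for illustration rather than code taken from Avatar.

// Sketch of the pose comparison used to decide when the purely virtual
// figurine can be snapped to the pseudo-virtual (tracked) one.
#include <algorithm>
#include <cmath>

struct Pose {
    double position[3];   // translation in mockup coordinates
    double rotation[3];   // orientation as Euler angles, in degrees
};

// Returns true when both poses agree within the given tolerances
// (angle wrap-around is ignored in this simplified sketch).
bool posesMatch(const Pose& tracked, const Pose& shared,
                double posTolerance = 5.0 /* mm  */,
                double rotTolerance = 5.0 /* deg */)
{
    double dPos = 0.0, dRot = 0.0;
    for (int i = 0; i < 3; ++i) {
        const double dp = tracked.position[i] - shared.position[i];
        dPos += dp * dp;
        dRot = std::max(dRot, std::fabs(tracked.rotation[i] - shared.rotation[i]));
    }
    return std::sqrt(dPos) <= posTolerance && dRot <= rotTolerance;
}

// Server-side step: once the director has moved the real figurine close
// enough, the shared pose is snapped to the tracked pose.
bool trySnap(const Pose& pseudoVirtual, Pose& purelyVirtual)
{
    if (!posesMatch(pseudoVirtual, purelyVirtual)) return false;
    purelyVirtual = pseudoVirtual;   // adopt the tracked pose
    return true;                     // caller broadcasts the final pose
}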

Results

Figure 3: Some elements of the GUI of Avatar.

Designed to be used by people who are not computer specialists, Avatar presents a user-friendly GUI that is easy to understand and use. Figure 3 shows some of the options available in the GUI of Avatar. It is easy to build a scene in Avatar: all one needs to do is add actors to the virtual world. Actors can then be positioned anywhere in the virtual world. When an actor is added to the scene (by any user), it is added to each instance of the shared virtual mockup. The user on the server node also needs to tell Avatar which marker this new actor is associated with in the real mockup if he wants to use this physical tool to design the scene. Users have a lot of freedom when working with Avatar. For example, users can change the background color, use a reference grid and apply any texture they wish, move, rotate and scale actors at will, move the camera, decide whether or not to use one of the two built-in stereo modes, etc. As a result, designers can build a complete scene very easily with Avatar. Figure 4 shows an example of a scene (from the movie The Matrix Reloaded (2003)) designed with Avatar.

Figure 4: Example of a scene designed with Avatar.

One of the possibilities offered to directors while designing a performance with Avatar is to save custom viewpoints and return to them, or to one of the standard viewpoints predefined in Avatar, as shown in Figure 5.
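Both the built-in stereo modes and the viewpoint mechanism map naturally onto standard VTK calls. The sketch below shows one way these GUI options could be wired to a vtkRenderWindow and its active vtkCamera; the Viewpoint structure and the function names are assumptions made for illustration and are not taken from the Avatar source code.

// Sketch: toggling the two stereo modes and saving/restoring camera
// viewpoints with VTK (names outside VTK are illustrative assumptions).
#include <vtkCamera.h>
#include <vtkRenderWindow.h>
#include <vtkRenderer.h>

struct Viewpoint {
    double position[3];
    double focalPoint[3];
    double viewUp[3];
};

// Select anaglyph (red/cyan glasses) or CrystalEyes (active goggles) stereo.
// For quad-buffered CrystalEyes stereo, StereoCapableWindowOn() typically has
// to be requested before the window is first rendered.
void setStereoMode(vtkRenderWindow* window, bool useAnaglyph)
{
    window->StereoCapableWindowOn();
    if (useAnaglyph) window->SetStereoTypeToAnaglyph();
    else             window->SetStereoTypeToCrystalEyes();
    window->StereoRenderOn();
    window->Render();
}

Viewpoint saveViewpoint(vtkRenderer* renderer)
{
    Viewpoint v;
    vtkCamera* cam = renderer->GetActiveCamera();
    cam->GetPosition(v.position);
    cam->GetFocalPoint(v.focalPoint);
    cam->GetViewUp(v.viewUp);
    return v;
}

void restoreViewpoint(vtkRenderer* renderer, const Viewpoint& v)
{
    vtkCamera* cam = renderer->GetActiveCamera();
    cam->SetPosition(v.position[0], v.position[1], v.position[2]);
    cam->SetFocalPoint(v.focalPoint[0], v.focalPoint[1], v.focalPoint[2]);
    cam->SetViewUp(v.viewUp[0], v.viewUp[1], v.viewUp[2]);
    renderer->ResetCameraClippingRange();   // keep the scene fully visible
}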

Another improvement that could be brought to Avatar would be to model the light sources of the real mockup in order to add them to the virtual copies. As of today, Avatar does not implement collision detection between virtual figurines in the virtual mockup; currently, in order to avoid collisions, certain configurations of real figurines are simply not reproduced when the real mockup is synchronized with the virtual one. Finally, it would be interesting to animate the virtual figurines in the virtual mockup in order to make the environment more realistic.

Figure 5: The camera position can be reverted to viewpoints predefined in Avatar as well as to custom viewpoints defined by users.

Conclusion

A tool for supporting distributed interactive performing arts design has been described. The tool, called Avatar, mixes computer vision and virtual reality to provide directors with the opportunity to modify at will the scene of interest on both a real and a virtual mockup of the actual scene that will be built for a given show. Figure 6 illustrates the possibilities offered by Avatar to users in performing arts design. Even though Avatar has been designed for supporting the design of performing arts productions, it is not limited to this field and can be used in any application for which it is important to maintain the synchronization between the geometry of a real environment and its virtual counterparts. In addition, Avatar has been designed as a low-cost, platform-independent software package that is based on open source tools and can be used by non computer experts. Its interface is very simple to understand and use, and it provides an interesting support for collaborative work in performing arts. Future work aims at including more realistic figurines in the virtual mockup and at offering users the opportunity to include an environment map in the virtual mockup. Currently, the visual feedback provided by Avatar depends on neither the position nor the movement of the user, who has to use the mouse to navigate in the virtual world. It is thought that ARToolkit could be exploited for this purpose and that this would be an interesting feature to add to Avatar. To achieve this, ARToolkit would have to be adapted for use in low-light conditions such as those found in VR rooms.

Figure 6: Avatar can be used as a new interactive tool for performing arts design allowing collaborative work between multiple designers.

References

[1] The Visualization Toolkit (VTK).
[2] Qt.
[3] C. Kübler, VTK_Qt API.
[4] Avatar project page on SourceForge.
[5] ARToolkit.
[6] R. Azuma et al., Recent Advances in Augmented Reality, IEEE Computer Graphics and Applications, Vol. 21, No. 6, 2001.
[7] S. K. Nayar, S. A. Nene, and H. Murase, Real-Time 100 Object Recognition System, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 18, No. 12, 1996.
[8] P. Viola and M. Jones, Rapid Object Detection using a Boosted Cascade of Simple Features, in Conference on Computer Vision and Pattern Recognition, 2001.
[9] J. Shi and C. Tomasi, Good Features to Track, in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'94), Seattle, Washington, June 1994.
[10] V. Lepetit, P. Lagger, and P. Fua, Randomized Trees for Real-Time Keypoint Recognition, in IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), San Diego, CA, June 2005.
[11] T. Okuma, T. Kurata, and K. Sakaue, A Natural Feature-Based 3D Object Tracking Method for Wearable Augmented Reality, in Proc. 8th IEEE International Workshop on Advanced Motion Control (AMC'04), Kawasaki, Japan, 2004.
[12] G. Simon and M.-O. Berger, Reconstructing While Registering: A Novel Approach for Markerless Augmented Reality, in Proc. IEEE and ACM International Symposium on Mixed and Augmented Reality, 2002.
[13] A. State et al., Superior Augmented Reality Registration by Integrating Landmark Tracking and Magnetic Tracking, Computer Graphics, 1996.
[14] J. Park, S. You, and U. Neumann, Natural Feature Tracking for Extendible Robust Augmented Realities, First IEEE International Workshop on Augmented Reality (IWAR'98), USA.
[15] J. Park, S. You, and U. Neumann, Extending Augmented Reality with Natural Feature Tracking, SPIE Telemanipulator and Telepresence Technologies, Vol. 3524, Nov.
[16] U. Neumann and Y. Cho, A Self-Tracking Augmented Reality System, in Proc. VRST'96, 1996.
[17] U. Neumann et al., Augmented Reality Tracking in Natural Environments, in Mixed Reality - Merging Real and Virtual Worlds, Ohmsha & Springer-Verlag, 1999.
[18] J. Rekimoto, Matrix: A Realtime Object Identification and Registration Method for Augmented Reality, APCHI'98.
[19] R. T. Azuma, A Survey of Augmented Reality, Presence, Vol. 6, No. 4, 1997.
[20] H. Kato and M. Billinghurst, Marker Tracking and HMD Calibration for a Video-based Augmented Reality Conferencing System, in Proc. 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR'99), 1999.
