Augmenting Reality Through The Coordinated Use of Diverse Interfaces


Chris Greenhalgh, Steve Benford, Tom Rodden, Rob Anastasi, Ian Taylor, Martin Flintham, Shahram Izadi, Paul Chandler, Boriana Koleva, Holger Schnädelbach
The Mixed Reality Laboratory, The University of Nottingham, Nottingham NG7 2RD, UK
{cmg, sdb, tar, rma, imt, mdf, sxi, pmc, bnk}

ABSTRACT
We present seven diverse augmented reality interfaces, including audio tunnels from a virtual environment to fixed and mobile phones; digital activity meters for locating hotspots of activity in a parallel virtual world; a tripod-mounted display for small groups to view virtual events; and public projections of shadows and sound from a virtual world. An analysis of how these match different design goals and real-world constraints demonstrates the potential utility of each. We explore how the use of a shared underlying virtual world enables multiple interfaces such as these to be coordinated to provide a rich and coherent augmented reality experience.

Keywords
Augmented reality, mixed reality, mobile applications.

INTRODUCTION
The archetypal approach to augmented reality (AR) uses a wearable or handheld device to supplement a single user's experience of a physical environment. For example, the user may don a wearable computer with tracking and specialized I/O devices (such as a see-through head-mounted display). This allows them to receive or recall additional context-relevant information superimposed on their normal experience of physical spaces and/or artifacts [1]. Alternatively the user may carry a handheld device. A typical application for this kind of system has been in the production of electronic guides, where users are presented with information about their current location. This class of system ranges from museum-based systems [3] to broader town and city guides [5]. The design, construction and operation of these devices is strongly influenced by the practical problems of augmented reality, especially when intended for outdoor use [7]. For example, Azuma [2] discusses:
- displays being hard to read in sunlight;
- tracking having variable accuracy;
- portability being limited, especially as a function of power requirements.
This paper introduces seven augmented reality interfaces that respond to these and other practical constraints in quite different ways. The interfaces outlined in this paper are intended to be used in tandem to allow a diverse population of users, such as the inhabitants of a city, to experience events that take place within a parallel virtual world. Thus rather than provide a single point of contact between the physical and the virtual, we wish to realize a much broader augmented reality experience. Using these interfaces as a starting point, we identify and chart the factors that determine each interface's appropriateness for overlaying digital information on a physical environment as a function of intended use and context. We then discuss how the use of an underlying shared virtual environment enables diverse collections of such interfaces to work together in a concerted way to provide rich and coherent augmented reality experiences.

PROTOTYPE INTERFACES
We begin with seven prototype interfaces that illustrate new and diverse approaches to augmented reality. Each establishes a different relationship between digital information (in our case, a virtual world) and a physical environment.
These are:
- The use of fixed and public telephones to create audio tunnels between physical and virtual worlds;
- The extension of these to mobile phones;
- The combination of a PDA, GPS device and wireless networking to create a digital activity meter, an interface for locating hotspots of activity in a parallel virtual world and displaying these on a radar display;
- A second digital activity meter that produces a sonification rather than a visual display;

- A portable tripod-mounted display called an augurscope, through which users may view virtual activity when outdoors;
- The projection of a virtual world into public space as virtual shadows;
- A second projection of a virtual world as an ambient sound field.

Audio tunnels using fixed or public telephones
Augmented or mixed reality does not necessarily require the development of novel devices. In fact, it can exploit devices that already exist and are already in use in the physical world as a means of augmentation. One example of this is the use of public payphones (and other fixed phones). Payphones are an established component of many urban landscapes, providing a potential bridge between physical and virtual space. The locations of public payphones can be determined in advance of an experience, and these can then be used to allow activities within the virtual world to be heard from corresponding locations within the physical. This communication can also be two-way, with the audio from the payphone link made available to the virtual users. The result is to create an audio tunnel between the digital and physical world. As an aside, if the identity of the person who answers or makes the call can be determined, then the system also has a precise if momentary fix on their physical location. In other words, telephones can be used as a coarse tracking system.

In our prototype, an on-line (virtual-only) user moving around a parallel virtual world can approach a virtual representation of a payphone. The system responds by phoning the corresponding payphone and establishing an audio channel to it from the corresponding part of the parallel virtual world. Figure 1 shows an avatar approaching a payphone in the virtual world. This automatically triggers a phone call to transmit the avatar's audio to the equivalent physical payphone.

Audio tunnels using mobile phones
Mobile phones are carried and used in vast numbers, especially in Europe, North America and the Pacific Rim. As with fixed phones, they are an established pre-existing technology that can be appropriated to support augmented reality, rather than a completely new device. Obviously it is the nature of mobile phones that they move within the physical world, although they tend to remain linked to one person or a small number of related people (e.g. family members or business colleagues). In our prototype system a mobile phone can be accessed from the virtual world exactly like a fixed phone, above, although the spatial correspondence between the physical and parallel virtual world is then destroyed. However, the mobile phone can then be supplemented with some form of tracking technology in order to place it correctly within the virtual world. In this case, the mobile phone user (as well as the virtual user) may also trigger the creation of the audio tunnel, by bumping into a virtual user. There is ongoing work on positioning phones using just the phone network's radio signal strength, for example to support emergency services in locating callers; however, this information is not generally available (for reasons of security and privacy). The approach that we have prototyped is to use a GPS receiver and a PDA (a Palm Pilot) connected to the mobile phone, which notifies the virtual world via an SMS message when the mobile phone is moved physically. This allows the phone's virtual representation to be kept up to date.
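To make the behaviour of these audio tunnels concrete, the following minimal Python sketch illustrates the trigger logic: a proximity test in the virtual world that dials the corresponding physical phone and bridges its audio, plus a handler that moves a mobile phone's virtual representation when an SMS position report arrives. This is illustrative only and is not our implementation; the telephony and world objects and all of their method names are assumed stand-ins for the real telephone gateway and virtual world services.

import math

PROXIMITY_THRESHOLD = 5.0  # metres in virtual-world coordinates (illustrative value)

def distance(a, b):
    # Euclidean distance between two (x, y) virtual-world positions.
    return math.hypot(a[0] - b[0], a[1] - b[1])

class PhoneTunnelManager:
    # Hypothetical sketch of the audio-tunnel trigger logic described above.

    def __init__(self, telephony, world):
        self.telephony = telephony    # assumed gateway that can dial PSTN/GSM numbers
        self.world = world            # assumed handle onto the parallel virtual world
        self.phones = {}              # phone number -> position in the virtual world
        self.active_tunnels = set()   # numbers with an open audio tunnel

    def register_phone(self, number, virtual_position):
        # Fixed payphones are registered at known positions in advance;
        # mobile phones are registered wherever they were last reported.
        self.phones[number] = virtual_position

    def on_avatar_moved(self, avatar_position):
        # Called whenever an on-line user's avatar moves in the virtual world.
        for number, phone_position in self.phones.items():
            near = distance(avatar_position, phone_position) < PROXIMITY_THRESHOLD
            if near and number not in self.active_tunnels:
                # Dial the corresponding physical phone and bridge its audio
                # into the matching region of the virtual world.
                self.telephony.dial(number)
                self.world.attach_audio_channel(number, phone_position)
                self.active_tunnels.add(number)
            elif not near and number in self.active_tunnels:
                self.telephony.hang_up(number)
                self.world.detach_audio_channel(number)
                self.active_tunnels.remove(number)

    def on_sms_position_update(self, number, latitude, longitude):
        # Called when the PDA attached to a mobile phone reports a new GPS fix via SMS:
        # keep the phone's virtual representation up to date.
        virtual_position = self.world.map_gps_to_virtual(latitude, longitude)
        self.phones[number] = virtual_position
        self.world.move_avatar_for(number, virtual_position)

In our prototypes the equivalent behaviour is provided by the telephony hardware and virtual world infrastructure described in the remainder of this paper; the sketch only makes the trigger conditions explicit.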
Figure 2 shows the hardware carried by the mobile user (left) and a corresponding image of their avatar in a virtual environment (right) in which an audio tunnel is active (shown by the presence of the yellow pyramid).

Figure 1: A virtual user approaches a payphone to establish an audio tunnel.
Figure 2: Mobile phone with PDA and GPS receiver.

A digital activity meter with a radar display
There are various devices in fiction and fact that are specifically tailored to locate objects, places and activities within the physical world. For example, geiger counters are used to locate sources of radioactivity, psycho-kinetic energy meters (PK-meters) are used by paranormal investigators to detect otherworldly presence and activity, and resistivity meters are used by archeologists to locate historical artifacts and buildings. Inspired by these, we have created two handheld digital activity meters. These alert the user to the presence of nearby digital activity, such as avatars or virtual objects in a parallel virtual world. They could be used to support a user who is searching for a particular AR experience (such as a virtual artifact or event) within a larger but less augmented space. These interfaces are designed to support searching by individual users.

The first example combines a GPS receiver (to determine the user's physical position), a wireless network (based on WaveLAN) and a PDA (a Compaq iPAQ with colour display). It presents the user with a radar-style display, indicating the relative positions of nearby artifacts and avatars in the virtual world. Figure 3 shows the radar indicating the presence of two nearby avatars as dots in the central circle.

Figure 3: Digital activity meter with virtual radar display showing nearby avatars.

A digital activity meter with a sonic display
Our second example uses an abstract audio presentation rather than 2D graphics to give the user proximity information about multiple nearby virtual objects. Each virtual object is associated with its own tone. As they move around the physical environment, the user hears a mix of tones that indicates the relative proximities of the objects (each tone increases in volume and frequency when the object is closer). Searching is typically a single element of a guide-type or general-purpose AR system (e.g. [5]). In contrast, these interfaces support searching as an activity in itself, whereby searching may be as significant as finding. This has been used to support a virtual archeology experience, in which users search for hidden virtual artifacts, which they take back to a fixed installation for detailed viewing [4]. Figure 4 shows two users in the physical world on the left who are searching for a virtual object (a fragment of a bowl) in the parallel virtual world on the right. The avatar on the right shows their current position from the GPS.

Figure 4: Locating part of a virtual bowl.

The Augurscope: a portable, tripod-mounted display
The augurscope (figure 5) is a portable augmented reality interface for use by small groups in open (indoor or outdoor) locations. It is used to directly view the parallel virtual world, for example after particular content has been located using a digital activity meter. Its goals, detailed design and initial application and evaluation are described in a sister paper to this one that is also submitted to CHI. The following is a brief summary for completeness.

Figure 5: The Augurscope in use.

The augurscope is based on a tripod-mounted laptop computer. A GPS receiver (for outdoor use) and electronic compass provide global location information. An onboard accelerometer and rotary encoder allow the virtual viewpoint to be interactively manipulated by panning and tilting the physical device on its tripod. As the scope is moved, the laptop's display changes to show the corresponding view of the parallel virtual world, allowing users to view the virtual world alongside the corresponding part of the physical world. The augurscope is a public device designed to allow a small group of users to cluster around the view of the virtual world.
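Both digital activity meters described above reduce to the same underlying computation: take the user's tracked position, find nearby entities in the parallel virtual world, and map each one either to a dot on the radar display or to a tone whose volume and pitch rise with proximity. The short Python sketch below is purely illustrative (it is not the code running on the PDAs), and the VirtualEntity type, the sensing range and the frequency values are assumptions chosen for the example.

from dataclasses import dataclass

SENSING_RANGE = 50.0   # metres: entities beyond this are neither shown nor heard (illustrative)
BASE_FREQ = 220.0      # Hz: tone frequency for an entity at the edge of the range (illustrative)
MAX_FREQ = 880.0       # Hz: tone frequency for an entity right next to the user

@dataclass
class VirtualEntity:
    id: str
    x: float  # virtual-world coordinates, metres
    y: float

def radar_points(user_pos, entities):
    # Map nearby virtual entities to (id, dx, dy) offsets for a radar-style display.
    points = []
    for e in entities:
        dx, dy = e.x - user_pos[0], e.y - user_pos[1]
        if (dx * dx + dy * dy) ** 0.5 <= SENSING_RANGE:
            points.append((e.id, dx, dy))
    return points

def tone_parameters(user_pos, entities):
    # Map each nearby entity to an (id, volume, frequency) triple for the sonic display:
    # closer entities are louder and higher pitched.
    tones = []
    for e in entities:
        dx, dy = e.x - user_pos[0], e.y - user_pos[1]
        dist = (dx * dx + dy * dy) ** 0.5
        if dist <= SENSING_RANGE:
            proximity = 1.0 - dist / SENSING_RANGE        # 0 at the edge, 1 on top of the entity
            volume = proximity                            # linear gain in [0, 1]
            frequency = BASE_FREQ + proximity * (MAX_FREQ - BASE_FREQ)
            tones.append((e.id, volume, frequency))
    return tones

On each new GPS fix, a loop on the handheld would convert the fix into virtual-world coordinates, call one of these functions, and hand the result to the 2D display or to the audio mixer.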

Virtual shadows as public projections
Our next prototype interface has been inspired by the presence of shadows in the everyday world. Shadows provide indirect projections of physical objects and activity onto public surfaces, typically outdoors, in a way that is at once familiar and distorted (and potentially aesthetically pleasing). Various VR artists have previously incorporated shadows as secondary displays of activity (e.g. Char Davies' groundbreaking 1995 installation Osmose [9]). We have recently experimented with virtual shadows, projections of a virtual world into a public space that are deliberately simplified and distorted (like a shadow) so as to convey a sense of virtual presence and activity without the need for accurate positioning or overlaid 3D graphics. The primary goal is to create an ambient or impressionistic display, particularly aimed at bystanders and larger groups or crowds who are not typically addressed by current AR interfaces. A shadow projection can be realized as a viewpoint at a particular location within a virtual world that is then projected into a (public) place that normally corresponds to the virtual location. As users and objects in the virtual world move, the shadows projected into the physical world change accordingly. Figure 6 shows an example of projecting digital shadows of avatars onto the side of a large building. These shadows were projected over a distance of approximately 200 meters using a projector with a long-throw lens. Unlike most of the interfaces described so far, the devices that produce shadow projections are typically fixed and embedded within the environment, rather than being mobile. However, we have also experimented with an intermediate (semi-mobile) approach, running projectors and PCs from the back of a parked van, using a generator for power. Another possibility would be to use steerable projectors and cameras as described in [10].

Figure 6: Virtual shadows projected onto a building.

Ambient sound fields as public projections
Like shadow projections, this form of interface uses devices embedded within the physical environment to provide some awareness of the activity within the parallel virtual world. Whereas virtual shadows use abstract projected graphics, the ambient sound field creates a (one-way or two-way) spatialised audio link between the virtual and physical worlds. In our initial prototype we created a virtual model of an indoor location (our laboratory space) and used 3 desktop PCs, each driving two speakers, to render the audio from three locations in the virtual world into the corresponding locations in the physical space. As the avatars of on-line users moved around the virtual laboratory, their activity was played out from the corresponding speakers into the physical laboratory.

Clearly these seven prototypes constitute a diverse collection of interfaces. Table 1 summarizes them in terms of their technical operation and realization (PW = Physical World, VW = Virtual World).

ANALYSIS
By considering these various interfaces we can identify a broad range of factors that influence the potential utility and applicability of each. In turn, this requires a deeper exploration of the design constraints and design goals of augmented reality in the physical world. As already noted, Azuma [2] raises a number of practical constraints:
- Tracking having variable accuracy.
- Portability being limited: portability is largely determined by a tradeoff between processing capability and power requirements, which in turn dictates size.
There are also other hard constraints; for example, it has only recently become possible to buy laptops with 3D graphics acceleration, and this is still not available for PDAs.
- Displays being hard to read in sunlight.
To these can be added other issues related to the physical environment of use:
- Noise, e.g. from traffic or other people, interferes with audio-based presentations.
- Weather: in our particular locations, even during the height of summer, we experienced rain on more than half of the days on which we worked outdoors. In other regions humidity and/or temperature (low or high) must also be considered.
Further constraints arise when working both indoors and outdoors:
- GPS typically only works outdoors (indoor solutions are available, but not yet widely used), so multiple tracking solutions must be used in concert.
For applications that include real-time information and/or collaboration (which is central to the development of a large-scale augmented experience) we must also consider:

- Networking: for wireless networking, as with fixed networking, there is generally an inverse relationship between distance and bandwidth. For example, WaveLAN may provide several Mbits/second of bandwidth, but only over a very limited distance of at best a couple of hundred meters, and is also subject to interference and occlusion. Wide-area networking technologies such as SMS, GSM and GPRS offer much more limited bandwidth, and some (e.g. GPRS) are also only beginning to be available commercially.

Table 1. Technical choices embodied in the prototype interfaces (PW = Physical World, VW = Virtual World).
                | Payphone | Mobile phone | Hand-held radar | Hand-held sonification | Augurscope | Virtual shadows | Ambient audio
CPU             | - | Phone | PDA | PDA | Hi-spec laptop | PC(s) | PC(s)
Network         | PSTN | GSM, SMS | WaveLAN* | WaveLAN* | Optional WaveLAN | Wired (Ethernet) | Wired (Ethernet)
Power           | Wired/phone | Battery | Battery | Battery | Battery | Mains or generator | Mains or generator
Mobility        | Fixed | Hand-held | Hand-held | Hand-held | Mobile/wheeled | Fixed (or van) | Fixed (or van)
Tracking        | Known position | Optional GPS | GPS* | GPS* | GPS*, compass, tilt, internal turn | Known position* | Known position*
PW media        | Mono audio | Mono audio | 2D graphics | Mono audio | 3D graphics and audio | Simple graphics | Fixed audio infrastructure
VW data to PW   | Audio | Audio | Positions | Distances | 3D graphics and audio | Abstract 3D graphics | Multichannel audio
PW data to VW   | Audio | Audio, position (GPS) | - | - | - | Optional virtual cameras | Optional virtual mics
VW presentation | Fixed object (plus audio) | Avatar (plus audio) | Mobile avatar, position | Mobile avatar, position | Mobile avatar plus position, orientation, tilt, zoom | - | -
PW users        | Individual | Individual | Individual | Individual | Individual or small group | Large group | Large group
Interaction     | Check/listen (PW) | Check/listen, or (VW) notify | Search | Search | Explore, view | Be(come) aware | Be(come) aware
* As implemented; other approaches are possible (e.g. GSM or GPRS for networking; indoor ultrasonic tracking).

As well as these environmental and technical issues, we can also identify broader issues that become significant when you wish to place devices in the physical world to be used by the general public as part of an augmented experience.
- Use of bespoke versus commodity technologies and, related to this, the use of technologies that are already in place (e.g. owned by potential users) versus technologies that must be acquired or provided specifically to support an application or trial. Use of commodity technologies already in place allows an augmented experience to be made available to many more users than a system that requires users (or organizers) to purchase entirely new devices.
- Use of mobile versus embedded technologies, i.e. a user carrying a device that augments their own activities versus placing devices within the built or found environment that augment the activities taking place within those locations. Note that mobile approaches presume that users can be entrusted with the relevant devices (if they do not already own them), while embedded approaches presume that it is possible

and permitted to embed devices within the physical environment.
- Supporting different numbers of users, ranging from an individual, through a small group, to a larger group or crowd.
- Supporting different relationships with the augmented experience, ranging from total un-involvement (and potential ignorance, e.g. a random passer-by on a city street), through various levels of engagement and commitment, to full involvement. This relationship varies between users, but may also vary over time for a single user, as their interests and other activities change.

Table 2. Issues addressed by each interface. Its rows are the constraints and goals discussed above: no tracking or low-resolution tracking; limited portability; sunlight or darkness; noise (no audio); rain; no network or low bandwidth; use of in-place/commodity technology; mobile versus embedded deployment; crowds versus small groups; and uninvolved or joining users. Its columns are the seven interfaces; of these, the payphone, virtual shadows and ambient audio interfaces are embedded, the augurscope is part-mobile, and the mobile phone and the two hand-held activity meters are mobile.

Table 2 shows how the seven interfaces described in this paper relate to these design constraints and goals. The simple or ideal situation for each issue (not shown) is as follows: high-resolution tracking; non-portable; artificial lighting; quiet; dry; high-bandwidth network (wired or wireless); bespoke technology; single user; fully involved. This is the simplest context to address, and also the one that prevails under typical lab conditions! Note that none of the approaches can address all of the possible contexts of use. From this table we can identify a number of key strengths of the various interfaces described in this paper:
- Use of in-place commodity technologies, such as fixed and mobile telephones, supports involvement of potentially large numbers of users at little additional cost. These technologies could also be used to make contact with potential users who are not yet engaged with a particular augmented experience.
- Simple hand-held devices supporting abstract displays (such as the digital activity meters) can cope relatively well with reduced tracking accuracy and bandwidth. They may also support users who are in the process of becoming involved, with less cost and/or complexity than a general-purpose device or system.
- Medium-scale devices, such as the augurscope, may support small groups better than smaller hand-held devices. The augurscope also exemplifies a device that is neither fully mobile nor fully embedded within a particular location. As such it might support different patterns of use.
- Approaches relying on embedded devices, such as virtual shadows and ambient sound fields, associate technology with a place rather than a person. As such they avoid some issues of mobile technology such as tracking (at least of the devices themselves), trust (of the users carrying the devices), and the need for wireless networking. They are also particularly well suited to augmenting the experiences of people who

are otherwise uninvolved: simply passing through a particular (presumably public) space results in an augmented experience.

In summary, each of our prototype interfaces fills a particular niche in delivering augmented reality experiences.

COORDINATING MULTIPLE INTERFACES
However, it is not through these individual interfaces that the real advantage of this diversity will be realized. Rather, like the pieces of a jigsaw puzzle, they need to be fitted together to produce a larger picture. This section therefore explores how our interfaces can operate together in a coordinated way to deliver rich and coherent augmented reality experiences. Each device can be considered to create a window (or other more abstract channel) between the physical and the parallel virtual world. However, each interface does this using different media and different kinds of devices, for different target users, and in different contexts of use. There are several reasons why we might wish many or all of the above interfaces (and multiple instances of each) to operate in a coordinated manner:
- As already argued, each interface has its own unique capabilities and is best fitted to particular contexts of use. Consequently a user may want to move between different interfaces according to their current activity, interest, etc. To do this, the interfaces must be appropriately coordinated. For example, if someone finds a virtual object or event using a digital activity meter, then they may want to move to an augurscope to find out more about it.
- Similarly, a user present in a space augmented with virtual shadows or ambient audio may expect these to be related to the other augmented experiences that are being accessed within the same physical space using personal mobile devices.
- Some of the interfaces described are non-mobile (embedded), and users encounter them at different stages of their activities. Consequently the user and the activities that they are engaged in may be mobile even though the individual devices supporting that activity are fixed. In this case the various embedded devices must be coordinated with one another in order to provide a coherent experience for the user(s) moving between them.

We are tackling this requirement for interface coordination in two ways. First, each interface augments the physical world with information from a common parallel virtual world that is conceptually overlaid onto the physical world, as shown in figure 7.

Figure 7: Augmenting the Physical World with data from a parallel Virtual World.

The interfaces described in this paper all use MASSIVE-3 [6], a collaborative virtual environment system, to realize this parallel virtual world. Because MASSIVE-3 is networked, each device (provided it has adequate network connectivity) can connect to the same virtual world, and therefore reflect the same virtual activities on a moment-by-moment basis. One way to look at this is to consider the MASSIVE-3 virtual world as a common content-delivery channel which, if used consistently, will yield a coordinated experience across all of the interfaces. MASSIVE-3 is tailored to supporting mobile avatars, virtual objects, and real-time audio and video. Consequently these are the natural forms of content for augmented experiences generated using it. MASSIVE-3 also supports temporal links, which allow virtual activity to be recorded and replayed in flexible ways.
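The effect of routing every interface through one shared world can be illustrated with a deliberately simple sketch. This is not the MASSIVE-3 API; the classes and callbacks below are hypothetical Python, and the point is only that a single shared model drives each interface, which then renders the same update in its own medium.

class SharedWorld:
    # Hypothetical stand-in for the shared virtual world: interfaces subscribe
    # and are notified of every change, so all of them reflect the same state.
    def __init__(self):
        self.subscribers = []
        self.positions = {}  # entity id -> (x, y) in virtual-world coordinates

    def subscribe(self, interface):
        self.subscribers.append(interface)

    def move_entity(self, entity_id, position):
        self.positions[entity_id] = position
        for interface in self.subscribers:
            interface.on_entity_moved(entity_id, position)

class RadarMeter:
    def on_entity_moved(self, entity_id, position):
        print(f"radar: plot {entity_id} at {position}")

class ShadowProjection:
    def on_entity_moved(self, entity_id, position):
        print(f"shadows: redraw silhouette of {entity_id} near {position}")

class AmbientSoundField:
    def on_entity_moved(self, entity_id, position):
        print(f"audio: pan the sound of {entity_id} towards the speaker nearest {position}")

if __name__ == "__main__":
    world = SharedWorld()
    for interface in (RadarMeter(), ShadowProjection(), AmbientSoundField()):
        world.subscribe(interface)
    # A single update to the shared world is rendered by every interface at once.
    world.move_entity("avatar-1", (12.0, 3.5))

Because every interface observes the same model, a user who locates an event with an activity meter and then walks to an augurscope or a shadow projection encounters the same virtual activity at each of them.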
The temporal link mechanism can be used to create pre-recorded material (scripted, generated or improvised) for repeated playback in the virtual world, and hence in the AR experience. Note that the MASSIVE-3 virtual world can also be used as a coordination and management facility. For example, a person with privileged access to the virtual world could use it to monitor physical and virtual activity (in so far as it is represented within the virtual world). They could then intervene remotely through the virtual world to tailor the augmented experience. This builds on studies of previous mixed reality experiences such as Desert Rain [8] that revealed how performers and crew orchestrate participants' experiences in such a way as to engage them with the content and then to maintain that engagement throughout, even in the face of various difficulties and failures.

The second way in which we support coordination of multiple interfaces is through the consistent use of EQUIP, which is a new platform for integrating various software components, I/O devices and wireless and mobile interfaces. For example, all of the interfaces described in this paper use EQUIP as an intermediate layer between their component devices (e.g. GPS receiver, PDA,

accelerometer) and MASSIVE-3. EQUIP allows the various elements of the software system to be easily re-used and re-configured. This allows for fine-grained coordination and customization of the interfaces that would not be possible within the MASSIVE-3 virtual world alone. For example, this might be used to allow a future version of a digital activity meter to dock with an augurscope, causing both to reconfigure themselves into a compound interface.

SUMMARY
This paper has introduced a set of devices which provide different interfaces to link a virtual world with the physical world. We have suggested a set of properties that allowed us as designers to reason about the way in which these different devices can be used to realize an outdoor augmented experience for the general public. Two points are particularly worth emphasizing.

First, delivering AR in the real world involves meeting many challenging and interdependent design goals and constraints. There is currently no single technology that is able to do this, and so we propose broadening the focus of AR to make use of a far more diverse set of interfaces, each of which is appropriate to a particular use and environment. We have introduced seven prototype interfaces to illustrate some of the possible ways in which we might diversify the augmentation of the physical environment.

Second, these devices have the potential to deliver much richer AR experiences if they are used in concert. This can be achieved by having them access a shared underlying virtual world that integrates audio, graphics, video and other kinds of digital information into a common framework, and in which the presence and locations of multiple AR interfaces can be represented to support content creation and orchestration.

We are currently developing a large-scale citywide AR experience. Our digital world is likely to be a representation of the city, but altered in various ways, for example shifted temporally (into the apparent future or past), or distorted spatially (to meet the physical city at various key locations). Participants on the streets of the city will see and hear this environment (and the avatars and objects within it) via interfaces such as those described here. The avatars might be played by actors and by other on-line participants, who would in turn be able to see and hear the participants in the physical city (at least at key locations and times). This project is still in its conceptual stages, although some of the interfaces presented are early outcomes from planning workshops. We expect it to provide a driving application for further development of these interfaces and approaches.

REFERENCES
1. Azuma, R. T., A Survey of Augmented Reality, Presence: Teleoperators and Virtual Environments, 6(4), August 1997.
2. Azuma, R., The Challenge of Making Augmented Reality Work Outdoors, in Mixed Reality: Merging Real and Virtual Worlds (Yuichi Ohta and Hideyuki Tamura, eds), Springer-Verlag, 1999.
3. Benelli, G., Bianchi, A., Marti, P., Not, E., Sennati, D., HIPS: Hyper-Interaction within Physical Space, Proc. IEEE ICMCS99, Florence, June 1999.
4. Benford, S., Bowers, J., et al., Unearthing Virtual History: Using Diverse Interfaces to Reveal Hidden Virtual Worlds, Proc. Ubicomp 2001, Atlanta, 2001.
5. Cheverst, K., Davies, N., Mitchell, K., Friday, A. and Efstratiou, C., Developing a Context-Aware Electronic Tourist Guide: Some Issues and Experiences, Proc.
CHI 2000, 17-24, The Hague, Netherlands, 2000.
6. Greenhalgh, C., Purbrick, J., and Snowdon, D., Inside MASSIVE-3: Flexible Support for Data Consistency and World Structuring, Proc. Third International Conference on Collaborative Virtual Environments (CVE2000), Sept 2000, San Francisco, USA, ACM, New York.
7. Höllerer, T., Feiner, S., Terauchi, T., Rashid, G., Hallaway, D., Exploring MARS: Developing Indoor and Outdoor User Interfaces to a Mobile Augmented Reality System, Computers and Graphics, 23(6), Elsevier Publishers, Dec. 1999.
8. Koleva, B., Taylor, I., Benford, S., et al., Row-Farr, J., Adams, M., Orchestrating a Mixed Reality Performance, Proc. CHI 2001, Seattle, April 2001.
9. Davies, C., Osmose, 1995 (verified Sept 2001).
10. Pinhanez, C., Using a Steerable Projector and Camera to Transform Surfaces into Interactive Displays, Proc. CHI 2001 Extended Abstracts, April 2001.


More information

Fig.1 AR as mixed reality[3]

Fig.1 AR as mixed reality[3] Marker Based Augmented Reality Application in Education: Teaching and Learning Gayathri D 1, Om Kumar S 2, Sunitha Ram C 3 1,3 Research Scholar, CSE Department, SCSVMV University 2 Associate Professor,

More information

HeroX - Untethered VR Training in Sync'ed Physical Spaces

HeroX - Untethered VR Training in Sync'ed Physical Spaces Page 1 of 6 HeroX - Untethered VR Training in Sync'ed Physical Spaces Above and Beyond - Integrating Robotics In previous research work I experimented with multiple robots remotely controlled by people

More information

NAVIGATION TECHNIQUES IN AUGMENTED AND MIXED REALITY: CROSSING THE VIRTUALITY CONTINUUM

NAVIGATION TECHNIQUES IN AUGMENTED AND MIXED REALITY: CROSSING THE VIRTUALITY CONTINUUM Chapter 20 NAVIGATION TECHNIQUES IN AUGMENTED AND MIXED REALITY: CROSSING THE VIRTUALITY CONTINUUM Raphael Grasset 1,2, Alessandro Mulloni 2, Mark Billinghurst 1 and Dieter Schmalstieg 2 1 HIT Lab NZ University

More information

Context-Aware Interaction in a Mobile Environment

Context-Aware Interaction in a Mobile Environment Context-Aware Interaction in a Mobile Environment Daniela Fogli 1, Fabio Pittarello 2, Augusto Celentano 2, and Piero Mussio 1 1 Università degli Studi di Brescia, Dipartimento di Elettronica per l'automazione

More information

Projection Based HCI (Human Computer Interface) System using Image Processing

Projection Based HCI (Human Computer Interface) System using Image Processing GRD Journals- Global Research and Development Journal for Volume 1 Issue 5 April 2016 ISSN: 2455-5703 Projection Based HCI (Human Computer Interface) System using Image Processing Pankaj Dhome Sagar Dhakane

More information

VIRTUAL REALITY APPLICATIONS IN THE UK's CONSTRUCTION INDUSTRY

VIRTUAL REALITY APPLICATIONS IN THE UK's CONSTRUCTION INDUSTRY Construction Informatics Digital Library http://itc.scix.net/ paper w78-1996-89.content VIRTUAL REALITY APPLICATIONS IN THE UK's CONSTRUCTION INDUSTRY Bouchlaghem N., Thorpe A. and Liyanage, I. G. ABSTRACT:

More information

Augmented Reality in Transportation Construction

Augmented Reality in Transportation Construction September 2018 Augmented Reality in Transportation Construction FHWA Contract DTFH6117C00027: LEVERAGING AUGMENTED REALITY FOR HIGHWAY CONSTRUCTION Hoda Azari, Nondestructive Evaluation Research Program

More information

Adding Context Information to Digital Photos

Adding Context Information to Digital Photos Adding Context Information to Digital Photos Paul Holleis, Matthias Kranz, Marion Gall, Albrecht Schmidt Research Group Embedded Interaction University of Munich Amalienstraße 17 80333 Munich, Germany

More information

A Survey on Smart City using IoT (Internet of Things)

A Survey on Smart City using IoT (Internet of Things) A Survey on Smart City using IoT (Internet of Things) Akshay Kadam 1, Vineet Ovhal 2, Anita Paradhi 3, Kunal Dhage 4 U.G. Student, Department of Computer Engineering, SKNCOE, Pune, Maharashtra, India 1234

More information

Designing Semantic Virtual Reality Applications

Designing Semantic Virtual Reality Applications Designing Semantic Virtual Reality Applications F. Kleinermann, O. De Troyer, H. Mansouri, R. Romero, B. Pellens, W. Bille WISE Research group, Vrije Universiteit Brussel, Pleinlaan 2, 1050 Brussels, Belgium

More information

Game Design 2. Table of Contents

Game Design 2. Table of Contents Course Syllabus Course Code: EDL082 Required Materials 1. Computer with: OS: Windows 7 SP1+, 8, 10; Mac OS X 10.8+. Windows XP & Vista are not supported; and server versions of Windows & OS X are not tested.

More information

Playware Research Methodological Considerations

Playware Research Methodological Considerations Journal of Robotics, Networks and Artificial Life, Vol. 1, No. 1 (June 2014), 23-27 Playware Research Methodological Considerations Henrik Hautop Lund Centre for Playware, Technical University of Denmark,

More information

Cartography FieldCarto_Handoff.indb 1 4/27/18 9:31 PM

Cartography FieldCarto_Handoff.indb 1 4/27/18 9:31 PM Cartography FieldCarto_Handoff.indb 1 Abstraction and signage All maps are the result of abstraction and the use of signage to represent phenomena. Because the world around us is a complex one, it would

More information

Using Mixed Reality as a Simulation Tool in Urban Planning Project for Sustainable Development

Using Mixed Reality as a Simulation Tool in Urban Planning Project for Sustainable Development Journal of Civil Engineering and Architecture 9 (2015) 830-835 doi: 10.17265/1934-7359/2015.07.009 D DAVID PUBLISHING Using Mixed Reality as a Simulation Tool in Urban Planning Project Hisham El-Shimy

More information

ISCW 2001 Tutorial. An Introduction to Augmented Reality

ISCW 2001 Tutorial. An Introduction to Augmented Reality ISCW 2001 Tutorial An Introduction to Augmented Reality Mark Billinghurst Human Interface Technology Laboratory University of Washington, Seattle grof@hitl.washington.edu Dieter Schmalstieg Technical University

More information

PUZZLAR, A PROTOTYPE OF AN INTEGRATED PUZZLE GAME USING MULTIPLE MARKER AUGMENTED REALITY

PUZZLAR, A PROTOTYPE OF AN INTEGRATED PUZZLE GAME USING MULTIPLE MARKER AUGMENTED REALITY PUZZLAR, A PROTOTYPE OF AN INTEGRATED PUZZLE GAME USING MULTIPLE MARKER AUGMENTED REALITY Marcella Christiana and Raymond Bahana Computer Science Program, Binus International-Binus University, Jakarta

More information

Physical Interaction and Multi-Aspect Representation for Information Intensive Environments

Physical Interaction and Multi-Aspect Representation for Information Intensive Environments Proceedings of the 2000 IEEE International Workshop on Robot and Human Interactive Communication Osaka. Japan - September 27-29 2000 Physical Interaction and Multi-Aspect Representation for Information

More information

Context Information vs. Sensor Information: A Model for Categorizing Context in Context-Aware Mobile Computing

Context Information vs. Sensor Information: A Model for Categorizing Context in Context-Aware Mobile Computing Context Information vs. Sensor Information: A Model for Categorizing Context in Context-Aware Mobile Computing Louise Barkhuus Department of Design and Use of Information Technology The IT University of

More information