TREC: Platform-Neutral Input for Mobile Augmented Reality Applications


Jason Kurczak and T.C. Nicholas Graham
School of Computing, Queen's University
Kingston, Canada K7L 3N6
{kurczak,

ABSTRACT

Development of Augmented Reality (AR) applications can be time consuming due to the effort required in accessing sensors for location and orientation tracking data. In this paper, we introduce the TREC framework, designed to handle sensor input and make AR development easier. It does this in three ways. First, TREC generates a high-level abstraction of user location and orientation, so that low-level sensor data need not be accessed directly. Second, TREC automatically uses the best available sensors and fusion algorithms, so that complex configuration is unnecessary. Finally, TREC enables extensions of the framework to add support for new devices or customized sensor fusion algorithms.

Author Keywords

augmented reality, tracking sensors, input framework, sensor fusion

ACM Classification Keywords

H.5.1 Information Interfaces and Presentation: Artificial, augmented and virtual realities

General Terms

Design

INTRODUCTION

Mobile Augmented Reality (mobile AR) is rapidly making inroads in the consumer market, with applications such as Car Finder [1] and Layar [2] being released for mobile phone platforms. These applications use the phone's camera and screen to superimpose information on a video feed of the real world, and have served to demonstrate the potential of mobile AR to the general public. Mobile AR applications use sensors such as GPS, compass, and inertial sensors to determine the physical location and orientation of the device. The problem for mobile AR developers is that such sensors may be unreliable and noisy.
Applications often must fuse inputs from multiple sensors to determine accurate values. Even when a framework is used to handle the input devices, it can require complex manual configuration and be hard to extend or modify. These problems force the developer to focus on low-level sensor programming rather than on the functionality and usability of the application.

In this paper, we present an architecture for input frameworks designed to help address these issues, which is implemented in the TREC (TRacking using Extensible Components) framework. The advantages of this architecture are illustrated by its use in the Noisy Planet application described in the Motivating Example and Applying TREC sections. The main advantages provided by TREC's architecture include giving programmers open access to a hierarchy of devices, transformative modules, and high-level abstracted interfaces; being able to dynamically select sensors and sensor fusion algorithms thanks to multiple levels of abstraction; and allowing modification and extension at every level.

Figure 1. Using the tourist application

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. EICS '11, June 13-16, 2011, Pisa, Italy. Copyright 2011 ACM.

MOTIVATING EXAMPLE: AN AR TOURIST APPLICATION

To provide context to our description of TREC, we introduce Noisy Planet, a mobile tourist guide. This application, shown in figures 1 and 2, has been implemented using TREC. Noisy Planet allows a tourist staying in an unfamiliar city to navigate to proximate destinations on foot while also allowing serendipitous exploration of the area.

GPS devices triangulate their position using signals from orbiting satellites. Consumer devices have an accuracy of about 5-10 m, depending on overhead visibility in outdoor environments [14]. Accelerometers can be used to track changes in position by calculating acceleration vectors based on the experienced forces, and integrating this data twice to obtain displacement [8]. Accelerometers provide faster and higher-resolution updates than a GPS, but lack an absolute frame of reference and are subject to drift, therefore requiring periodic checks against some other absolute measure of position.

Figure 2. Overhead view

Noisy Planet uses 3D sound to convey to users the position and distance of nearby points of interest. For example, in figure 1, the user hears that the Stauffer Library is behind him and to his right. Each point of interest is represented by a subtle repeating tone: the sound of riffling pages represents the library, clinking glasses represent a restaurant, and jingling coins represent a bank. The application overlays an aural landscape over the physical world. The sounds emanate from the correct location, even as the user walks or turns his head. Since the sounds are subtle and repeating, the tourist can easily choose to listen to them or tune them out.

The Noisy Planet implementation must track the user's location and head orientation so that sounds appear to emanate from the correct direction. It is challenging to accurately determine this information using off-the-shelf sensor equipment, and often multiple sensors are required to accurately estimate location and orientation. We currently use GPS, compass and gyroscope devices. Each device has its own data format, requiring programmers to write low-level interfacing code. Each device type has limitations that cause it to deliver highly inaccurate data under some circumstances.
The programmer must therefore identify and code heuristics determining which device to trust when they deliver conflicting information. As we shall see, our TREC framework helps with these difficulties.

INPUT HANDLING IN MOBILE AR APPLICATIONS

Before presenting the design of TREC, we first review current methods for obtaining input in mobile AR applications. We consider hardware support for location and orientation input, then review the state of the art in sensor fusion, and finally discuss existing input toolkits.

Hardware for Location and Pose Detection

A variety of off-the-shelf devices are available for estimating a user's location and pose in mobile contexts. Other methods for tracking position include triangulation of signals from known locations, such as cellular tower signals, wifi signals or ultrasonic transmitter systems [6]. These approaches suffer from limited coverage.

Digital compasses (or magnetometers) detect orientation relative to magnetic north, but suffer serious drawbacks in accuracy. With the help of a 3-axis accelerometer to track the direction of earth's gravitational pull, a magnetometer can provide pitch, yaw, and roll data. Errors arise due to the lack of uniformity of the earth's magnetic field and its susceptibility to magnetic materials and artificial magnetic fields. In addition, outside forces (e.g., from walking) disturb the accelerometers' measurement of the gravitational vector and can cause large deviations in measured orientation.

Gyroscopes measure angular displacement relative to some initial orientation, and so cannot indicate absolute direction on their own. However, unlike magnetometers, they are not affected by magnetic anomalies or by outside forces. Errors accumulated over time, however, will cause drift from the true direction. At high speeds of rotation (e.g., due to rapid head movement) some gyroscopes may exceed their upper limit of measurement and return wildly inaccurate results.
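The tilt compensation described above (using the accelerometer's gravity vector to level the magnetometer before computing a heading) can be sketched as follows. This is an illustrative reconstruction, not code from the paper: the function name is ours, and it assumes calibrated readings expressed in the sensor's own frame.

```python
import math

def tilt_compensated_heading(acc, mag):
    """Estimate heading from a 3-axis accelerometer and magnetometer.

    acc: (ax, ay, az) gravity vector in the sensor frame
    mag: (mx, my, mz) magnetic field vector in the same frame
    Returns heading in degrees in [0, 360), 0 = magnetic north.
    """
    ax, ay, az = acc
    mx, my, mz = mag
    # Pitch and roll recovered from the direction of gravity.
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    # Rotate the magnetic vector back into the horizontal plane.
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch)
            + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-my_h, mx_h)) % 360.0
```

With the device held level (gravity straight down) the pitch and roll terms vanish and the heading reduces to the familiar planar `atan2` of the horizontal field components; the walking-induced errors mentioned above enter precisely because `acc` then stops being a pure gravity vector.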
Sensor Fusion Techniques

Sensor fusion improves accuracy by combining data from multiple sensors [3]. A simple form of sensor fusion is to average the input of multiple sensors that are measuring the same property, in order to average out the noise from individual sensors (e.g., averaging the measurements of multiple anemometers to ascertain wind speed). More complex techniques take advantage of known properties of different sensor types. For example, a gyroscope, magnetometer, and accelerometer might be used in tandem, where the magnetometer is used to calibrate the gyroscope whenever the sensor is at rest. Another approach is to use a Kalman filter [13], which takes multiple noisy sensor measurements, estimates the error in these measurements, and then estimates the actual state of the system being measured [4]. Programming fusion algorithms requires iterative tuning based on deep knowledge of the properties of the underlying sensors.
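The gyroscope-plus-absolute-sensor combination described above is often realized as a complementary filter: integrate the fast, drifting sensor and continually nudge the result toward the slow, absolute one. The sketch below is ours, not TREC's; units (degrees, degrees per second) and the weighting constant are assumptions.

```python
def complementary_heading(gyro_rates, compass_headings, dt, alpha=0.98):
    """Fuse gyroscope angular velocities with compass headings.

    gyro_rates: angular velocity samples in degrees/second
    compass_headings: absolute heading samples in degrees
    dt: sample period in seconds
    alpha: trust placed in the integrated gyro estimate (0..1)
    """
    heading = compass_headings[0]          # initialize from the absolute sensor
    for rate, compass in zip(gyro_rates, compass_headings):
        predicted = heading + rate * dt    # integrate angular velocity
        # Blend: mostly the smooth gyro prediction, slightly the noisy compass.
        heading = alpha * predicted + (1 - alpha) * compass
    return heading % 360.0
```

Because the compass term is weighted by `1 - alpha` at every step, gyroscope drift is bounded rather than accumulating without limit, while momentary compass disturbances are heavily damped; a Kalman filter achieves a similar effect but chooses the blending weight from explicit noise models.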

Frameworks

TREC is far from the first framework used to process input from tracking devices. VRPN provides an interface between input hardware and Virtual Reality (VR) applications. VRPN allows VR hardware peripherals to be shared across many computers on the same network, and simplifies development by providing a standardized interface for peripherals with the same functionality [12]. VRPN standardizes the sensor data being delivered to applications so that their code is not dependent on the sensors being used. It does not, however, dynamically choose which of the attached devices to use; this must be specified by the application developer. In terms of extensibility, VRPN permits the creation of new devices and device types, while also supporting layered devices that let the developer program higher-level behaviour based on input from one or more sensors.

OpenTracker supports tracking hardware with a flexible and customizable architecture [10]. It uses dataflow graphs to manage data being passed from sensors to applications. Here, device drivers act as source nodes that bring data into the system; filter nodes transform, merge, or otherwise modify data passed from source nodes; and sink nodes output the data to an application. OpenTracker has a high level of configurability, using an XML schema to define the dataflow graph and supporting custom nodes. Like VRPN, however, OpenTracker does not offer automatic configuration and choice of devices, requiring the developer to provide a configuration file that describes the exact dataflow graph and devices to use.

Ubitrack, on the other hand, is designed for automatic configuration [9]. It is meant to support large networks of sensors to provide AR tracking using all available resources, with completely dynamic configuration of dataflow networks based on Spatial Relationship Graphs (SRGs) of the sensors in the network.
There does not appear to be any way to modify or extend the algorithm used to configure the dataflow networks, or to override this algorithm to use specific devices or configurations.

The OpenInterface Project [11] has very interesting parallels in another area of HCI research. It is an open source platform for rapidly prototyping multimodal input interfaces for computer programs, with the benefit of a GUI interface. OpenInterface is designed to transform hardware input into a format suitable for the client application using modular transformation components, support a broad range of input devices, and be easily extensible. Like VRPN and OpenTracker, however, OpenInterface does not support the automatic selection of sensors and fusion algorithms, requiring explicit configuration by the developer.

All of these AR frameworks support abstraction of sensor data to standard interfaces. VRPN, OpenTracker, and OpenInterface are extensible, allowing new sensor types to be easily added. Ubitrack provides automatic sensor configuration. None of these toolkits addresses all three of these important goals. The TREC framework has been designed specifically to fill this gap.

THE TREC FRAMEWORK

TREC is a software framework for input handling in mobile AR applications. TREC's goals are to reduce the time required to develop interfaces for hardware sensors, and to reduce the difficulty of adapting AR apps to work with a particular collection of sensors. Specifically, TREC addresses the problems identified in the last section, where: 1) writing low-level interfaces to sensors distracts developers from the core application design; 2) changing the set of available sensors may break applications, requiring extensive recoding; and 3) combining the input from different sensors is tricky, requiring experimentation and iteration.

First, in order to make the application programmer's job easier, TREC abstracts all sensor data into a high-level representation of location and orientation.
TREC provides the application programmer with simple interfaces for these two properties in its abstract input layer. Second, TREC automatically configures the sensors by determining at runtime which of the connected sensors can be used to provide the best data to the application. Because the TREC layered architecture hides all differences between sensor hardware, it can provide the application plug-and-play compatibility with different sensors. Finally, TREC uses sensor fusion algorithms automatically when a supported configuration is available, and the architecture makes it easy to extend the framework to support new algorithms.

TREC uses a three-layer hierarchy (see figure 3) to abstract device details. This allows newly added sensors in the device layer to work automatically with existing applications, and fusion algorithms in the abstract device layer can take advantage of the abstracted devices automatically. The framework is open and allows access and modification at any level, meaning low-level sensor data can always be accessed directly if necessary. In summary, the 3-layered architecture of TREC allows it to provide open access to high and low levels of abstraction, makes it easier to support automatic configuration based on a hierarchy of device types, allows sensor fusion algorithms to be dynamically chosen in the same way as devices, and makes new devices and algorithms easily interoperable with existing code.

The TREC Architecture

The framework is structured around a three-layer architecture: the Abstract Input Layer, the Abstract Device Layer, and the Device Layer.

Figure 3. The TREC 3-layered architecture.

The lowest layer, the device layer, contains objects that expose the data provided directly by device sensors. Devices must implement one or more abstract device interfaces (see below). For example, the OceanServer USB Compass provides both compass and accelerometer functionality. To add a new device to the framework, a developer needs to create a device-layer class for the device.

The abstract device layer groups standard types of sensor devices, and also provides virtual sensors by fusing the data from multiple concrete devices. For example, the CompassOrientation class provides orientation data from compass-like devices, such as OSCompass, while the GyroscopeOrientation class provides orientation data from gyroscope-like devices. Meanwhile, FusedOrientation provides orientation information by fusing data from a number of devices.

Finally, the abstract input layer provides interfaces for specific input data types. In the current version of the framework, two types of input are defined: IOrientation provides head orientation data and ILocation provides position information. To access input, an application queries the TREC device manager for one of these types of input, and receives an object in return implementing the appropriate abstract input type.
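The layering can be illustrated with a small sketch. The class names follow figure 3, but the method names and the Python rendering are our own invention; the paper does not show TREC's source, and the actual framework is not necessarily written in Python.

```python
from abc import ABC, abstractmethod

# Abstract input layer: the only types an application needs to see.
class IOrientation(ABC):
    @abstractmethod
    def heading(self) -> float:
        """Current heading in degrees."""

# Abstract device layer: one interface per family of sensors.
class ICompass(ABC):
    @abstractmethod
    def raw_heading(self) -> float:
        """Heading as reported by a compass-like device."""

class CompassOrientation(IOrientation):
    """Adapts any compass-like device to the abstract input interface."""
    def __init__(self, compass: ICompass):
        self._compass = compass

    def heading(self) -> float:
        return self._compass.raw_heading() % 360.0

# Device layer: driver code for one concrete sensor.
class OSCompass(ICompass):
    """Stand-in for the OceanServer USB Compass driver."""
    def raw_heading(self) -> float:
        return 42.0  # placeholder for reading the serial port
```

An application holds only an `IOrientation`; swapping `OSCompass` for another `ICompass` implementation (or for a fused virtual device) requires no application changes, which is exactly the plug-and-play property the layering is meant to provide.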
In this way, the TREC framework shields applications from the details of individual devices or fusion algorithms, and simplifies the work of the developer by providing a simple and uniform way to access input information.

APPLYING TREC

We now show, by example, how TREC helps both application programmers creating a mobile AR system, and systems programmers adding new functionality to TREC itself.

The Application Programmer's Perspective

To illustrate how TREC helps in developing mobile AR applications, we examine the implementation of the Noisy Planet tourist application. Users of the application carry a mobile device containing a GPS, and wear a gyroscope and compass for head tracking. These sensors are hidden behind TREC's IOrientation and ILocation abstract input types. To access the abstracted sensor data, the Noisy Planet code requests an appropriate data source from TREC's Device Manager. For example, when an orientation data source is requested, the Device Manager provides an object that adheres to the IOrientation interface, with data coming from either the OSCompass or WTGyroscope device. The application cannot tell which device is supplying this data, and does not need to know, since the incoming data is in a standardized format. This allows TREC to provide the best available sensor, and allows new devices to be added to TREC without impacting application code. If the programmer for some reason preferred a compass, he/she could choose to access the CompassOrientation object in the abstract device layer to guarantee the use of a compass, or even directly access the OSCompass in the device layer to choose that specific device.

In addition to automatically choosing from available sensors, TREC can automatically use sensor fusion algorithms when sufficient sensors are available. Noisy Planet could actually receive a FusedOrientation object (from the abstract device layer) after the earlier request for an orientation object. This happens without any changes or configuration on the application side. TREC decides which device or combination of devices to use based on an internal rating mechanism, which is currently a hard-coded list of the known devices.

For comparison, we implemented a version of Noisy Planet without access to TREC. This implementation required approximately lines of code per sensor (GPS, compass, gyroscope) to handle serial input from these devices, in addition to another 80 lines to manage these sensors and perform the sensor fusion. The version using TREC requires just two method calls: one to get the current orientation, and another for the current location.

The Toolkit Developer's Perspective

TREC is an open framework, making it straightforward to add support for new devices. We now discuss our experience in adding a new hardware device and a new virtual device to TREC.

Adding a New Hardware Device

The custom code that connects to and processes data from a sensor is contained within a class at the device layer.
Based on the type of sensor, accessor methods need to be created that adhere to the proper interface at the abstract device layer (e.g., the IBlueGPS in figure 3 must implement the IGPS interface in order to automatically work with TREC applications).

Adding a Simple Sensor Fusion Algorithm

TREC's layered architecture allows easy integration of sensor fusion algorithms into applications, and allows the fusion code itself to be shielded from the details of the underlying devices. Sensor fusion algorithms are written as abstract device modules that adhere to a standard interface. Therefore, any program taking advantage of TREC's abstract input modules (IOrientation and ILocation) can automatically use the new algorithm.

For example, to implement the FusedOrientation class of figure 3, a new abstract device class is created in the abstract device layer. This class implements the IOrientation interface, and therefore can be used whenever orientation information is requested by the application. Inside FusedOrientation, a gyroscope, compass and accelerometer are used (each of which is an abstract device), and some kind of location service is used (an abstract input). Specifically, the gyroscope is used to determine orientation; the compass is used to calibrate the gyroscope from time to time, but not when the compass is moving (as determined via the accelerometer or from changes in location).

This example shows how the fusion algorithm leverages TREC's layered architecture. When necessary, the fusion algorithm knows the kind of device that it is using (gyroscope, compass, etc.), but does not need to know exactly which device is in use. Furthermore, when it is not necessary to know the kind of device (i.e., for the location service), the appropriate abstract input can be used.
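The fusion rule just described can be sketched as follows. This is a hypothetical re-creation, not TREC's code: the method names, the simplified "at rest" test (accelerometer magnitude near 1 g), and the threshold are all our assumptions.

```python
class FusedOrientation:
    """Integrate the gyroscope; re-calibrate from the compass only when
    the accelerometer indicates the user is standing still."""

    STILL_THRESHOLD = 0.05  # assumed tolerance around 1 g, in g units

    def __init__(self, gyro, compass, accel):
        self.gyro, self.compass, self.accel = gyro, compass, accel
        self.heading = compass.raw_heading()  # start from an absolute value

    def update(self, dt):
        # Dead-reckon with the gyroscope (degrees/second * seconds).
        self.heading = (self.heading + self.gyro.angular_velocity() * dt) % 360.0
        # If only gravity is acting on the accelerometer, the user is at
        # rest and the compass reading can be trusted for calibration.
        ax, ay, az = self.accel.acceleration()
        magnitude = (ax * ax + ay * ay + az * az) ** 0.5
        if abs(magnitude - 1.0) < self.STILL_THRESHOLD:
            self.heading = self.compass.raw_heading()
        return self.heading
```

Note that the class depends only on the abstract device interfaces (anything offering `angular_velocity`, `raw_heading`, `acceleration`), mirroring how the real FusedOrientation can work with any gyroscope, compass, and accelerometer registered in the device layer.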
DISCUSSION

We have shown how TREC helps developers by providing a single standardized view of sensor data, making sensor selection and sensor fusion automatic, and letting developers easily extend the framework to support new devices or custom fusion algorithms. We now discuss broader questions related to TREC's design.

Platform Standardization: Smartphone platforms are making sensors suitable for mobile AR applications broadly available. Exciting future work would involve customizing TREC around the sensor packages in common phones. The easy extensibility of TREC is important, as the sensors change between generations of phones, requiring updates to the framework. One limitation to this approach is that current phones do not provide sensors suitable for head-tracking.

Computer vision for tracking: Computer vision is widely used for pose detection in AR applications (e.g., using ARToolkit [7]), but we have not discussed its use in TREC. Vision is less immediately useful in mobile applications, due to the difficulty of placing fiducial markers in the outside environment. However, there is no inherent reason why vision could not be included as a sensor type within the framework. This would integrate into the TREC hierarchy just as any other sensor does, though requiring a new abstract device interface to be defined for vision systems.

Extending to other kinds of AR: The framework as presented is heavily influenced by the ambient audio Noisy Planet application. In Noisy Planet, head tracking is important to overlay an audio soundscape onto the real world, and this head tracking need only be in a 2D plane. TREC can easily be extended to other forms of AR, however, simply by adding new device types. For example, one common form of mobile AR involves tracking the position and orientation of a handheld device (e.g., a smartphone), so that its display can provide visual overlays onto the real world.
Here, positioning information is detected for the device, not the head, using the sensors in the device. TREC's IOrientation interface would need to be extended to provide 3D positions. This change would require additional programming, but does not represent a fundamental change to TREC's design.

Sensor management: Future work with TREC includes improving the sophistication of its sensor management capabilities. Currently, TREC's Device Manager determines which among the available set of sensors to use by traversing a ranked list of known devices. An improved device manager would automatically determine the best set of sensors to use, and would dynamically reconfigure the sensor set (e.g., in response to failure of one of the sensors). This issue of sensor management has been more thoroughly covered in the Ubitrack system [9] and in the hybrid tracking AR system of Hallaway et al. [5].

Sensor fusion: The layered architecture of TREC permits AR applications to use sensor fusion automatically. An example was shown in the last section of how a simple sensor fusion algorithm could be implemented as an abstract device. More investigation is required to assess the difficulty of implementing fusion algorithms more general than the one presented in the example. The algorithm used there does not depend on specific devices and will work with any sensors of the same type. A Kalman filter, however, may require unique knowledge about each sensor to create a good model of the system. In future work, we hope to investigate whether TREC can support these advanced algorithms without restricting their use to predetermined hardware.

CONCLUSION

In this paper we have presented TREC, a new framework for handling sensor input in mobile augmented reality applications. TREC has been designed to provide high-level abstraction of sensor data, automatic configuration, and ease of extension and manual configuration when desired. While other frameworks address some of these issues, none provides all of these features together in a cohesive package. The key to TREC's success is its three-layer architecture. An abstract input layer provides high-level input types that completely abstract the underlying hardware.
An abstract device layer collects classes of sensors (e.g., compass, accelerometer), as well as virtual devices implementing sensor fusion. Finally, a device layer provides interfaces to concrete devices. This architecture allows the abstraction of different hardware sensors into a uniform representation of location and orientation, and simplifies the automatic use of available sensors at runtime (including the automatic fusion of multiple devices). This hierarchy also makes the process of adding new devices straightforward, while providing compatibility with any applications already using TREC. The next steps for the TREC framework are adding broader support for hardware sensors, investigating the implementation of complex sensor fusion algorithms using the abstract devices in TREC, and implementing a robust sensor-management algorithm to allow better dynamic configuration of the system.

ACKNOWLEDGEMENTS

This work was funded by NSERC Strategic Project # and by the GRAND Network of Centres of Excellence. We would like to thank Claire Joly for her feedback on using TREC, and Jonathan Segel for his contributions to the design of Noisy Planet.

REFERENCES

1. Intridea Car Finder.
2. Layar Augmented Reality Browser.
3. B. Dasarathy. Sensor fusion potential exploitation: innovative architectures and illustrative applications. Proceedings of the IEEE, 85(1):24-38.
4. E. Foxlin. Inertial head-tracker sensor fusion by a complementary separate-bias Kalman filter. In IEEE VRAIS, pages , 267.
5. D. Hallaway, S. Feiner, and T. Höllerer. Bridging the gaps: Hybrid tracking for adaptive mobile augmented reality. Applied Artificial Intelligence, 25.
6. J. Hightower and G. Borriello. Location systems for ubiquitous computing. Computer, 34(8):57-66.
7. H. Kato, M. Billinghurst, I. Poupyrev, K. Imamoto, and K. Tachibana. Virtual object manipulation on a table-top AR environment. In ISAR.
8. P. Lang, A. Kusej, A. Pinz, and G. Brasseur. Inertial tracking for mobile augmented reality.
In IMTC, volume 2.
9. D. Pustka, M. Huber, C. Waechter, F. Echtler, P. Keitler, J. Newman, D. Schmalstieg, and G. Klinker. Ubitrack: Automatic configuration of pervasive sensor networks for augmented reality. IEEE Pervasive Computing, preprint, June.
10. G. Reitmayr and D. Schmalstieg. OpenTracker: A flexible software design for three-dimensional interaction. Virtual Reality, 9:79-92.
11. M. Serrano, L. Nigay, J.-Y. L. Lawson, A. Ramsay, R. Murray-Smith, and S. Denef. The OpenInterface framework: a tool for multimodal interaction. In CHI Extended Abstracts on Human Factors in Computing Systems.
12. R. M. Taylor, II, T. C. Hudson, A. Seeger, H. Weber, J. Juliano, and A. T. Helser. VRPN: a device-independent, network-transparent VR peripheral system. In VRST, pages 55-61.
13. G. Welch and G. Bishop. An introduction to the Kalman filter. Technical Report TR , Department of Computer Science, University of North Carolina at Chapel Hill.
14. M. G. Wing. Consumer-grade global positioning system (GPS) accuracy and reliability. Journal of Forestry, 103.


More information

Roadblocks for building mobile AR apps

Roadblocks for building mobile AR apps Roadblocks for building mobile AR apps Jens de Smit, Layar (jens@layar.com) Ronald van der Lingen, Layar (ronald@layar.com) Abstract At Layar we have been developing our reality browser since 2009. Our

More information

Using Intelligent Mobile Devices for Indoor Wireless Location Tracking, Navigation, and Mobile Augmented Reality

Using Intelligent Mobile Devices for Indoor Wireless Location Tracking, Navigation, and Mobile Augmented Reality Using Intelligent Mobile Devices for Indoor Wireless Location Tracking, Navigation, and Mobile Augmented Reality Chi-Chung Alan Lo, Tsung-Ching Lin, You-Chiun Wang, Yu-Chee Tseng, Lee-Chun Ko, and Lun-Chia

More information

Augmented and mixed reality (AR & MR)

Augmented and mixed reality (AR & MR) Augmented and mixed reality (AR & MR) Doug Bowman CS 5754 Based on original lecture notes by Ivan Poupyrev AR/MR example (C) 2008 Doug Bowman, Virginia Tech 2 Definitions Augmented reality: Refers to a

More information

ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems

ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems ThumbsUp: Integrated Command and Pointer Interactions for Mobile Outdoor Augmented Reality Systems Wayne Piekarski and Bruce H. Thomas Wearable Computer Laboratory School of Computer and Information Science

More information

Aerospace Sensor Suite

Aerospace Sensor Suite Aerospace Sensor Suite ECE 1778 Creative Applications for Mobile Devices Final Report prepared for Dr. Jonathon Rose April 12 th 2011 Word count: 2351 + 490 (Apper Context) Jin Hyouk (Paul) Choi: 998495640

More information

GPS System Design and Control Modeling. Chua Shyan Jin, Ronald. Assoc. Prof Gerard Leng. Aeronautical Engineering Group, NUS

GPS System Design and Control Modeling. Chua Shyan Jin, Ronald. Assoc. Prof Gerard Leng. Aeronautical Engineering Group, NUS GPS System Design and Control Modeling Chua Shyan Jin, Ronald Assoc. Prof Gerard Leng Aeronautical Engineering Group, NUS Abstract A GPS system for the autonomous navigation and surveillance of an airship

More information

Towards a framework for the rapid prototyping of physical interaction

Towards a framework for the rapid prototyping of physical interaction Towards a framework for the rapid prototyping of physical interaction Universidad Carlos III de Madrid Avenida de la Universidad 30, 28911, Leganés, Madrid, Spain abellucc@inf.uc3m.es, amalizia@inf.uc3m.es

More information

Ubiquitous Positioning: A Pipe Dream or Reality?

Ubiquitous Positioning: A Pipe Dream or Reality? Ubiquitous Positioning: A Pipe Dream or Reality? Professor Terry Moore The University of What is Ubiquitous Positioning? Multi-, low-cost and robust positioning Based on single or multiple users Different

More information

AR Glossary. Terms. AR Glossary 1

AR Glossary. Terms. AR Glossary 1 AR Glossary Every domain has specialized terms to express domain- specific meaning and concepts. Many misunderstandings and errors can be attributed to improper use or poorly defined terminology. The Augmented

More information

OughtToPilot. Project Report of Submission PC128 to 2008 Propeller Design Contest. Jason Edelberg

OughtToPilot. Project Report of Submission PC128 to 2008 Propeller Design Contest. Jason Edelberg OughtToPilot Project Report of Submission PC128 to 2008 Propeller Design Contest Jason Edelberg Table of Contents Project Number.. 3 Project Description.. 4 Schematic 5 Source Code. Attached Separately

More information

Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback

Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback Jung Wook Park HCI Institute Carnegie Mellon University 5000 Forbes Avenue Pittsburgh, PA, USA, 15213 jungwoop@andrew.cmu.edu

More information

Robust Positioning for Urban Traffic

Robust Positioning for Urban Traffic Robust Positioning for Urban Traffic Motivations and Activity plan for the WG 4.1.4 Dr. Laura Ruotsalainen Research Manager, Department of Navigation and positioning Finnish Geospatial Research Institute

More information

Augmented and Virtual Reality

Augmented and Virtual Reality CS-3120 Human-Computer Interaction Augmented and Virtual Reality Mikko Kytö 7.11.2017 From Real to Virtual [1] Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE TRANSACTIONS

More information

MIRACLE: Mixed Reality Applications for City-based Leisure and Experience. Mark Billinghurst HIT Lab NZ October 2009

MIRACLE: Mixed Reality Applications for City-based Leisure and Experience. Mark Billinghurst HIT Lab NZ October 2009 MIRACLE: Mixed Reality Applications for City-based Leisure and Experience Mark Billinghurst HIT Lab NZ October 2009 Looking to the Future Mobile devices MIRACLE Project Goal: Explore User Generated

More information

FLCS V2.1. AHRS, Autopilot, Gyro Stabilized Gimbals Control, Ground Control Station

FLCS V2.1. AHRS, Autopilot, Gyro Stabilized Gimbals Control, Ground Control Station AHRS, Autopilot, Gyro Stabilized Gimbals Control, Ground Control Station The platform provides a high performance basis for electromechanical system control. Originally designed for autonomous aerial vehicle

More information

Improved Pedestrian Navigation Based on Drift-Reduced NavChip MEMS IMU

Improved Pedestrian Navigation Based on Drift-Reduced NavChip MEMS IMU Improved Pedestrian Navigation Based on Drift-Reduced NavChip MEMS IMU Eric Foxlin Aug. 3, 2009 WPI Workshop on Precision Indoor Personnel Location and Tracking for Emergency Responders Outline Summary

More information

3DM -CV5-10 LORD DATASHEET. Inertial Measurement Unit (IMU) Product Highlights. Features and Benefits. Applications. Best in Class Performance

3DM -CV5-10 LORD DATASHEET. Inertial Measurement Unit (IMU) Product Highlights. Features and Benefits. Applications. Best in Class Performance LORD DATASHEET 3DM -CV5-10 Inertial Measurement Unit (IMU) Product Highlights Triaxial accelerometer, gyroscope, and sensors achieve the optimal combination of measurement qualities Smallest, lightest,

More information

10/18/2010. Focus. Information technology landscape

10/18/2010. Focus. Information technology landscape Emerging Tools to Enable Construction Engineering Construction Engineering Conference: Opportunity and Vision for Education, Practice, and Research Blacksburg, VA October 1, 2010 A. B. Cleveland, Jr. Senior

More information

Indoor localization using NFC and mobile sensor data corrected using neural net

Indoor localization using NFC and mobile sensor data corrected using neural net Proceedings of the 9 th International Conference on Applied Informatics Eger, Hungary, January 29 February 1, 2014. Vol. 2. pp. 163 169 doi: 10.14794/ICAI.9.2014.2.163 Indoor localization using NFC and

More information

Indoor Floorplan with WiFi Coverage Map Android Application

Indoor Floorplan with WiFi Coverage Map Android Application Indoor Floorplan with WiFi Coverage Map Android Application Zeying Xin Electrical Engineering and Computer Sciences University of California at Berkeley Technical Report No. UCB/EECS-2013-114 http://www.eecs.berkeley.edu/pubs/techrpts/2013/eecs-2013-114.html

More information

Indoor Localization and Tracking using Wi-Fi Access Points

Indoor Localization and Tracking using Wi-Fi Access Points Indoor Localization and Tracking using Wi-Fi Access Points Dubal Omkar #1,Prof. S. S. Koul *2. Department of Information Technology,Smt. Kashibai Navale college of Eng. Pune-41, India. Abstract Location

More information

Attitude and Heading Reference Systems

Attitude and Heading Reference Systems Attitude and Heading Reference Systems FY-AHRS-2000B Installation Instructions V1.0 Guilin FeiYu Electronic Technology Co., Ltd Addr: Rm. B305,Innovation Building, Information Industry Park,ChaoYang Road,Qi

More information

Satellite and Inertial Attitude. A presentation by Dan Monroe and Luke Pfister Advised by Drs. In Soo Ahn and Yufeng Lu

Satellite and Inertial Attitude. A presentation by Dan Monroe and Luke Pfister Advised by Drs. In Soo Ahn and Yufeng Lu Satellite and Inertial Attitude and Positioning System A presentation by Dan Monroe and Luke Pfister Advised by Drs. In Soo Ahn and Yufeng Lu Outline Project Introduction Theoretical Background Inertial

More information

Interior Design using Augmented Reality Environment

Interior Design using Augmented Reality Environment Interior Design using Augmented Reality Environment Kalyani Pampattiwar 2, Akshay Adiyodi 1, Manasvini Agrahara 1, Pankaj Gamnani 1 Assistant Professor, Department of Computer Engineering, SIES Graduate

More information

AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS. Nuno Sousa Eugénio Oliveira

AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS. Nuno Sousa Eugénio Oliveira AGENT PLATFORM FOR ROBOT CONTROL IN REAL-TIME DYNAMIC ENVIRONMENTS Nuno Sousa Eugénio Oliveira Faculdade de Egenharia da Universidade do Porto, Portugal Abstract: This paper describes a platform that enables

More information

Cooperative localization (part I) Jouni Rantakokko

Cooperative localization (part I) Jouni Rantakokko Cooperative localization (part I) Jouni Rantakokko Cooperative applications / approaches Wireless sensor networks Robotics Pedestrian localization First responders Localization sensors - Small, low-cost

More information

ABSTRACT. Keywords Virtual Reality, Java, JavaBeans, C++, CORBA 1. INTRODUCTION

ABSTRACT. Keywords Virtual Reality, Java, JavaBeans, C++, CORBA 1. INTRODUCTION Tweek: Merging 2D and 3D Interaction in Immersive Environments Patrick L Hartling, Allen D Bierbaum, Carolina Cruz-Neira Virtual Reality Applications Center, 2274 Howe Hall Room 1620, Iowa State University

More information

Enhancing Shipboard Maintenance with Augmented Reality

Enhancing Shipboard Maintenance with Augmented Reality Enhancing Shipboard Maintenance with Augmented Reality CACI Oxnard, CA Dennis Giannoni dgiannoni@caci.com (805) 288-6630 INFORMATION DEPLOYED. SOLUTIONS ADVANCED. MISSIONS ACCOMPLISHED. Agenda Virtual

More information

Toward an Augmented Reality System for Violin Learning Support

Toward an Augmented Reality System for Violin Learning Support Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp

More information

INTRODUCTION TO VEHICLE NAVIGATION SYSTEM LECTURE 5.1 SGU 4823 SATELLITE NAVIGATION

INTRODUCTION TO VEHICLE NAVIGATION SYSTEM LECTURE 5.1 SGU 4823 SATELLITE NAVIGATION INTRODUCTION TO VEHICLE NAVIGATION SYSTEM LECTURE 5.1 SGU 4823 SATELLITE NAVIGATION AzmiHassan SGU4823 SatNav 2012 1 Navigation Systems Navigation ( Localisation ) may be defined as the process of determining

More information

Tracking in Unprepared Environments for Augmented Reality Systems

Tracking in Unprepared Environments for Augmented Reality Systems Tracking in Unprepared Environments for Augmented Reality Systems Ronald Azuma HRL Laboratories 3011 Malibu Canyon Road, MS RL96 Malibu, CA 90265-4799, USA azuma@hrl.com Jong Weon Lee, Bolan Jiang, Jun

More information

Sensing and Perception: Localization and positioning. by Isaac Skog

Sensing and Perception: Localization and positioning. by Isaac Skog Sensing and Perception: Localization and positioning by Isaac Skog Outline Basic information sources and performance measurements. Motion and positioning sensors. Positioning and motion tracking technologies.

More information

On Attitude Estimation with Smartphones

On Attitude Estimation with Smartphones On Attitude Estimation with Smartphones Thibaud Michel Pierre Genevès Hassen Fourati Nabil Layaïda Université Grenoble Alpes, INRIA LIG, GIPSA-Lab, CNRS March 16 th, 2017 http://tyrex.inria.fr/mobile/benchmarks-attitude

More information

Automated Virtual Observation Therapy

Automated Virtual Observation Therapy Automated Virtual Observation Therapy Yin-Leng Theng Nanyang Technological University tyltheng@ntu.edu.sg Owen Noel Newton Fernando Nanyang Technological University fernando.onn@gmail.com Chamika Deshan

More information

3DM-GX4-45 LORD DATASHEET. GPS-Aided Inertial Navigation System (GPS/INS) Product Highlights. Features and Benefits. Applications

3DM-GX4-45 LORD DATASHEET. GPS-Aided Inertial Navigation System (GPS/INS) Product Highlights. Features and Benefits. Applications LORD DATASHEET 3DM-GX4-45 GPS-Aided Inertial Navigation System (GPS/INS) Product Highlights High performance integd GPS receiver and MEMS sensor technology provide direct and computed PVA outputs in a

More information

Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain

Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain Technical Disclosure Commons Defensive Publications Series October 02, 2017 Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain Adam Glazier Nadav Ashkenazi Matthew

More information

Indoor Positioning 101 TECHNICAL)WHITEPAPER) SenionLab)AB) Teknikringen)7) 583)30)Linköping)Sweden)

Indoor Positioning 101 TECHNICAL)WHITEPAPER) SenionLab)AB) Teknikringen)7) 583)30)Linköping)Sweden) Indoor Positioning 101 TECHNICAL)WHITEPAPER) SenionLab)AB) Teknikringen)7) 583)30)Linköping)Sweden) TechnicalWhitepaper)) Satellite-based GPS positioning systems provide users with the position of their

More information

Job Description. Commitment: Must be available to work full-time hours, M-F for weeks beginning Summer of 2018.

Job Description. Commitment: Must be available to work full-time hours, M-F for weeks beginning Summer of 2018. Research Intern Director of Research We are seeking a summer intern to support the team to develop prototype 3D sensing systems based on state-of-the-art sensing technologies along with computer vision

More information

A Novel Tracking System for AAL Based on Smartphone Technology. DIEEI University of Catania, Italy

A Novel Tracking System for AAL Based on Smartphone Technology. DIEEI University of Catania, Italy A Novel Tracking System for AAL Based on Smartphone Technology DIEEI University of Catania, Italy Outline AAL research activities at DIEEI The proposed methodology for user tracking The proposed methodology

More information

Mobile Audio Designs Monkey: A Tool for Audio Augmented Reality

Mobile Audio Designs Monkey: A Tool for Audio Augmented Reality Mobile Audio Designs Monkey: A Tool for Audio Augmented Reality Bruce N. Walker and Kevin Stamper Sonification Lab, School of Psychology Georgia Institute of Technology 654 Cherry Street, Atlanta, GA,

More information

Robotic Vehicle Design

Robotic Vehicle Design Robotic Vehicle Design Sensors, measurements and interfacing Jim Keller July 19, 2005 Sensor Design Types Topology in system Specifications/Considerations for Selection Placement Estimators Summary Sensor

More information

Interior Design with Augmented Reality

Interior Design with Augmented Reality Interior Design with Augmented Reality Ananda Poudel and Omar Al-Azzam Department of Computer Science and Information Technology Saint Cloud State University Saint Cloud, MN, 56301 {apoudel, oalazzam}@stcloudstate.edu

More information

School of Computer and Information Science

School of Computer and Information Science School of Computer and Information Science CIS Research Placement Report Augmented Reality on the Android Mobile Platform Jan-Felix Schmakeit Date: 08/11/2009 Supervisor: Professor Bruce Thomas Abstract

More information

INDOOR HEADING MEASUREMENT SYSTEM

INDOOR HEADING MEASUREMENT SYSTEM INDOOR HEADING MEASUREMENT SYSTEM Marius Malcius Department of Research and Development AB Prospero polis, Lithuania m.malcius@orodur.lt Darius Munčys Department of Research and Development AB Prospero

More information

SELECTING THE OPTIMAL MOTION TRACKER FOR MEDICAL TRAINING SIMULATORS

SELECTING THE OPTIMAL MOTION TRACKER FOR MEDICAL TRAINING SIMULATORS SELECTING THE OPTIMAL MOTION TRACKER FOR MEDICAL TRAINING SIMULATORS What 40 Years in Simulation Has Taught Us About Fidelity, Performance, Reliability and Creating a Commercially Successful Simulator.

More information

Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces

Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Huidong Bai The HIT Lab NZ, University of Canterbury, Christchurch, 8041 New Zealand huidong.bai@pg.canterbury.ac.nz Lei

More information

Ubiquitous Home Simulation Using Augmented Reality

Ubiquitous Home Simulation Using Augmented Reality Proceedings of the 2007 WSEAS International Conference on Computer Engineering and Applications, Gold Coast, Australia, January 17-19, 2007 112 Ubiquitous Home Simulation Using Augmented Reality JAE YEOL

More information

PHINS, An All-In-One Sensor for DP Applications

PHINS, An All-In-One Sensor for DP Applications DYNAMIC POSITIONING CONFERENCE September 28-30, 2004 Sensors PHINS, An All-In-One Sensor for DP Applications Yves PATUREL IXSea (Marly le Roi, France) ABSTRACT DP positioning sensors are mainly GPS receivers

More information

PERSONS AND OBJECTS LOCALIZATION USING SENSORS

PERSONS AND OBJECTS LOCALIZATION USING SENSORS Investe}te în oameni! FONDUL SOCIAL EUROPEAN Programul Operational Sectorial pentru Dezvoltarea Resurselor Umane 2007-2013 eng. Lucian Ioan IOZAN PhD Thesis Abstract PERSONS AND OBJECTS LOCALIZATION USING

More information

Intelligent Robotics Sensors and Actuators

Intelligent Robotics Sensors and Actuators Intelligent Robotics Sensors and Actuators Luís Paulo Reis (University of Porto) Nuno Lau (University of Aveiro) The Perception Problem Do we need perception? Complexity Uncertainty Dynamic World Detection/Correction

More information

ISCW 2001 Tutorial. An Introduction to Augmented Reality

ISCW 2001 Tutorial. An Introduction to Augmented Reality ISCW 2001 Tutorial An Introduction to Augmented Reality Mark Billinghurst Human Interface Technology Laboratory University of Washington, Seattle grof@hitl.washington.edu Dieter Schmalstieg Technical University

More information

Recent Progress on Augmented-Reality Interaction in AIST

Recent Progress on Augmented-Reality Interaction in AIST Recent Progress on Augmented-Reality Interaction in AIST Takeshi Kurata ( チョヌン ) ( イムニダ ) Augmented Reality Interaction Subgroup Real-World Based Interaction Group Information Technology Research Institute,

More information

Analysis of a Kalman Approach for a Pedestrian Positioning System in Indoor Environments

Analysis of a Kalman Approach for a Pedestrian Positioning System in Indoor Environments Analysis of a Kalman Approach for a Pedestrian Positioning System in Indoor Environments Edith Pulido Herrera 1, Ricardo Quirós 1, and Hannes Kaufmann 2 1 Universitat Jaume I, Castellón, Spain, pulido@lsi.uji.es,

More information

OBSTACLE DETECTION AND COLLISION AVOIDANCE USING ULTRASONIC DISTANCE SENSORS FOR AN AUTONOMOUS QUADROCOPTER

OBSTACLE DETECTION AND COLLISION AVOIDANCE USING ULTRASONIC DISTANCE SENSORS FOR AN AUTONOMOUS QUADROCOPTER OBSTACLE DETECTION AND COLLISION AVOIDANCE USING ULTRASONIC DISTANCE SENSORS FOR AN AUTONOMOUS QUADROCOPTER Nils Gageik, Thilo Müller, Sergio Montenegro University of Würzburg, Aerospace Information Technology

More information

ReVRSR: Remote Virtual Reality for Service Robots

ReVRSR: Remote Virtual Reality for Service Robots ReVRSR: Remote Virtual Reality for Service Robots Amel Hassan, Ahmed Ehab Gado, Faizan Muhammad March 17, 2018 Abstract This project aims to bring a service robot s perspective to a human user. We believe

More information

COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES.

COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES. COLLABORATION WITH TANGIBLE AUGMENTED REALITY INTERFACES. Mark Billinghurst a, Hirokazu Kato b, Ivan Poupyrev c a Human Interface Technology Laboratory, University of Washington, Box 352-142, Seattle,

More information

Robotic Vehicle Design

Robotic Vehicle Design Robotic Vehicle Design Sensors, measurements and interfacing Jim Keller July 2008 1of 14 Sensor Design Types Topology in system Specifications/Considerations for Selection Placement Estimators Summary

More information

Technology Challenges and Opportunities in Indoor Location. Doug Rowitch, Qualcomm, San Diego

Technology Challenges and Opportunities in Indoor Location. Doug Rowitch, Qualcomm, San Diego PAGE 1 qctconnect.com Technology Challenges and Opportunities in Indoor Location Doug Rowitch, Qualcomm, San Diego 2 nd Invitational Workshop on Opportunistic RF Localization for Future Directions, Technologies,

More information

Virtual Reality Calendar Tour Guide

Virtual Reality Calendar Tour Guide Technical Disclosure Commons Defensive Publications Series October 02, 2017 Virtual Reality Calendar Tour Guide Walter Ianneo Follow this and additional works at: http://www.tdcommons.org/dpubs_series

More information

MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES

MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES INTERNATIONAL CONFERENCE ON ENGINEERING AND PRODUCT DESIGN EDUCATION 4 & 5 SEPTEMBER 2008, UNIVERSITAT POLITECNICA DE CATALUNYA, BARCELONA, SPAIN MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL

More information

GPS Waypoint Application

GPS Waypoint Application GPS Waypoint Application Kris Koiner, Haytham ElMiligi and Fayez Gebali Department of Electrical and Computer Engineering University of Victoria Victoria, BC, Canada Email: {kkoiner, haytham, fayez}@ece.uvic.ca

More information

NavShoe Pedestrian Inertial Navigation Technology Brief

NavShoe Pedestrian Inertial Navigation Technology Brief NavShoe Pedestrian Inertial Navigation Technology Brief Eric Foxlin Aug. 8, 2006 WPI Workshop on Precision Indoor Personnel Location and Tracking for Emergency Responders The Problem GPS doesn t work indoors

More information

Knowledge Acquisition and Representation in Facility Management

Knowledge Acquisition and Representation in Facility Management 2016 International Conference on Computational Science and Computational Intelligence Knowledge Acquisition and Representation in Facility Management Facility Management with Semantic Technologies and

More information

Behaviour-Based Control. IAR Lecture 5 Barbara Webb

Behaviour-Based Control. IAR Lecture 5 Barbara Webb Behaviour-Based Control IAR Lecture 5 Barbara Webb Traditional sense-plan-act approach suggests a vertical (serial) task decomposition Sensors Actuators perception modelling planning task execution motor

More information

IoT. Indoor Positioning with BLE Beacons. Author: Uday Agarwal

IoT. Indoor Positioning with BLE Beacons. Author: Uday Agarwal IoT Indoor Positioning with BLE Beacons Author: Uday Agarwal Contents Introduction 1 Bluetooth Low Energy and RSSI 2 Factors Affecting RSSI 3 Distance Calculation 4 Approach to Indoor Positioning 5 Zone

More information

Recent Progress on Wearable Augmented Interaction at AIST

Recent Progress on Wearable Augmented Interaction at AIST Recent Progress on Wearable Augmented Interaction at AIST Takeshi Kurata 12 1 Human Interface Technology Lab University of Washington 2 AIST, Japan kurata@ieee.org Weavy The goal of the Weavy project team

More information

SELF-BALANCING MOBILE ROBOT TILTER

SELF-BALANCING MOBILE ROBOT TILTER Tomislav Tomašić Andrea Demetlika Prof. dr. sc. Mladen Crneković ISSN xxx-xxxx SELF-BALANCING MOBILE ROBOT TILTER Summary UDC 007.52, 62-523.8 In this project a remote controlled self-balancing mobile

More information

AR 2 kanoid: Augmented Reality ARkanoid

AR 2 kanoid: Augmented Reality ARkanoid AR 2 kanoid: Augmented Reality ARkanoid B. Smith and R. Gosine C-CORE and Memorial University of Newfoundland Abstract AR 2 kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular

More information

A Survey of Mobile Augmentation for Mobile Augmented Reality System

A Survey of Mobile Augmentation for Mobile Augmented Reality System A Survey of Mobile Augmentation for Mobile Augmented Reality System Mr.A.T.Vasaya 1, Mr.A.S.Gohil 2 1 PG Student, C.U.Shah College of Engineering and Technology, Gujarat, India 2 Asst.Proffesor, Sir Bhavsinhji

More information

A 3D Ubiquitous Multi-Platform Localization and Tracking System for Smartphones. Seyyed Mahmood Jafari Sadeghi

A 3D Ubiquitous Multi-Platform Localization and Tracking System for Smartphones. Seyyed Mahmood Jafari Sadeghi A 3D Ubiquitous Multi-Platform Localization and Tracking System for Smartphones by Seyyed Mahmood Jafari Sadeghi A thesis submitted in conformity with the requirements for the degree of Doctor of Philosophy

More information

Vicki Niu, MacLean Freed, Ethan Takla, Ida Chow and Jeffery Wang Lincoln High School, Portland, OR gmail.com

Vicki Niu, MacLean Freed, Ethan Takla, Ida Chow and Jeffery Wang Lincoln High School, Portland, OR gmail.com Vicki Niu, MacLean Freed, Ethan Takla, Ida Chow and Jeffery Wang Lincoln High School, Portland, OR Nanites4092 @ gmail.com Outline Learning STEM through robotics Our journey from FIRST LEGO League to FIRST

More information

Attitude Determination. - Using GPS

Attitude Determination. - Using GPS Attitude Determination - Using GPS Table of Contents Definition of Attitude Attitude and GPS Attitude Representations Least Squares Filter Kalman Filter Other Filters The AAU Testbed Results Conclusion

More information

Implementation of Kalman Filter on PSoC-5 Microcontroller for Mobile Robot Localization

Implementation of Kalman Filter on PSoC-5 Microcontroller for Mobile Robot Localization Journal of Communication and Computer 11(2014) 469-477 doi: 10.17265/1548-7709/2014.05 007 D DAVID PUBLISHING Implementation of Kalman Filter on PSoC-5 Microcontroller for Mobile Robot Localization Garth

More information

A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems

A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems F. Steinicke, G. Bruder, H. Frenz 289 A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems Frank Steinicke 1, Gerd Bruder 1, Harald Frenz 2 1 Institute of Computer Science,

More information

A SURVEY OF MOBILE APPLICATION USING AUGMENTED REALITY

A SURVEY OF MOBILE APPLICATION USING AUGMENTED REALITY Volume 117 No. 22 2017, 209-213 ISSN: 1311-8080 (printed version); ISSN: 1314-3395 (on-line version) url: http://www.ijpam.eu ijpam.eu A SURVEY OF MOBILE APPLICATION USING AUGMENTED REALITY Mrs.S.Hemamalini

More information

Geo-Located Content in Virtual and Augmented Reality

Geo-Located Content in Virtual and Augmented Reality Technical Disclosure Commons Defensive Publications Series October 02, 2017 Geo-Located Content in Virtual and Augmented Reality Thomas Anglaret Follow this and additional works at: http://www.tdcommons.org/dpubs_series

More information

1 General Information... 2

1 General Information... 2 Release Note Topic : u-blox M8 Flash Firmware 3.01 UDR 1.00 UBX-16009439 Author : ahaz, yste, amil Date : 01 June 2016 We reserve all rights in this document and in the information contained therein. Reproduction,

More information

Indoor navigation with smartphones

Indoor navigation with smartphones Indoor navigation with smartphones REinEU2016 Conference September 22 2016 PAVEL DAVIDSON Outline Indoor navigation system for smartphone: goals and requirements WiFi based positioning Application of BLE

More information

Figure 1. The game was developed to be played on a large multi-touch tablet and multiple smartphones.

Figure 1. The game was developed to be played on a large multi-touch tablet and multiple smartphones. Capture The Flag: Engaging In A Multi- Device Augmented Reality Game Suzanne Mueller Massachusetts Institute of Technology Cambridge, MA suzmue@mit.edu Andreas Dippon Technische Universitat München Boltzmannstr.

More information

VEWL: A Framework for Building a Windowing Interface in a Virtual Environment Daniel Larimer and Doug A. Bowman Dept. of Computer Science, Virginia Tech, 660 McBryde, Blacksburg, VA dlarimer@vt.edu, bowman@vt.edu

More information

Utility of Sensor Fusion of GPS and Motion Sensor in Android Devices In GPS- Deprived Environment

Utility of Sensor Fusion of GPS and Motion Sensor in Android Devices In GPS- Deprived Environment Utility of Sensor Fusion of GPS and Motion Sensor in Android Devices In GPS- Deprived Environment Amrit Karmacharya1 1 Land Management Training Center Bakhundol, Dhulikhel, Kavre, Nepal Tel:- +977-9841285489

More information

ARDUINO BASED CALIBRATION OF AN INERTIAL SENSOR IN VIEW OF A GNSS/IMU INTEGRATION

ARDUINO BASED CALIBRATION OF AN INERTIAL SENSOR IN VIEW OF A GNSS/IMU INTEGRATION Journal of Young Scientist, Volume IV, 2016 ISSN 2344-1283; ISSN CD-ROM 2344-1291; ISSN Online 2344-1305; ISSN-L 2344 1283 ARDUINO BASED CALIBRATION OF AN INERTIAL SENSOR IN VIEW OF A GNSS/IMU INTEGRATION

More information

Capacitive Face Cushion for Smartphone-Based Virtual Reality Headsets

Capacitive Face Cushion for Smartphone-Based Virtual Reality Headsets Technical Disclosure Commons Defensive Publications Series November 22, 2017 Face Cushion for Smartphone-Based Virtual Reality Headsets Samantha Raja Alejandra Molina Samuel Matson Follow this and additional

More information

idocent: Indoor Digital Orientation Communication and Enabling Navigational Technology

idocent: Indoor Digital Orientation Communication and Enabling Navigational Technology idocent: Indoor Digital Orientation Communication and Enabling Navigational Technology Final Proposal Team #2 Gordie Stein Matt Gottshall Jacob Donofrio Andrew Kling Facilitator: Michael Shanblatt Sponsor:

More information

Input devices and interaction. Ruth Aylett

Input devices and interaction. Ruth Aylett Input devices and interaction Ruth Aylett Contents Tracking What is available Devices Gloves, 6 DOF mouse, WiiMote Why is it important? Interaction is basic to VEs We defined them as interactive in real-time

More information