Annotation Overlay with a Wearable Computer Using Augmented Reality


Ryuhei Tenmoku†, Masayuki Kanbara†, Naokazu Yokoya†, and Haruo Takemura‡

† Graduate School of Information Science, Nara Institute of Science and Technology
8916-5 Takayama-cho, Ikoma-shi, Nara, 630-0101, JAPAN
{ryuhie-t, masay-ka, yokoya}@is.aist-nara.ac.jp
‡ Cybermedia Center, Osaka University
takemura@cmc.osaka-u.ac.jp

Abstract

This paper describes an augmented reality system with a wearable computer that overlays annotations related to the real world. To realize an augmented reality system, the position of the user's viewpoint is needed to establish the relationship between the real and virtual coordinate systems. However, it is difficult to measure the user's viewpoint accurately in both indoor and outdoor scenes. In this paper, a wearable augmented reality system that can overlay annotations on the real scene is developed using RFID tags and a tag reader. The system measures the user's position from RFID tags set up at appointed positions where the user is likely to need information about the real scene, for example, in front of a guide plate or at a branch point. The feasibility of the system has been successfully demonstrated through experiments.

Keywords: Wearable Computer, Augmented Reality, Annotation Overlay, RFID tag

1. Introduction

Since computers have made great strides in recent years, wearable computers that a user can carry have become a reality [5, 6]. At the same time, augmented reality, which merges the real and virtual worlds, has received a great deal of attention as a new method for displaying information and for increasing the reality of virtual environments [2]. If augmented reality is combined with a wearable computer, many applications can be realized, such as navigation guides for museums or sightseeing areas.

To realize an augmented reality system, the position and orientation of the user's viewpoint are needed to establish the relationship between the real and virtual coordinate systems; this is called the registration problem. However, it is difficult to accurately measure the position of the user's viewpoint with a wearable sensor that can be used in both indoor and outdoor scenes. Magnetic trackers, which are usually used for indoor AR or VR systems, are difficult to use over a wide measurement range such as an outdoor environment. A global positioning system (GPS) is suitable for outdoor use but cannot be used indoors, where the satellites cannot be directly viewed.

In this paper, a wearable augmented reality system that can overlay annotations on the real scene is developed using RFID tags and a tag reader. The system detects the user's approximate position by reading an RFID tag placed at an appointed position where the user needs information about the real scene, for example, in front of a guide plate or at a branch point. The orientation of the user's viewpoint is measured by an inertial sensor attached to the user's head, as shown in Figure 1. In addition, the system estimates the drift error, which causes orientation error in the inertial sensor, from images captured at the user's viewpoint.

Figure 1. Appearance of the wearable augmented reality system.

This paper is structured as follows. Section 2 describes the wearable augmented reality system using RFID tags and a tag reader. Section 3 describes an experiment with a prototype system. Section 4 describes an idea for replacing the RFID tags with IrDA markers. Finally, Section 5 summarizes the presented work.

Figure 2. An example of position estimation using RFID tags (tags placed outdoors and indoors are read by a wearable tag reader).

2. Wearable Augmented Reality System

In this section, we describe a prototype wearable augmented reality system that can overlay information related to the real scene. The system is developed so that it can be used as a navigation guide. We assume that a user of the system navigates both outdoor and indoor environments. The system shows the user the appropriate direction toward his/her goal by presenting useful information in his/her view.

2.1. Localization using RFID tags

To make the system usable in both indoor and outdoor environments, we chose RFID tags embedded in the environment through which the user navigates. An RFID tag is a small ID tag whose ID can be read by radio communication: a reader does not have to make physical contact with a tag to read its data. The tag itself has no power source and can be embedded in various places. Typically, a tag can hold up to several kilobytes of data. In our implementation, an RFID tag reader attached to the user reads out the ID information, from which the system detects where the user is located in the environment.

One possible placement of RFID tags is to embed them in floors, roads, and pavements and to put a reader in the sole of the user's shoe (Fig. 3). This would give the system relatively dense position information about the user. Since an RFID tag is a read-write device, a user of the system may leave a footprint in the tags so that other users can trace him/her, or so that statistics such as how many people have visited a place can be collected automatically. A user may also add information about the place to an RFID tag, such as a rating of the restaurant at that spot; such information can then be shared with other users coming to the same place.

In our prototype system, we place the RFID tag reader on the user's wrist, assuming that the user will explicitly reach for the RFID tags placed in the experimental environment. Reading a tag gives the system a location ID, which it maps to the annotation information stored in its database.
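As an illustration only (this sketch is not from the paper; the reader callback, tag IDs, and database entries are hypothetical), the tag-ID lookup described above might be organized as follows in Python:

```python
# Minimal sketch of mapping an RFID tag ID read at an appointed point to a
# location and its annotation. All IDs and entries are illustrative.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Annotation:
    location: str        # human-readable place name
    text: str            # annotation shown in the user's view
    heading_deg: float   # world direction the annotation faces

# Hypothetical annotation database keyed by tag ID.
ANNOTATION_DB = {
    "TAG-0001": Annotation("Main gate", "Campus entrance: go straight ahead", 90.0),
    "TAG-0002": Annotation("Entrance hall", "Take the stairs to the 3rd floor", 180.0),
    "TAG-0003": Annotation("3F hallway", "Yokoya laboratory is on your right", 270.0),
}

def lookup_annotation(tag_id: str) -> Optional[Annotation]:
    """Return the annotation for a tag ID, or None if the tag is unknown."""
    return ANNOTATION_DB.get(tag_id)

def on_tag_read(tag_id: str) -> None:
    """Callback invoked whenever the wrist-mounted reader reports a tag."""
    annotation = lookup_annotation(tag_id)
    if annotation is None:
        return  # unknown tag: keep the previous position estimate
    print(f"User is near '{annotation.location}': {annotation.text}")

if __name__ == "__main__":
    on_tag_read("TAG-0001")   # simulated read at the main gate
```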
2.2. Hardware Configuration

Figure 3 shows the hardware configuration of the proposed wearable augmented reality system. The system consists of three parts: sensors, a computer, and a display device.

Sensors. The system has the following three sensors; they transmit data to the computer through USB connections.

- RFID tag reader: the RFID tag reader (OMRON V720) recognizes an RFID tag by reading the data embedded in it. In the system, the user's position is determined by reading an RFID tag placed at an appointed point.
- Inertial sensor: the inertial sensor (InterSense InterTrax2) measures the orientation of the user's view.
- CCD camera: the camera (I-O DATA USB-CCD) attached near the user's viewpoint captures a real scene similar to the user's view.

Computer. The computer generates annotation overlay images by determining the position and orientation of the user's viewpoint from the sensor data. In this system, a notebook PC (DELL Inspiron 8100, Pentium III 1.2 GHz CPU, 512 Mbytes of memory) is used; the user carries it in a shoulder bag. The computer holds a database of annotation information.

Display device. A small display device (MicroOptical VGA Clip-On Display) is attached to a pair of glasses as shown in Figure 3. The user sees the annotation overlay image through this device.

Figure 3. Hardware configuration of the wearable augmented reality system (the CCD camera, tag reader, and inertial sensor provide position and orientation to the computer, which composes annotations from its database onto the display device).

2.3. Algorithm for Generating Annotation Overlay Images

Figure 4 shows the flowchart of the proposed system. The sensor data are fed into the computer. First, the user's position is determined using the RFID tag reader. Next, since the data from the inertial sensor include a drift error, the user's orientation is estimated by correcting the inertial sensor data. Finally, the annotation overlay image is generated using the position and orientation of the user's viewpoint and is output to the display device worn by the user.

Figure 4. Flow diagram of the prototype system (determination of the user's position; estimation of drift error by detection of natural features, feature tracking, and calculation of the drift error; annotation overlay).
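To make the flow of Figure 4 concrete, the following toy sketch (not from the paper; every function passed in is a hypothetical stand-in for the real sensor, database, and display interfaces) shows one pass through the pipeline: position from the RFID tag, drift-corrected orientation from the inertial sensor, and overlay rendering.

```python
# Sketch of one pass through the Figure 4 flow, with the device-specific steps
# injected as callables (all names here are hypothetical placeholders).

def process_frame(read_tag, lookup_annotation, read_yaw_deg, current_drift_deg,
                  capture_frame, render_overlay, show):
    tag_id = read_tag()                       # 1. position from RFID tag (or None)
    annotation = lookup_annotation(tag_id) if tag_id else None
    yaw = read_yaw_deg() - current_drift_deg  # 2. drift-corrected orientation
    frame = capture_frame()
    if annotation is not None:                # 3. compose and display the overlay
        show(render_overlay(frame, annotation, yaw))

# Toy usage with stand-ins for the real sensors and display:
process_frame(
    read_tag=lambda: "TAG-0001",
    lookup_annotation=lambda t: {"text": "Campus entrance: go straight ahead"},
    read_yaw_deg=lambda: 92.5,
    current_drift_deg=2.5,
    capture_frame=lambda: "frame",
    render_overlay=lambda f, a, yaw: f"{a['text']} @ yaw {yaw:.1f} deg",
    show=print,
)
```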

2.3.1. Determination of the user's position

To acquire the user's position, the user reads an RFID tag placed at a guide plate or a branch point with the tag reader, as shown in Figure 5. The location ID embedded in the RFID tag is recognized, and the corresponding annotation information is selected from the database.

Figure 5. RFID tag and tag reader.

2.3.2. Estimation of drift error

The measurement data of an inertial sensor usually include an accumulating error called drift error, which is caused by temperature changes and other factors. In our system, the drift error is canceled by estimating the amount of drift from the images captured by the camera. The calculation process is as follows.

Step 1: In the first frame, natural features in the captured image are collected using the Harris interest operator [7], which detects corners and cross-points of edges. The positions of the collected features are recorded.

Step 2: In the second and later frames, every feature found in Step 1 is tracked by applying the Harris operator inside a search window placed around each feature point from the previous frame. This step is repeated until two consecutive frames are found in which the majority of feature positions are stable. When this condition is satisfied, we assume that the camera did not move during these two frames.

Step 3: When two stable consecutive frames are found, we assume that the user's head did not move between them. Therefore, we take the difference between the inertial sensor data of these frames to be a constant drift error.
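For illustration, the drift-estimation idea of Section 2.3.2 could be sketched as follows. This is an assumption-laden sketch, not the paper's implementation: it uses OpenCV's Harris-based corner detector, substitutes pyramidal Lucas-Kanade tracking for the paper's Harris-window search, and uses illustrative stability thresholds.

```python
# Sketch: detect Harris corners, check whether the majority of features are
# stable across two consecutive frames, and if so attribute the change in the
# inertial yaw reading to drift. Thresholds are illustrative, not the paper's.
import cv2
import numpy as np

STABLE_PIXELS = 1.0   # max feature motion (pixels) to call the camera "still"
STABLE_RATIO = 0.7    # fraction of features that must be stable

def detect_features(gray):
    """Detect corner features (Harris response) in a grayscale frame."""
    return cv2.goodFeaturesToTrack(gray, maxCorners=200, qualityLevel=0.01,
                                   minDistance=7, useHarrisDetector=True, k=0.04)

def camera_is_still(prev_gray, gray, prev_pts):
    """True if the majority of tracked features barely moved between frames."""
    if prev_pts is None:
        return False
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    ok = status.ravel() == 1
    if ok.sum() == 0:
        return False
    motion = np.linalg.norm((next_pts - prev_pts).reshape(-1, 2)[ok], axis=1)
    return (motion < STABLE_PIXELS).mean() >= STABLE_RATIO

def estimate_drift(yaw_prev_deg, yaw_curr_deg):
    """If the head is judged to be still, any change in the inertial yaw
    reading between the two frames is attributed to drift."""
    return yaw_curr_deg - yaw_prev_deg
```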

2.3.3. Annotation overlay image generation

This step is repeated while the user stays near the appointed position where an RFID tag is placed. The user's position is determined by the method described in Section 2.3.1, and the user's orientation is measured by the method described in Section 2.3.2.

3. Experiment

In this section, an experiment using the proposed wearable augmented reality system is described. We developed a navigation system with three tags placed at appointed positions at our school. The experiment was performed to guide a user coming from the nearest bus stop to our laboratory.

Figure 6 shows examples of annotations overlaid during the experiment: (a) an overlay at the main gate of our campus, (b) an annotation given at the entrance hall of our building, and (c) an annotation given in the hallway of the third floor, guiding the user to the Yokoya laboratory. As observed in Figure 6, annotations are correctly composed onto the real scene.

Figure 6. Annotation overlay images: (a) main gate, (b) entrance, (c) hallway.

Through the experiment, we found that the rough position estimation obtained from RFID tags is feasible for navigation guidance, mainly because a rough direction is sufficient to show a route to the user. The experimental results also show that the annotations are composed at the correct position in the image as the user's viewpoint moves.
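As a worked illustration of how an annotation can stay at the correct image position while the viewpoint rotates (this is not the paper's rendering code; a simple pinhole camera with an assumed field of view is used), the horizontal placement of an annotation can be computed from the drift-corrected yaw:

```python
# Sketch: map the angular difference between the annotation's world bearing and
# the user's (drift-corrected) yaw to a horizontal pixel position, assuming a
# pinhole camera; the FOV and image width are illustrative values.
import math

IMAGE_WIDTH_PX = 640
HORIZONTAL_FOV_DEG = 50.0

def annotation_x_position(annotation_bearing_deg, user_yaw_deg):
    """Return the horizontal pixel coordinate at which to draw the annotation,
    or None if it lies outside the camera's field of view."""
    # Signed angular offset, wrapped into (-180, 180].
    offset = (annotation_bearing_deg - user_yaw_deg + 180.0) % 360.0 - 180.0
    half_fov = HORIZONTAL_FOV_DEG / 2.0
    if abs(offset) > half_fov:
        return None
    focal_px = (IMAGE_WIDTH_PX / 2.0) / math.tan(math.radians(half_fov))
    return IMAGE_WIDTH_PX / 2.0 + focal_px * math.tan(math.radians(offset))

# Example: annotation due east (90 deg) while the user faces 80 deg.
print(annotation_x_position(90.0, 80.0))  # a little to the right of image centre
```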

4. IrDA marker based position detection

In this section, we propose an IrDA marker that can be used to expand the utility of the system described in the previous sections. When RFID tags are used to locate the user's position, the user must explicitly reach for a tag; in other words, if a tag is not in the proximity of the user, the system cannot determine where the user is located in the environment. To compensate for this weakness, we have developed an infrared beacon based on the IrDA protocol (an IrDA marker).

An IrDA marker actively emits its ID information using IrDA test frame packets, so the ID of an IrDA marker can be read with a photo detector and IrDA decoding software. By enhancing the infrared emitting power, the ID of a marker can be received several meters away from the marker. When multiple markers are located near the user, the system can be designed to receive the ID of a marker in a specific direction by using a directional photo detector. (Typically, a photo detector used for IrDA communication is directional, with a field of view of 30 to 60 degrees; the FOV can be adjusted with external optical parts such as lenses.)

We have built a prototype IrDA marker with an SX microcontroller (Ubicomp Inc.). Figure 7 shows a photograph of the prototype. Figure 8 shows an example of IrDA marker usage: the IrDA detector is placed on top of the HMD, and the user's approximate location is estimated from the ID of the marker.

Figure 7. Prototype of the IrDA marker.
Figure 8. An example of a system using IrDA markers (notebook PC with IrDA port, CCD camera, HMD, and IR markers).
Figure 9. Allocation of multiple IrDA markers around an IrDA detector.

The merits of using IrDA markers are:
1. Even if the user does not know the position of a marker, the user's position can be estimated as long as the IrDA beacon reaches the user;
2. By placing multiple markers as shown in Figure 9, the system may estimate the approximate orientation or direction of the user (see the sketch after this list); and
3. An ordinary handheld PC or PDA can also be used to download information from IrDA markers, which makes it possible to build a navigation guide system compatible with different kinds of mobile platforms.
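A toy sketch of the orientation idea in merit 2 (purely illustrative; the marker positions, IDs, and detector interface are assumptions, not the paper's design):

```python
# Sketch: approximate the user's facing direction from which marker IDs the
# directional photo detector currently receives, given known marker positions.
import math

# Hypothetical map of IrDA marker IDs to world positions (metres, room frame).
MARKER_POSITIONS = {
    "IRDA-01": (0.0, 5.0),
    "IRDA-02": (5.0, 0.0),
    "IRDA-03": (0.0, -5.0),
}

def estimate_bearing(user_xy, received_ids):
    """Mean bearing (degrees) from the user toward the currently received
    markers, taken as a rough estimate of the detector's facing direction."""
    bearings = []
    for marker_id in received_ids:
        if marker_id not in MARKER_POSITIONS:
            continue
        mx, my = MARKER_POSITIONS[marker_id]
        bearings.append(math.atan2(my - user_xy[1], mx - user_xy[0]))
    if not bearings:
        return None
    # Average the angles via unit vectors to handle wrap-around correctly.
    sx = sum(math.cos(b) for b in bearings)
    sy = sum(math.sin(b) for b in bearings)
    return math.degrees(math.atan2(sy, sx))

# Example: user near the origin, detector currently receives marker IRDA-02.
print(estimate_bearing((0.0, 0.0), ["IRDA-02"]))  # ~0 degrees (facing +x)
```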

On the other hand, the demerits of using IrDA markers are:
1. Under strong sunlight, it is difficult to read the ID from IrDA markers;
2. An electric power source and continuous transmission of the ID packets are required, which makes it difficult to reduce power consumption.

We are now developing registration and position localization algorithms using IrDA markers. Possible extensions of the use of IrDA markers are:
1. A stereo camera may be used to measure the distance to the IrDA markers;
2. If three or more markers are seen by the stereo camera, a more accurate user position can be estimated.

5. Conclusion

This paper has described an augmented reality system that overlays annotations at appointed points of the real scene with a wearable computer, giving the user information related to the real world. We have also developed a prototype wearable augmented reality system that uses an RFID tag and reader to obtain the user's position in the real world. The prototype system can produce annotation overlay images on the campus of our school. It is shown that a wearable augmented reality system with RFID tags is feasible and has the potential to be built into actual applications. In future work, we will investigate methods for overlaying annotations at arbitrary positions in the real world using a global positioning system (GPS).

References

[1] R. Azuma: A Survey of Augmented Reality, Presence, Vol. 6, No. 4, pp. 355-385, 1997.
[2] S. Julier, M. Lanzagorta, Y. Baillot, L. Rosenblum, S. Feiner, T. Höllerer, and S. Sestito: Information Filtering for Mobile Augmented Reality, Proc. Int. Symp. on Augmented Reality, pp. 3-11, 2000.
[3] M. Billinghurst, S. Weghorst, and T. Furness III: Wearable Computers for Three Dimensional CSCW, Proc. Int. Symp. on Wearable Computers, 1997.
[4] S. Feiner, B. MacIntyre, T. Höllerer, and A. Webster: A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment, Proc. Int. Symp. on Wearable Computers, pp. 74-81, 1997.
[5] K. Satoh, K. Hara, M. Anabuki, H. Yamamoto, and H. Tamura: TOWNWEAR: An Outdoor Wearable MR System with High-precision Registration, Proc. Int. Symp. on Mixed Reality, pp. 210-211, 2001.
[6] K. Satoh, M. Anabuki, H. Yamamoto, and H. Tamura: A Hybrid Registration Method for Outdoor Augmented Reality, Proc. Int. Symp. on Augmented Reality, 2001.
[7] C. Harris and M. Stephens: A Combined Corner and Edge Detector, Proc. Alvey Vision Conf., pp. 147-151, 1988.