Ubiquitous Home Simulation Using Augmented Reality

JAE YEOL LEE*, GUE WON RHEE, DONG WOO SEO, and NAM KI KIM
Department of Industrial Engineering, Chonnam National University
300 Yongbong-Dong, Buk-gu, 500-757, Gwangju, SOUTH KOREA

Proceedings of the 2007 WSEAS International Conference on Computer Engineering and Applications, Gold Coast, Australia, January 17-19, 2007

Abstract: The computing paradigm is moving toward ubiquitous computing, in which devices, software agents, and services are all expected to seamlessly integrate and cooperate in support of human objectives. Augmented reality (AR) can naturally complement ubiquitous computing by providing an intuitive and collaborative interface to a three-dimensional information space embedded within physical reality. This paper presents a context-aware immersive framework for simulating a ubiquitous home using augmented reality, which can simulate a rich set of ubiquitous services in a mixed ubiquitous home.

Key-Words: augmented reality, context awareness, mixed reality, ubiquitous computing, ubiquitous home simulation

1 Introduction

Ubiquitous computing is a vision of our future computing lifestyle in which computer systems seamlessly integrate into our everyday lives, providing services and information in an anywhere-and-anytime fashion [1,7,10]. Context-aware and ubiquitous systems are computer systems that can provide relevant services and information to users by exploiting contexts. By contexts, we mean information about locations, software agents, users, devices, and their relationships [3].

Augmented reality (AR), a variant of virtual reality, is considered an excellent user interface for ubiquitous computing applications because it allows intuitive browsing of location-referenced information [9]. In an AR environment, the user's perception of the real world is enhanced by computer-generated entities such as 3D virtual objects. Interaction with these entities occurs in real time, providing convincing feedback to the user and giving the impression of natural interaction. Thus, AR naturally complements ubiquitous computing by providing an intuitive and collaborative interface to a three-dimensional information space embedded within physical reality. Correspondingly, the human-computer interfaces and interaction metaphors originating from AR research have proven advantageous in a variety of real-world ubiquitous application scenarios such as simulation of virtual manufacturing [4], AR-enabled 3D collaboration [2], convergence of context awareness with augmented reality [8], and mobile augmented reality [11].

Although context-aware and ubiquitous computing is popular in various research areas, more sophisticated research is still needed that combines context-aware computing with natural and intuitive interfaces, such as augmented reality, for simulating ubiquitous services and supporting immersive interactions. The need for such capabilities is increasing rapidly, so a neutral framework or middleware should also be provided to support visualization and simulation of various ubiquitous applications.

This paper presents a context-aware immersive framework using augmented reality for supporting the simulation of various services and immersive interactions in a ubiquitous home. The framework offers the software infrastructure to acquire, interpret, and disseminate context information.
Further, it utilizes augmented reality for simulating relevant and immersive interactions in ubiquitous home design and evaluation by embedding virtual models onto physical models while considering contexts. Moreover, human interactions based on AR not only feed back to existing contexts but also generate new contexts, which realizes bi-augmentation between physical and virtual spaces.

The remainder of the paper is organized as follows. Section 2 overviews the proposed system. Section 3 presents how contexts are maintained and applied to augmented reality in ubiquitous home environments. Finally, Section 4 concludes with some remarks.
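The original paper includes no source code. Purely as an illustration of what acquiring, interpreting, and disseminating context information over a common data model could look like, the following Java sketch is added here; every class and method name in it is a hypothetical assumption for this example and is not taken from CAMUS or from the framework described in the paper.

    // Hypothetical sketch of a common context data model and its dissemination path.
    // Names are illustrative only; they are not part of the CAMUS middleware.
    import java.time.Instant;
    import java.util.List;
    import java.util.concurrent.CopyOnWriteArrayList;

    // One piece of context: which entity it concerns, which attribute, and its value.
    final class ContextFact {
        final String entity;     // e.g. "livingRoom.lamp" or "user.john"
        final String attribute;  // e.g. "location", "powerState"
        final String value;      // e.g. "kitchen", "on"
        final Instant timestamp = Instant.now();  // when the context was acquired

        ContextFact(String entity, String attribute, String value) {
            this.entity = entity;
            this.attribute = attribute;
            this.value = value;
        }
    }

    // Consumers (e.g. the AR interaction layer) subscribe to receive disseminated context.
    interface ContextListener {
        void onContext(ContextFact fact);
    }

    // Minimal broker: context acquired from sensors or AR interactions is published
    // to every registered listener; interpretation and reasoning would sit in between.
    final class ContextBroker {
        private final List<ContextListener> listeners = new CopyOnWriteArrayList<>();

        void subscribe(ContextListener listener) { listeners.add(listener); }

        void publish(ContextFact fact) {
            for (ContextListener l : listeners) {
                l.onContext(fact);
            }
        }
    }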

2 System Overview

The primary objective of this research is to propose a generic framework that supports the simulation of a ubiquitous home in mixed reality-based environments, as shown in Fig. 1. The framework is built on three layers: 1) the U-interface layer, 2) the U-context layer, and 3) the AR interaction layer.

The U-context layer maintains contexts from various resources such as devices, people, and the environment, and the U-context broker facilitates reasoning about and execution of those contexts. The U-context layer is based on CAMUS (Context-Aware Middleware for URC System), a middleware supporting the context awareness of ubiquitous services such as devices, sensors, and sobots (software robots) [6]. The U-interface layer supports bidirectional interactions between physical devices (or software components) and the U-context layer, so all devices can be easily registered, searched, and executed through the CAMUS-enabled broker. The AR interaction layer provides more realistic and human-oriented services using an AR technique. It is linked to the U-interface and U-context layers for context acquisition and reasoning as well as graphical information gathering and synchronization. Thus, the three-layered framework can support various kinds of ubiquitous services and interactions, such as context-aware adaptation to the environment and human-centered, AR-enabled interactions and simulations.

Fig. 1 CAMUS-enabled ubiquitous service framework

3 Context-Aware Simulation Using Augmented Reality

This section explains how contexts are managed and reasoned about to provide more relevant and ubiquitous services. It also discusses how augmented reality is utilized for executing context-aware interactions in the ubiquitous home.

3.1 CAMUS-based Context Management

The CAMUS-based middleware consists of two parts, as shown in Fig. 2: 1) build-time components for ubiquitous space modeling and 2) runtime components for task execution [6]. The build-time components are used for registering and managing physical sensors, ubiquitous services, environments, users, and tasks. For example, the sensor modeler offers the means for mapping sensors of the physical space into sensor services of the cyber space, extracting context information from the sensors, and supplying the information to the task engine.
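As a hedged illustration of the sensor modeler idea just described (a physical sensor mapped into a cyber-space sensor service that extracts context and supplies it to the task engine), a possible Java sketch follows. The interfaces and names are assumptions made for this example and do not reflect the actual CAMUS sensor modeler API.

    // Illustrative only: one way a physical sensor could be wrapped as a cyber-space
    // sensor service feeding the task engine. Not the actual CAMUS interfaces.
    interface PhysicalSensor {
        String read();                 // raw reading, e.g. "23.5" from a thermometer
    }

    interface TaskEngine {
        void assertFact(String entity, String attribute, String value);  // hands a fact to the rule engine
    }

    final class SensorService {
        private final String entity;        // real-world entity the sensor observes, e.g. "livingRoom"
        private final String attribute;     // context attribute it measures, e.g. "temperature"
        private final PhysicalSensor sensor;
        private final TaskEngine engine;

        SensorService(String entity, String attribute, PhysicalSensor sensor, TaskEngine engine) {
            this.entity = entity;
            this.attribute = attribute;
            this.sensor = sensor;
            this.engine = engine;
        }

        // Called periodically at runtime: extract context from the physical sensor
        // and supply it to the task engine for rule-based reasoning.
        void poll() {
            engine.assertFact(entity, attribute, sensor.read());
        }
    }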

The task modeler supports the modeling of context information specific to a task and the description of the rules necessary to perform the task. The built tasks are then executed by the runtime components. Among the runtime components, the task manager plays the main role in executing tasks: it initiates individual tasks and manages ongoing task processes. The task engine executes the actual tasks considering the situation. It has an inference engine to process the facts and rules supplied by a task; in CAMUS, JESS is applied as the inference engine [5].

Fig. 2 Build-time and runtime components of CAMUS

3.2 AR-based Simulation

The AR-based interaction broker consists of four major modules, as shown in Fig. 3: the U-context binding module, the U-interface binding module, the tracking module, and the rendering module [8]. Internally, the tracking module and rendering module support the AR application. The tracking module is based on a marker-based tracking technique and also supports multi-marker tracking; in this research, ARToolkit has been utilized [2]. The rendering module embeds 3D virtual representations of service and context information onto the physical-reality image synchronized by the tracking module. Externally, the U-context binding module and the U-interface binding module communicate with the U-context layer and the U-interface layer for context and service information retrieval and synchronization. The U-interface binding module receives virtual models from the U-interface broker, applies various interactions, and feeds the interactions back to the U-interface broker, which can modify the original model or generate new models. Similarly, the U-context binding module gets context information from the U-context layer and embeds the contexts into AR; it also feeds new contexts generated from AR interactions back to the U-context layer. Moreover, the U-context broker queries and reasons about contexts and sends the derived contexts to the U-context binding module, which again applies them to AR.
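To make the division of labor among the four modules more concrete, the following Java sketch outlines one possible per-frame loop of an AR-based interaction broker. It is only an assumed structure modeled on Fig. 3: marker tracking (ARToolkit in this work, a C library) is abstracted behind an interface, and all type and method names are hypothetical.

    // Assumed structure only; module boundaries follow Fig. 3, names are invented.
    import java.util.List;

    interface TrackingModule {
        List<String> detectMarkers(byte[] cameraFrame);          // ids of markers visible in this frame
    }

    interface RenderingModule {
        void overlay(String markerId, String item);              // draw a virtual model or context label at the marker pose
    }

    interface UContextBinding {
        String contextFor(String markerId);                      // derived context for the marked object, or null
        void feedback(String newContext);                        // push a context generated by an AR interaction
    }

    interface UInterfaceBinding {
        String virtualModelFor(String markerId);                 // virtual model registered for this marker
    }

    final class ARInteractionBroker {
        private final TrackingModule tracker;
        private final RenderingModule renderer;
        private final UContextBinding contexts;
        private final UInterfaceBinding interfaces;

        ARInteractionBroker(TrackingModule t, RenderingModule r,
                            UContextBinding c, UInterfaceBinding i) {
            this.tracker = t; this.renderer = r; this.contexts = c; this.interfaces = i;
        }

        // One simulation frame: detect markers, then embed the registered virtual
        // models and any derived context onto the physical camera image.
        void onFrame(byte[] cameraFrame) {
            for (String marker : tracker.detectMarkers(cameraFrame)) {
                renderer.overlay(marker, interfaces.virtualModelFor(marker));
                String ctx = contexts.contextFor(marker);
                if (ctx != null) {
                    renderer.overlay(marker, "context: " + ctx);
                }
            }
        }

        // A user manipulation of a virtual object (e.g. switching a virtual lamp on)
        // is fed back as a new context, realizing the bi-augmentation described earlier.
        void onUserInteraction(String markerId, String newContext) {
            contexts.feedback(markerId + ": " + newContext);
        }
    }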

Fig. 3 Modules of the AR interaction layer

Fig. 4 shows how the proposed approach can be applied to visualize and simulate a ubiquitous home. Typically, in many research works, a ubiquitous home is built, equipped with many ubiquitous devices, and tested by applying a variety of context-aware software components to them. However, in preliminary research it is quite difficult to modify and simulate dynamic objects and devices in the real environment, which limits realistic context-aware experiments; furthermore, re-building and modification of the home is costly. On the other hand, by combining the AR technique with the real environment, virtual models can be dynamically embedded into the physical environment, which can simulate real environments even when not all kinds of ubiquitous devices are installed. Moreover, the environment can easily be modified by introducing new dynamic objects and their interactions, which is very effective in the early stage of testing and proving concepts. To test this and verify its effectiveness, we constructed a miniaturized ubiquitous home, as shown in Fig. 4. Based on the implemented results, we found that ubiquitous environments can be much more realistic, interactive, and immersive if the AR technique is fully utilized.

Fig. 4 Simulation of ubiquitous services in the ubiquitous home (a)-(c)

4 Conclusion

This paper has presented how to simulate and visualize a ubiquitous home in mixed reality-based environments by supporting the convergence of context awareness and augmented reality. The framework provides a common data model for different types of context information from external sensors, applications, and users in the environment. It also offers the software framework to acquire, interpret, and disseminate context information. Further, it utilizes augmented reality for simulating a virtual ubiquitous home and immersive interactions by embedding virtual models onto physical models while considering contexts, which realizes bi-augmentation between physical and virtual spaces. In conclusion, the convergence can be very effectively utilized for: 1) seamless interaction between real and virtual environments, 2) providing context awareness, 3) presenting spatial cues for various kinds of interactions such as ubiquitous home simulation and collaboration, and 4) providing the ability to transition smoothly between reality and virtuality.

Acknowledgment: This research was supported in part by the 2nd Brain Korea 21 (BK21) Program.

References:
[1] Anhalt, J., Smailagic, A., Siewiorek, D.P., Gemperle, F., Salber, D., Weber, S., Beck, J., Jennings, J., Toward context-aware computing: experiences and lessons, IEEE Intelligent Systems, Vol. 16, No. 3, 2001, pp. 38-46.
[2] Billinghurst, M., Kato, H., Collaborative augmented reality, Communications of the ACM, Vol. 45, No. 7, 2002, pp. 64-70.
[3] Chen, H., Finin, T., Joshi, A., Kagal, L., Perich, F., Chakraborty, D., Intelligent agents meet the semantic web in smart spaces, IEEE Internet Computing, Vol. 8, No. 6, 2004, pp. 69-79.
[4] Doil, F., Schreiber, W., Alt, T., Patron, C., Augmented reality for manufacturing planning, Proc. of the Workshop on Virtual Environments 2003, 2003, pp. 71-76.
[5] Friedman-Hill, E., Jess Manual, Sandia National Laboratories, Livermore, CA, USA, 1997.
[6] Kim, H., Cho, Y.-J., Oh, S.-R., A middleware supporting context-aware services for network-based robots, IEEE Workshop on Advanced Robotics and its Social Impacts, Nagoya, Japan, 2005.
[7] Kindberg, T., et al., People, places, things: web presence for the real world, Mobile Networks and Applications, Kluwer Academic Publishers, 2002, pp. 365-376.
[8] Lee, J.Y., Seo, D.W., A context-aware and augmented reality-supported service framework in ubiquitous environments, EUC 2006, LNCS Vol. 3823, 2005, pp. 258-267.
[9] Schmalstieg, D., Reitmayr, G., The world as a user interface: augmented reality for ubiquitous computing, Proc. Central European Multimedia and Virtual Reality Conf., 2005.
[10] Suzuki, G., et al., u-photo: Interacting with pervasive services using digital still images, Pervasive 2005, LNCS Vol. 3468, 2005, pp. 190-207.
[11] Wagner, D., Pintaric, T., Ledermann, F., Schmalstieg, D., Towards massively multi-user augmented reality on handheld devices, Pervasive 2005, LNCS Vol. 3468, 2005, pp. 208-219.
[12] Wang, X., Dong, J.S., Chin, C.Y., Semantic space: an infrastructure for smart spaces, IEEE Pervasive Computing, Vol. 3, No. 3, 2004, pp. 32-39.