SIMULATION MODELING WITH ARTIFICIAL REALITY TECHNOLOGY (SMART): AN INTEGRATION OF VIRTUAL REALITY AND SIMULATION MODELING


Proceedings of the 1998 Winter Simulation Conference, D.J. Medeiros, E.F. Watson, J.S. Carson, and M.S. Manivannan, eds.

Hank Grant
School of Industrial Engineering
The University of Oklahoma
Norman, OK 73019-1016, U.S.A.

Chuen-Ki Lai
The SABRE Group
4200 Buckingham Rd
Fort Worth, TX 76155, U.S.A.

ABSTRACT

Simulation Modeling with Artificial Reality Technology (SMART) is a simulation modeling tool that provides a virtual reality interface for building graphical simulation models. The simulation models, comprised of nodes and arcs, are constructed in three dimensions. As the user builds a model, he may immerse himself in it using virtual reality hardware and software tools and take advantage of the three dimensional environment provided by SMART. Models built using SMART can be exported to AweSim and simulated as SLAM models (Pritsker and O'Reilly 1996).

The virtual reality hardware includes an electronic glove and a head-mounted display: specifically, the 5DT Glove (Fifth Dimension Technologies 1996) and the VIO I-Glasses (Virtual I-O 1995). By wearing the 5DT Glove, the user can navigate through virtual space and manipulate a three dimensional simulation model. The VIO I-Glasses let the user experience immersion in the virtual modeling world; the head-mounted device constantly responds to the motions of the user's head to reflect changes in his view of the virtual world.

With SMART, simulation practitioners are no longer restricted to building simulation models on a flat two dimensional surface. They can now build three dimensional simulation models with high non-planar complexity through the virtual reality interface provided by SMART.

1 INTRODUCTION

Virtual reality has been explored for several years. One of its initial pioneers was Jaron Lanier, who developed many of its basic concepts (Porter 1992). Due to the decreasing cost and increasing power of computers, virtual reality is being adopted in many industries, such as aviation (Longhurst 1995), medicine (Hollands and Trowbridge 1996), and manufacturing (Expert System 1995), to address key areas like training, design, and testing. The three dimensional visualization capability of virtual reality is the primary reason this technology is becoming the interface of the future for computing.

The software system SMART has been developed to explore the use of virtual reality in building simulation models. SMART also serves as a prototype for testing the feasibility of creating a virtual reality simulation modeling system on a relatively low-cost personal computer (PC). The simulation models built with SMART can be exported and then simulated by AweSim as SLAM models to analyze systems. AweSim is a general purpose simulation tool developed by Pritsker Corporation (Pritsker and O'Reilly 1996). AweSim and SLAM were chosen for this research, but the concepts of SMART may be applied to build any network-based simulation model. The minimum system configuration for SMART is a Pentium 200 MHz processor with 32 Mbytes of RAM and 8 Mbytes of video memory.

2 VIRTUAL REALITY HARDWARE

SMART offers a three dimensional interface using virtual reality hardware consisting of an electronic glove and a head-mounted display. The electronic glove, manufactured by Fifth Dimension Technologies, is the 5DT Glove. The head-mounted display is the VIO I-Glasses, manufactured by Virtual I-O.

2.1 5DT Glove

The 5DT Glove is the primary manual input device for SMART. The glove plugs into a PC serial port. Its configuration parameters, such as the bending angle of each finger and the pitch and roll of the wrist, are constantly sampled over the serial connection and sent to SMART for processing. The user controls SMART through a set of gestures that trigger actions while building simulation models. For example, the grasp gesture causes a SLAM node to be selected for editing. When a recognizable gesture is detected, SMART responds with the appropriate action and provides audio feedback confirming the action.

The glove's configuration and its relative position in the virtual world are continuously animated by a robot-like hand (Figure 1); every motion of the user's fingers is reflected by the animated hand in the virtual world.

Figure 1: Animated Hand for 5DT Glove.

The basic actions controlled by the 5DT Glove are:

- Navigating (flying) through the virtual world;
- Grasping and repositioning simulation nodes;
- Adding text and simulation nodes to the virtual world model;
- Editing existing text and simulation nodes;
- Connecting simulation nodes with activities; and
- Transferring from one virtual world to another instantaneously.
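As a rough illustration of the gesture-driven control described above, the sketch below classifies one set of sampled finger-bend values into a gesture and dispatches a corresponding modeling action. The structure names, thresholds, and the stand-in sample are assumptions made for illustration; they are not the actual 5DT driver interface or the SMART implementation.

    // Minimal sketch: classify a glove sample into a gesture and act on it.
    // Assumed types and thresholds; not the actual SMART or 5DT code.
    #include <array>
    #include <cstdio>

    struct GloveSample {
        std::array<double, 5> bend;  // thumb..little finger, 0 = straight, 1 = fully bent
        double wristPitchDeg;
        double wristRollDeg;
    };

    enum class Gesture { None, Grasp, Point };

    Gesture classify(const GloveSample& s) {
        bool allBent = true;
        for (double b : s.bend) allBent = allBent && (b > 0.7);
        if (allBent) return Gesture::Grasp;                       // closed fist: select a node
        bool indexStraight = s.bend[1] < 0.2;
        bool othersBent = s.bend[2] > 0.7 && s.bend[3] > 0.7 && s.bend[4] > 0.7;
        if (indexStraight && othersBent) return Gesture::Point;   // index extended: fly
        return Gesture::None;
    }

    void dispatch(Gesture g) {
        switch (g) {
            case Gesture::Grasp: std::puts("node selected for editing"); break;
            case Gesture::Point: std::puts("navigating toward pointed direction"); break;
            default:             break;
        }
    }

    int main() {
        // Stand-in for one sample; in the real system samples arrive continuously
        // over the serial connection described above.
        GloveSample sample{ {0.9, 0.85, 0.8, 0.9, 0.95}, 10.0, -5.0 };
        dispatch(classify(sample));
    }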

2.2 VIO I-Glasses

In addition to the 5DT Glove, SMART uses the VIO I-Glasses as a second virtual reality hardware interface, primarily to enhance immersion in the simulation model. The device plugs into a PC serial port. The VIO I-Glasses are designed to give the user the impression that he or she is physically present in the virtual modeling world. This is accomplished by providing a virtual view that reacts directly to two primary head motions: pitching (Figure 2) and yawing (Figure 3). Pitching is equivalent to nodding the head up and down; yawing is swinging the head left and right. In SMART, the ranges for pitching and yawing are ±60° and ±360°, respectively, and the user may pitch and yaw simultaneously.

Figure 2: VIO I-Glasses Pitching (Virtual I-O 1995).

Figure 3: VIO I-Glasses Yawing (Virtual I-O 1995).

To build simulation models with the I-Glasses, the user simply puts on the VIO I-Glasses and looks around as he would in the real world. The current pitching and yawing angles are continuously sampled, and this data is used to compute the orientation of the virtual view. As the user's orientation changes, the virtual world is rendered accordingly.
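The view computation just described amounts to limiting pitch to its ±60° range, letting yaw run through ±360°, and rendering from the resulting orientation. The sketch below is a minimal illustration under those assumptions; the tracker-reading call and the angle handling are stand-ins, not the VIO driver interface.

    // Minimal sketch: map sampled head angles to a view orientation.
    // readTracker() is an assumed stand-in for the serial-port sampling.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct HeadSample      { double pitchDeg, yawDeg; };
    struct ViewOrientation { double pitchDeg, yawDeg; };

    HeadSample readTracker() { return {72.0, 415.0}; }   // placeholder raw angles

    ViewOrientation toView(const HeadSample& h) {
        ViewOrientation v;
        v.pitchDeg = std::clamp(h.pitchDeg, -60.0, 60.0);  // nodding range
        v.yawDeg   = std::fmod(h.yawDeg, 360.0);           // wrap full turns
        return v;
    }

    int main() {
        ViewOrientation v = toView(readTracker());
        std::printf("render scene at pitch %.1f deg, yaw %.1f deg\n",
                    v.pitchDeg, v.yawDeg);
    }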

3 MENUS

SMART has five categories of menus that specify its actions; most of these commands can also be executed with specific gestures on the 5DT Glove. The menus are File, Edit, View, Glove, and Help.

3.1 File

The File menu allows the user to save and open three dimensional simulation models stored in SMART format. Two important commands in the File menu are Export Network and Export Control. Using these commands, the user can convert and export the simulation models built in SMART into statements that can be processed and simulated by the AweSim/SLAM simulator.

3.2 Edit

The Edit menu allows the user to add new three dimensional Network and Control nodes to the virtual modeling world. Commands in this menu edit the properties of each node, such as its parameters and color. In addition, the Text command adds three dimensional text for visually documenting the simulation models.

3.3 View

Using the View menu, the user can show or hide the window's status bar and can instantaneously switch between the two virtual worlds provided in SMART. One world is used to build the network simulation models, using three dimensional versions of SLAM nodes. The other world is used to build the simulation control model, where general model parameters are specified.

3.4 Glove

The Glove menu consists of a single command, Modify, which brings up a dialog offering four options for the 5DT Glove:

- Loading an existing user setting;
- Defining and calibrating a new user setting;
- Viewing/changing the current user setting; and
- Deleting an existing user setting.

Because the dimensions of every individual's hand are different, each user must calibrate the settings of the 5DT Glove so that he or she receives the most accurate responses from the electronic glove. Users can save these settings and recall them later.

3.5 Help

The Help menu lets the user read general information about SMART.
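To make the Export Network idea from Section 3.1 concrete, the sketch below walks a node-and-arc model and writes one line-oriented statement per node and per connecting activity. The data structures and the statement format are illustrative assumptions only; they are not the actual SMART data model or the AweSim/SLAM statement syntax.

    // Minimal sketch: serialize a 3D node-and-arc model as text statements.
    // Node types and statement layout are hypothetical.
    #include <cstdio>
    #include <string>
    #include <vector>

    struct Node {
        int id;
        std::string type;    // e.g. "CREATE", "QUEUE", "TERMINATE"
        std::string params;  // parameters entered through the Edit menu
        double x, y, z;      // 3D position used for display only, dropped on export
    };

    struct Arc { int from, to; std::string label; };  // an activity joining two nodes

    void exportNetwork(const std::vector<Node>& nodes,
                       const std::vector<Arc>& arcs,
                       std::FILE* out) {
        for (const Node& n : nodes)
            std::fprintf(out, "%s,%d,%s;\n", n.type.c_str(), n.id, n.params.c_str());
        for (const Arc& a : arcs)
            std::fprintf(out, "ACTIVITY,%d,%d,%s;\n", a.from, a.to, a.label.c_str());
    }

    int main() {
        std::vector<Node> nodes = {
            {1, "CREATE",    "10.0", 0.0, 8.0, 0.0},   // arrivals every 10 minutes
            {2, "QUEUE",     "",     0.0, 4.0, 0.0},
            {3, "TERMINATE", "",     0.0, 0.0, 0.0}};
        std::vector<Arc> arcs = {{1, 2, "route"}, {2, 3, "service"}};
        exportNetwork(nodes, arcs, stdout);
    }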

4 EXAMPLE APPLICATIONS

To illustrate the use of SMART, a product distribution center is modeled. It has two levels: top and bottom. Items arrive at the top level of the distribution center every 10 minutes. Both the top and the bottom level consist of a waiting area and three work stations. On each level, the work stations are laid out in a triangle with the waiting area located in the middle; the waiting area stores incoming items while the work stations are busy. Items flow from the top level to the bottom level before exiting the distribution center (Figure 4).

Figure 4: System Schematic of Distribution Center Model.

Because of the nature of the distribution facility, items can either circulate around the top level or move on to the lower level after they have been serviced by the work stations. A three dimensional simulation model was built using SMART to illustrate the physical three dimensional flow of the system; the model is shown in Figures 5, 6, and 7. Items enter at the top of the model through the CREATE node (Figure 6). They then progress to the first layer, where they may wait at the QUEUE node or circulate to the various processing stations. After processing on level 1, they move to the lower level for similar processing (Figure 7). Finally, they exit the system through the TERMINATE node at the bottom of the model.

Figure 5: Distribution Center Model - SMART View 1.

Figure 6: Distribution Center Model - SMART View 2.

Figure 7: Distribution Center Model - SMART View 3.

The standard two dimensional model, shown in Figure 8, is provided for comparison purposes. The non-planar nature of this network model makes it more difficult to comprehend in two dimensions.

Figure 8: Distribution Center Model - 2D View.

Using its three dimensional capabilities, SMART visually presents the physical flow of the simulation model: the work stations are modeled in a triangular arrangement joined in the middle where the waiting area is located, and the top and bottom levels of the distribution center are clearly distinguished. It is difficult to show the actual physical flow of the distribution center using the two dimensional simulation model.

The user can apply the capabilities of SMART to build these models and interact with them. He can fly through the three dimensional model as if it actually exists, and he can select various symbols to edit and construct the model to reflect the layout of the physical system or to address organization and presentation concerns in the model's layout.
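As a small illustration of the layout described in this example, the sketch below computes a triangular placement of three work stations around a central waiting area on each of the two levels. The coordinates, spacing, and level heights are hypothetical and are not taken from the SMART model.

    // Minimal sketch: place three work stations 120 degrees apart around a
    // central waiting area, on a top and a bottom level. Values are illustrative.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double pi = 3.14159265358979;
        const double radius = 5.0;                 // waiting area to station distance
        const double levelHeight[2] = {8.0, 0.0};  // top level above bottom level

        for (int level = 0; level < 2; ++level) {
            std::printf("Level %d waiting area at (0.0, %.1f, 0.0)\n",
                        level + 1, levelHeight[level]);
            for (int k = 0; k < 3; ++k) {
                double angle = 2.0 * pi * k / 3.0;   // 120 degrees apart
                double x = radius * std::cos(angle);
                double z = radius * std::sin(angle);
                std::printf("  station %d at (%.1f, %.1f, %.1f)\n",
                            k + 1, x, levelHeight[level], z);
            }
        }
    }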

5 CONCLUSIONS

Several objectives have been accomplished in developing SMART. First, SMART allows the user to build complex simulation models through a virtual reality interface and, more importantly, to interact with those models using virtual reality hardware such as the 5DT Glove and the VIO I-Glasses. Second, SMART demonstrates the feasibility of using virtual reality to build simulation models on a low-cost PC. The software tools used to develop SMART significantly optimize the graphical rendering process, which only a few years ago could be accomplished only on a powerful, expensive graphics workstation.

6 FURTHER RESEARCH

Several ideas for further research emerged from the creation of SMART. First, a stereoscopic view can be added as an option for the VIO I-Glasses to make the three dimensional symbols more realistic. Second, an additional left glove should be included in the hardware interface to enhance usability and provide more interaction options, such as virtual keyboards and two-handed gestures. Third, keyboard data entry in SMART should be replaced by more efficient methods such as voice recognition input; even though the user can still see the keyboard while wearing the VIO I-Glasses, he or she must turn back toward the keyboard to type, whereas voice input lets the user enter data while speaking. Next, the capabilities of SMART should be extended to provide three dimensional animation, with tools for building an animation driven by a simulation model in the virtual modeling world in addition to tools for interacting with an executing simulation model. Finally, the orientation and configuration of the data glove should be illustrated graphically during the calibration of each gesture; the illustrations should include a picture of the appropriate gesture as well as an animation of the gesture the user is actually making, to confirm that the calibration is accurate.

REFERENCES

Expert System. 1995. Virtual reality application. Vol. 12, No. 2, pp. 174-175.

Fifth Dimension Technologies. 1996. 5DT Glove - User's Manual. Pretoria.

Hollands, R.J., and E.A. Trowbridge. 1996. A PC-based virtual reality arthroscopic surgical trainer. In Simulation in Synthetic Environments 1996, The Society for Computer Simulation, San Diego, Vol. 28, No. 2, pp. 17-22.

Longhurst, C. 1995. Event-driven visual effects in flight simulation virtual environments. In Virtual Reality Applications, Academic Press, San Diego, pp. 231-244.

Porter, S. 1992. Interview: Jaron Lanier. Computer Graphics World, Vol. 14, No. 4, pp. 61-70.

Pritsker, A.A.B., and J.J. O'Reilly. 1996. AWESIM: The integrated simulation system. In Proceedings of the 1996 Winter Simulation Conference, ed. J.M. Charnes, D.M. Morrice, D.T. Brunner, and J.J. Swain, 8-11. Piscataway, New Jersey: Institute of Electrical and Electronics Engineers.

Virtual I-O. 1995. VIO I-Glasses - User's Manual. Seattle.

AUTHOR BIOGRAPHIES

HANK GRANT joined the faculty of the University of Oklahoma in December of 1993 as Director of the School of Industrial Engineering and Southwestern Bell Professor. He is also the founder of the Center for the Study of Wireless Electromagnetic Compatibility. Prior to joining the University of Oklahoma, Dr. Grant was with the National Science Foundation in Washington, D.C., where he directed programs in Production Systems, Engineering Design, and Operations Research. Before that, he was Director of the Measurement and Manufacturing Systems Laboratory at Hewlett-Packard's HP Labs, where he was responsible for developing the five- to ten-year vision of HP's manufacturing requirements as well as a new instrument system architecture for the company. Before joining HP, Dr. Grant was involved in the startup and development of two industrial engineering software businesses, Pritsker Corporation and FACTROL, where he was actively involved in simulation language design and development.

CHUEN-KI LAI received his Bachelor's (Special Distinction) and Master's degrees in Industrial Engineering from the University of Oklahoma in 1996 and 1998, respectively. He is currently a consultant at the SABRE Group, where he is responsible for providing information technology solutions for the travel and transportation industry.