DEVELOPMENT OF VIRTUAL REALITY TRAINING PLATFORM FOR POWER PLANT APPLICATIONS

MultiScience - XXX. microCAD International Multidisciplinary Scientific Conference, University of Miskolc, Hungary, 21-22 April 2016, ISBN 978-963-358-113-1

DEVELOPMENT OF VIRTUAL REALITY TRAINING PLATFORM FOR POWER PLANT APPLICATIONS

Róbert Beleznai (1), Szabolcs Szávai (2), Gergely Dobos (3)
(1) PhD, leading researcher, Bay Zoltán Nonprofit Ltd., Engineering Division
(2) PhD, head of department, Bay Zoltán Nonprofit Ltd., Engineering Division
(3) research fellow, Bay Zoltán Nonprofit Ltd., Engineering Division

ABSTRACT: Developing a training platform for power plant applications with virtual reality (VR) tools supports safe operation. The VR platform is a computer-generated artificial environment in which the user can perform activities, interact with different objects, and perceive the surroundings as realistic. Such an immersive, realistic VR world can be applied in the energy sector, especially for maintenance and for operation under normal and emergency conditions, to train staff in the appropriate actions. To this end, usability tests of recently available devices were performed, followed by the development of an industrial application. With this power plant application, maintenance training can be carried out without disturbing the operating schedule of the plant and without exposing the user to unnecessary danger (high temperature, radiation, etc.).

Keywords: virtual reality, Oculus Rift, LEAP Motion, Virtualizer, maintenance, Unity3D

1. INTRODUCTION

Virtual reality is an artificial environment produced by computer systems and information technologies, in which the user, through a virtual identity, can interact in cyberspace. A great advantage of a virtual training platform is that it is separated from the real world in space and time; it therefore does not disturb normal work operations, and there is no need to physically build expensive training areas.

The objective is to develop a virtual, perception-based event-simulation training platform for training people in the maintenance of power plant equipment. Using virtual space in the design, testing and maintenance of new or existing facilities of the energy sector can enhance safe operability, maintainability and the handling of critical situations through early error detection. The platform provides the user with a realistic and immersive experience. The system is interactive, so the user's actions affect the VR environment and can be linked to real-world actions. It is important that the work can be done in a safe environment without emergency situations (the user is not exposed to heat, radiation, high voltage, oxygen deficiency or other hazards), and that the system is portable.

DOI: 10.26649/musci.2016.098

2. DEVELOPMENT OF VIRTUAL POWER PLANT ENVIRONMENT

Development of the virtual training platform requires the design of a virtual power plant environment, which includes digital modelling of the facilities and equipment using 3D scanning or CAD software. The polygon models produced by the modelling software are then combined with the physical properties of the modelled objects in the Unity3D game engine. The flexibility of the fictional space makes it possible to test several elements of the system and to train the fine motor skills of the maintenance staff.

Initially, some fundamental interactions, such as opening a door, turning the lights on and off, or proportionally controlling the speed of a fan (a minimal sketch of such proportional control is given at the end of this section), were used to test the potential applicability of virtual reality to industrial applications. Later, the focus shifted to the maintenance of a gas turbine located in the middle of a hangar (Fig. 1).

Figure 1: Virtual environment

For maintenance tasks manual work is emphasised; however, spatial movement is also handled in a natural way. Spatial movement becomes very significant when maintenance requires moving between various places in the plant and learning to navigate the workspace. The user can walk with his/her own legs in the assembly shop or kneel next to the equipment using the Virtualizer motion detection system.

Within the virtual environment it is necessary to define a character that is able to interact with its surroundings. This character is the avatar of the user, whose movements are controlled by the user's motion in the real world: the user's real-world movements are transformed into the movements of the avatar in the virtual space. It is essential for the immersive experience that the user can control the avatar and interact with the same natural movements he/she uses in real life. Lifting a gear requires the fingers; however, if the weight of the object exceeds the limit of one-handed work, the technician must intervene with both hands. Such movements must happen in the same way in the virtual space as in reality; realistic physical motion is essential.
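
A minimal sketch of the proportional fan control mentioned above, written as a Unity C# script. The class and object names are illustrative, and mapping the height of the virtual hand to the fan's rotation speed is our own assumption, since the paper does not specify which hand parameter drives the fan:

using UnityEngine;

// Illustrative sketch (hypothetical names): the fan speed is controlled proportionally,
// here by mapping the height of the virtual hand above a reference level to a rotation speed.
public class ProportionalFanControl : MonoBehaviour
{
    public Transform hand;             // virtual hand driven by the motion sensor
    public Transform fanBlades;        // rotating part of the fan
    public float minHeight = 1.0f;     // hand height (m) for 0% speed
    public float maxHeight = 1.8f;     // hand height (m) for 100% speed
    public float maxDegPerSec = 720f;  // blade speed at 100%

    void Update()
    {
        // Normalised control signal in [0, 1], proportional to hand height.
        float t = Mathf.InverseLerp(minHeight, maxHeight, hand.position.y);
        fanBlades.Rotate(Vector3.forward, t * maxDegPerSec * Time.deltaTime);
    }
}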

Solving this problem requires the combination and joint application of several recently available VR tools to achieve the best and most realistic result (Fig. 2). These tools are the following: the Oculus Rift (OR) head-mounted display for visualising the 3D environment in a realistic, stereoscopic way; the Cyberith Virtualizer (CV) motion control device for detecting the body movement of the user; and the Leap Motion (LM) non-contact optical motion sensor for detecting the fine motor movements of the user.

The aim is that the movements of a specific task in the virtual space should be equivalent to the movements in real space [1], so that no conditioned reflexes are built into the activities that could be detrimental to the work of the staff during the actual execution of the task.

For practising assembly work, precise and latency-free (real-time) motion detection is essential. Many different devices are available on the market; for our application the LEAP Motion provides the best solution, as its small, non-contact optical sensor can be fixed anywhere and does not restrict the free movement of the user. A further advantage of the LM is that the provided SDK (software development kit) allows great freedom for software development in the field of image processing for the two-camera hardware.

Figure 2: Combination of VR tools - walking in the virtual assembly shop

The heart of maintenance work is indisputably fine motor movement, which is captured with the two cameras of the LEAP Motion sensor. The advantage of the sensor is also its disadvantage: as an optical sensor it does not require a direct connection with the user, so the non-contact design gives the mechanic freedom and flexibility of movement during training; however, the sensor can only detect movements that are visible to its cameras (obscured motions cannot be detected - blind spots). As the hardware itself cannot be improved, an algorithm for movement recognition was developed to work around this problem. It infers typical movement schemes from the obtained data and automates the smaller joint operations in such a way that the difficulty of the task is not reduced and the user does not notice the assistance. Incomplete data from the sensors are replaced by movement schemes borrowed from real life in order to make the operations smoother.
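
The fine motor data above come from the LM's own SDK. The following is a minimal sketch of how fingertip positions can be polled inside a Unity script; the class and property names (Controller, Frame, Hands, Fingers, TipPosition) follow the classic Leap Motion V2 C# SDK as we recall it, and the millimetre-to-metre conversion and axis mapping are simplified assumptions:

using UnityEngine;
using Leap;   // classic Leap Motion V2 C# SDK, assumed to be imported into the Unity project

// Minimal sketch: poll the Leap Motion controller every frame and convert
// fingertip positions into Unity world coordinates (simplified mapping).
public class FingertipReader : MonoBehaviour
{
    private Controller leap;

    void Start()
    {
        leap = new Controller();
    }

    void Update()
    {
        Frame frame = leap.Frame();                        // most recent tracking frame
        foreach (Hand hand in frame.Hands)
        {
            foreach (Finger finger in hand.Fingers)
            {
                Leap.Vector tip = finger.TipPosition;      // millimetres, Leap coordinate frame
                Vector3 tipUnity = new Vector3(tip.x, tip.y, -tip.z) * 0.001f;  // rough mm -> m, handedness flip
                Debug.Log("fingertip at " + tipUnity);
            }
        }
    }
}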

In assembly, if the magnitude of the trembling of our hands is greater than the size of the bore into which we want to insert a bolt, the operation will most probably fail and the bolt will land on the ground (Fig. 3).

Figure 3: Fine motor motion simulation for precise operation using virtual hands

Since a similar problem can also occur in virtual reality because of hardware deficiencies, a solution is required that intervenes only when strictly necessary, so as not to reduce the realism of the operation. In our interpretation, during assembly operations the absolute value of the change of the fingertip position vectors over one second characterises the trembling of the hands that makes the task difficult. This scalar value, between 0 and 1, then defines the relative distance at which components are automatically inserted into their position. The importance of this step should be highlighted: the algorithm, considering the actual environment variables, looks for the smallest distance at which the user needs assistance. The event manager provides only the assistance that is necessary because of the hardware deficiencies.

With a method developed for measuring the grip force, objects can be gripped not only from the top, and the orientation of larger objects can also be controlled during assembly. In contrast to digital gloves, the LEAP Motion models the virtual movement of the hand in a non-contact way; it does not use resistance-based kinematic sensors. This is an advantage in many respects, but it also poses a serious problem: when describing the physical properties of the hands, not only the marker-based coordinates but also the pressure/force should be taken into account. When the colliders of the fingers reach the polygon surface of an object, a subordinate script resets its absolute position based on the diameter of the widest point of the object. This value is then interpreted as the zero point, and the positive and negative differences from it give the numerical value of the grip force. If the measured value is 100%, the gravitational force acting on the object is reduced to zero, and the bolt remains in the same position as when it was first gripped and lifted [7].
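
The following is a minimal Unity C# sketch of the tremor-based snap assistance described above, under our own naming; the class names, the one-second sampling window, the normalisation and the 5 cm maximum assist radius are illustrative assumptions rather than the exact scripts used in the platform:

using UnityEngine;

// Illustrative sketch (hypothetical names): estimates hand tremor from fingertip motion
// over the last second and uses it to scale the distance at which a part snaps into place.
public class AssemblyAssist : MonoBehaviour
{
    public Transform[] fingertips;     // virtual fingertip transforms driven by the LEAP Motion
    public Transform bolt;             // part currently held by the user
    public Transform boltSocket;       // target position of the part

    private Vector3[] lastPositions;
    private float tremor;              // scalar in [0, 1]

    void Start()
    {
        lastPositions = new Vector3[fingertips.Length];
        for (int i = 0; i < fingertips.Length; i++)
            lastPositions[i] = fingertips[i].position;
        InvokeRepeating(nameof(SampleTremor), 1f, 1f);    // evaluate once per second
    }

    void SampleTremor()
    {
        float sum = 0f;
        for (int i = 0; i < fingertips.Length; i++)
        {
            sum += (fingertips[i].position - lastPositions[i]).magnitude;
            lastPositions[i] = fingertips[i].position;
        }
        tremor = Mathf.Clamp01(sum / fingertips.Length);  // normalised tremor estimate
    }

    void Update()
    {
        // The snap distance scales with the measured tremor: the shakier the hands,
        // the earlier the event manager assists and inserts the part into its socket.
        float snapDistance = tremor * 0.05f;              // assumed maximum assist radius of 5 cm
        if ((bolt.position - boltSocket.position).magnitude < snapDistance)
        {
            bolt.position = boltSocket.position;
            bolt.rotation = boltSocket.rotation;
        }
    }
}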

Let us now consider some fundamental interactions in order to understand how the brain perceives the virtual environment. To switch on the light, a trigger box is defined around the mesh collider of the animated switch; this avoids physical contact between the two objects (hand and switch), reduces the number of errors and improves the user experience. On interaction (the light being switched on), the clicking sound of the mechanism and the animated switch together provide such audio-visual feedback that the human brain links the events together and perceives them naturally, without any actual physical contact, i.e. without the fingers touching the surface of the switch (Fig. 4).

Figure 4: Switching the light on and off

This simple interaction is an excellent example of our response to the largest current problem of virtual reality: most of the available VR hardware is in its infancy, so many clumsy solutions appear because of inaccurate sensors and incorrectly mapped movements, which inevitably detract from the user experience. This makes the realism of the simulation questionable for the user, yet immersion is the very basis of the raison d'être of virtual reality [2]. Software development can be the way around the hardware deficiencies determined by the market, based on a simple thesis: what our brain believes is true.

The task of opening a door answers other general questions: in this experiment (in sharp contrast to the previous example), a door can be opened using the purely physical interaction of two colliders, so the door simply has to be pushed with the hands until it opens. This is important because here haptic feedback plays an important role in reality: from this feedback the user feels how much force is needed to open the door [3]. Haptic feedback is not yet available in virtual reality; however, the brain is able to estimate and sense it based on visual cues.
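
A minimal sketch of such a trigger-box interaction in Unity C#, with hypothetical object names; the audio clip, the animator parameter and the "Hand" tag are illustrative assumptions:

using UnityEngine;

// Illustrative sketch (hypothetical names): a trigger volume around the switch toggles
// the light and plays a click, so no physical hand-switch collision is needed.
[RequireComponent(typeof(BoxCollider))]
public class LightSwitchTrigger : MonoBehaviour
{
    public Light roomLight;            // light source controlled by the switch
    public Animator switchAnimator;    // plays the flipping animation of the switch
    public AudioSource clickSound;     // mechanical click for audio feedback

    void Reset()
    {
        GetComponent<BoxCollider>().isTrigger = true;    // trigger box, not a solid collider
    }

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Hand")) return;           // react only to the virtual hand

        roomLight.enabled = !roomLight.enabled;          // toggle the light
        switchAnimator.SetBool("On", roomLight.enabled); // visual feedback: the switch flips
        clickSound.Play();                               // audio feedback: mechanical click
    }
}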

3. VIRTUAL ASSEMBLY

The three interconnected tools (OR, CV, LM) together form a scheme of reality; their digital outputs are combined into one character (avatar) to generate the complex motion of the user's avatar. The system of connected devices transforms real-world movement into the interaction between this complex character and the virtual space.

In the hangar a gas turbine is placed, on which the assembly training can be practised [4] (Fig. 5). First, the cover of the opening of the turbine casing can be taken down, giving the mechanic free access and creating the possibility for interaction. The gearbox can then be completely disassembled in the same logical order as in reality. The gear can be taken out of the lower gearbox only with two hands (lower picture in Fig. 5): in this case a script checks, based on the hierarchy states, that two colliders are in contact with the gear at the same time. If both hands are in contact with the object, the item moves together with the left hand and can be removed. During assembly the components can snap into their position from a relatively small distance.

Figure 5: Assembly of a gear using two hands in virtual reality
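
A minimal sketch of this two-hand check in Unity C#; the tag names and the decision to parent the gear's motion to the left hand while both hands touch it follow the description above, but all identifiers are illustrative assumptions:

using UnityEngine;

// Illustrative sketch (hypothetical names): the heavy gear can only be moved
// while colliders of both hands are in contact with it; it then follows the left hand.
public class TwoHandGrab : MonoBehaviour
{
    public Transform leftHand;                 // virtual left hand driven by the LEAP Motion
    private bool leftTouching, rightTouching;
    private Rigidbody body;

    void Start()
    {
        body = GetComponent<Rigidbody>();
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("LeftHand"))  leftTouching  = true;
        if (other.CompareTag("RightHand")) rightTouching = true;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("LeftHand"))  leftTouching  = false;
        if (other.CompareTag("RightHand")) rightTouching = false;
    }

    void Update()
    {
        bool bothHands = leftTouching && rightTouching;
        body.isKinematic = bothHands;          // suspend physics while the gear is carried
        if (bothHands)
        {
            // While both hands hold the gear, it moves together with the left hand.
            transform.position = leftHand.position;
        }
    }
}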

4. DISCUSSION

The LM is an optical sensor [5]; therefore, the possible error factors can increase significantly. Infrared background radiation reduces the contrast detected by the sensor, so distinguishing the fingers from the background becomes more difficult for the image-analysis algorithm. Further problems are the blind spots and the small interaction area, which does not cover the whole area of the activities, so the user's hands can move out of the range of the sensor. The latter problem can be solved by mounting the LM sensor on the flange of the rotating element of the Virtualizer, so that the user's free movement is not restricted and the hands remain in the interaction area while he/she turns. An alternative solution is to mount the LM sensor on the front panel of the OR, since the user follows the assembly with his/her eyes during manual work. However, because of the movement of the head, the hand data from the LM become distorted, often to the point of unusability [6]. Therefore an absolute reference system is used, based on the data of the accelerometer and gyroscope of the OR (IMU module): the vector values of the OR movements are added to the values measured by the LM from the image analysis of the hands, so that the two compensate each other and head movements do not affect the position of the hands in the virtual space (a minimal sketch of this compensation is given at the end of this section). After this, all movements are expressed as absolute values in the resulting space. To solve the problems of blind spots due to the viewing angle and of false signals, two LM sensors can be connected [7]; however, their connection generates further problems for which developments are in progress.

5. CONCLUSIONS

Our goal is to increase the safety of the trainee, to improve the efficiency of the training method and to reduce the costs spent on training. The pilot application combines spatial and fine motor movement using combined hardware and software tools, and the system provides a realistic and interactive virtual space for training. It is possible to walk in a room or on the territory of the power plant, switch the light on and off, open and close doors, control the fan speed and study the assembly work of a turbine. The user can be trained in a realistic power plant environment without disturbing the on-site work and without being exposed to unnecessary danger. This pilot application proves the applicability of virtual reality in industry; virtual reality is a suitable alternative for creating a versatile industrial training system.

6. ACKNOWLEDGEMENT

The research has been funded by the EITKIC_12-1-2012-0008 project in the frame of the Grant Program for Support of the Hungarian participation in the Knowledge and Innovation Communities (EIT KIC).
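
As a supplement to Section 4, the following Unity C# sketch illustrates the kind of compensation described there: hand positions reported by a head-mounted LM in head-local coordinates are transformed by the OR head pose, so head movement does not displace the virtual hands. All identifiers are illustrative assumptions, not the platform's actual scripts:

using UnityEngine;

// Illustrative sketch (hypothetical names): converts hand positions reported by a
// head-mounted Leap Motion (head-local coordinates) into world coordinates using
// the Oculus Rift head pose, so head movement does not displace the virtual hands.
public class HeadMotionCompensation : MonoBehaviour
{
    public Transform headPose;        // driven by the OR IMU (position and orientation)
    public Transform virtualHand;     // hand object placed in the virtual space

    // Called with the hand position measured by the LM relative to the headset.
    public void UpdateHand(Vector3 handLocalToHead)
    {
        // World position = head position + head rotation applied to the LM measurement.
        Vector3 handWorld = headPose.position + headPose.rotation * handLocalToHead;
        virtualHand.position = handWorld;   // absolute position, independent of head motion
    }
}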

7. REFERENCES

[1] Dave, D., Chowriappa, A. and Kesavadas, T.: Gesture Interface for 3D CAD Modeling using Kinect. Computer-Aided Design, 2012.
[2] Eagleman, D. M.: Human time perception and its illusions. Current Opinion in Neurobiology, Volume 18, Issue 2, pp. 131-136, doi:10.1016/j.conb.2008.06.002, 2008.
[3] Duarte Filho, N., Costa Botelho, S., Tyska Carvalho, J., de Botelho Marcos, P., de Queiroz Maffei, R., Remor Oliveira, R., Ruas Oliveira, R. and Alves Hax, V.: An immersive and collaborative visualization system for digital manufacturing. Int J Adv Manuf Technol, 2010.
[4] Beattie, N., Horan, B. and McKenzie, S.: Taking the LEAP with the Oculus HMD and CAD - plucking at thin air? The International Design Technology Conference, DesTech, 2015.
[5] Bachmann, D., Weichert, F. and Rinkenauer, G.: Evaluation of the Leap Motion Controller as a New Contact-Free Pointing Device. Sensors, 2015.
[6] LaValle, S. M., Yershova, A., Katsev, M. and Antonov, M.: Head tracking for the Oculus Rift. Oculus VR, Inc., 2012.
[7] https://community.leapmotion.com/t/multiple-device-roadmap/1280, LM Community, 2015.