Vocational Training with Combined Real/Virtual Environments


Appeared in: H.-J. Bullinger, J. Ziegler (Eds.), Proceedings of the International Conference on Human-Computer Interaction (HCI International), München. Mahwah: Lawrence Erlbaum.

Eva Hornecker, Bernd Robben
Research Centre Work and Technology (artec), University of Bremen

Is it possible to interconnect physical learning media and computer-based learning? If so, what do we gain by attempting this? We describe the conceptual evolution, as well as various results, of a project aimed at the development of a computer-supported learning environment for pneumatics, targeted at vocational students. This environment uses a new approach to human-computer interaction: coupling the building of models in the real space of physical objects with the virtual space of signs and images. The project EUGABE (sponsored by the German Research Foundation, DFG) interconnects traditional workbenches with computer-based simulators. While the student constructs a physical circuit, the computer tracks the construction and generates a symbolic circuit which can be digitally simulated. As users introduce new elements or relocate existing ones, the corresponding symbols pop up in the simulator, or move.

1 HCI Leaving the Desktop: Real Models as Interface

This concept of a graspable user interface is based on the pairing of real artifacts with their virtual counterparts. Students wear DataGloves and use real objects to construct a system on a workbench, while the computer tracks and interprets their actions and gestures. It simultaneously assembles a corresponding virtual model, responding to the actions of the students' hands. Virtual objects may also contain further information required either for creating computer simulations or for obtaining specific information about elements. The virtual model can be connected to application-specific simulators. Experiments can be performed in either the physical system or the virtual system.
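The paper does not give implementation details for this coupling. As a minimal sketch, under the assumption of an event stream coming from the tracking system (all names here are hypothetical, not the project's code), each tracked action on the workbench can be mirrored by updating a symbolic model that a simulator then reads:

```python
# Hypothetical sketch of the real/virtual coupling described above:
# every tracked action on the workbench is mirrored in a symbolic
# model that a simulator can read. Names and structure are illustrative.

class VirtualModel:
    """Symbolic counterpart of the physical construction."""
    def __init__(self):
        self.elements = {}  # element id -> (symbol, position)

    def place(self, elem_id, symbol, pos):
        self.elements[elem_id] = (symbol, pos)   # symbol pops up

    def move(self, elem_id, pos):
        symbol, _ = self.elements[elem_id]
        self.elements[elem_id] = (symbol, pos)   # symbol moves


class Workbench:
    """Dispatches tracked physical actions to the virtual model."""
    def __init__(self, model):
        self.model = model

    def on_tracked_event(self, event):
        kind, elem_id, symbol, pos = event
        if kind == "place":
            self.model.place(elem_id, symbol, pos)
        elif kind == "move":
            self.model.move(elem_id, pos)


model = VirtualModel()
bench = Workbench(model)
# the student places a valve, then relocates it
bench.on_tracked_event(("place", "v1", "3/2-way valve", (10, 20)))
bench.on_tracked_event(("move", "v1", None, (40, 20)))
```

The essential design choice is that the virtual model is a pure consequence of physical actions: the workbench remains the authoritative interface, and the digital side only follows.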
Users are able to switch freely between operations on the real and the virtual objects. The material workspace functions as an interface for the virtual model and controls it; the computer acts as a supporting tool. The hands-on manipulation of concrete physical models allows for an intuitive and nonpropositional way of modelling; it provides unmediated communication. Our coupling of real and virtual models combines two approaches: an abstract, dispassionate approach (usually supported with technical modelling tools), and an intuitive, playful, even meditative one (Bruns 1996). We call this concept Real Reality, since it concentrates on real models.

2 System Development and Design Space Constraints

The concept of Real Reality has been a guiding principle in several projects at artec (cf. Schäfer et al. 1997, Bruns 1998). In this discussion we concentrate on its application to pneumatics. Our vision of an integrated learning environment required more than the simultaneous production of the virtual circuit. It also demanded simultaneous interaction with the computer simulation and its help system, as well as access to several levels of visualisation while dealing with the dynamic nature of the concrete physical model. The system was designed to run on modestly equipped PCs supplemented by a limited range of additional hardware.

Technical development proceeded in several stages. We began by coupling our system with an existing digital simulator, which we extended to include pneumatics. The correlation of the movement of real elements to the movement of virtual elements is achieved by tracking the movements of the students' hands. Since the electromagnetic tracking system currently in use functions poorly with metal objects, we use wooden bricks whose size corresponds to the original pneumatic elements. Future plans include a different tracking system which would allow the use of metal objects. The use of wires or tubes created special problems: these items are flexible and cannot be tracked well. We decided to use two bricks for the endings of tubes. The same idea, detecting tube ends instead of the whole tube, has been implemented in a European Community project developed by artec (BREVIE). Differing from EUGABE, this system uses image recognition via video cameras to recognise physical objects and to correlate physical constructions with virtual models.

In this first stage, we proved that it is possible to build a digital or pneumatic circuit with tangible symbols and then transfer its model to the simulator. We also developed a file-transfer interface for the commercial simulator FluidSim, which allows the student to import an existing model into the simulator. Since our goal was to offer simultaneous, dynamic modelling, we specified that pneumatic symbols should appear at the moment the user picks up an element. To this end, we developed an online interface. This instantaneous interaction enables the student to use gestures to send commands (Run, Stop, Help) to the simulator. Pointing at an element calls up help on it; during a simulation, the same action is interpreted as activating a toggle switch.

We encountered several constraints in dealing with the design space. The available hardware led to restrictions, some of which could be mastered by using higher-quality (more expensive) devices. This holds for the tracking system as well as for the data gloves, which are either too clumsy or too expensive; the commercial development of data gloves is motivated primarily by the demands of Virtual Reality applications. Optimally, the data gloves should also be wireless.
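The online interface described above (gesture commands Run, Stop and Help, plus context-sensitive pointing) is not specified in detail in the paper. The following is a hypothetical sketch, with invented names, of how such gesture dispatch could behave, including the context switch where pointing means "help" outside a simulation but "toggle switch" during one:

```python
# Illustrative sketch (not the project's actual code) of the online
# interface: recognized gestures are mapped to simulator commands,
# and pointing is interpreted depending on the simulator state.

def dispatch_gesture(gesture, simulator, target=None):
    """Map a recognized gesture to a simulator action."""
    if gesture in ("run", "stop"):
        simulator["state"] = "running" if gesture == "run" else "stopped"
    elif gesture == "point" and target is not None:
        if simulator["state"] == "running":
            # during a simulation, pointing acts as a toggle switch
            simulator["switches"][target] = not simulator["switches"].get(target, False)
        else:
            # otherwise, pointing calls up help on the element
            simulator["help"] = target

sim = {"state": "stopped", "switches": {}, "help": None}
dispatch_gesture("point", sim, target="valve1")   # calls help on valve1
dispatch_gesture("run", sim)                      # starts the simulation
dispatch_gesture("point", sim, target="valve1")   # now toggles the switch
```

Overloading one gesture by context keeps the vocabulary small, which matches the ergonomic constraint discussed below: the fewer gestures the system distinguishes, the more robustly each can be recognised.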
The brick solution for tubes imposed a restriction on student behaviour: each tube must be completed before a new one is begun. The development of a vocabulary of gestures demanded solutions appropriate to both technical and ergonomic needs. Which gestures can be used? How many suffice for interacting with the system? Human gesture language as a whole is far too vague and complex for our needs. The gestures which are intuitive and easy to discern are those of gripping, moving, releasing and pointing. We therefore restrict ourselves, exclusively, to a few easily learnable gestures, harmonised with their context.

We have not yet arrived at a complete system for recognising actions in the physical model. For instance, when the student tunes a valve, that gesture is currently too subtle to be detected. Activating a switch by pointing at it works well for button switches, but not for switches with handles that must be turned. With better gesture recognition and more sophisticated modelling, these problems could be solved in the future. The movements of certain parts (the pulling of a cylinder; the tuning of a valve) require the use of hierarchical models in order to be represented accurately. The use of image recognition in the manner of the BREVIE project elegantly solves some of these problems while simultaneously creating new ones. Since the detection of gestures is extremely difficult, gestures cannot be used to interact with the BREVIE system. If the use of gestures were replaced by a pointing device of some sort, the inherent value of unmediated gesturing would have to be scrutinised.

3 Didactic Gains

artec's learning environment is primarily designed for introductory courses in pneumatics. The primary educational goals are: to learn to build simple pneumatic systems; to develop an understanding of where and how such systems are implemented in industrial environments; to comprehend the physical laws pertaining to the behaviour of gases; to achieve a basic understanding of pneumatic systems design; and to learn the details of specific pneumatic elements: their designation, their notation in schematic diagrams, and how they function.

The learning environment is meant to aid students in making the many mental jumps to higher levels of abstraction demanded by the realities of pneumatic systems. Such jumps require constant adjustment to the changing requirements of the system and the many-faceted requirements of the field of pneumatics. The basic point of view offered to the student for comprehending a pneumatic system is the physical model on the workbench. This opportunity to perceive concrete objects makes a conceptual understanding of the system possible. Dealing with pneumatic elements in a hands-on situation allows students to confront the day-to-day problems of the professional world in a way which would never be possible in a purely digital environment. The knowledge gained here is tacit knowledge.
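The restricted gesture vocabulary proposed in the previous section (gripping, moving, releasing, pointing) could, under simple assumptions about data-glove readings, be told apart by thresholding finger flexion. The sensor layout and threshold values below are illustrative assumptions, not measurements from the project:

```python
# Hypothetical classifier for the small gesture vocabulary discussed
# above. Assumes a data glove reporting per-finger flexion in [0, 1]
# as [thumb, index, middle, ring, little]; thresholds are invented.

def classify(flexion):
    """Return 'grip', 'point', 'release', or 'unknown' for one reading."""
    bent = [f > 0.6 for f in flexion]
    if all(bent):
        return "grip"        # fist closed around an element
    if bent[2] and bent[3] and bent[4] and not bent[1]:
        return "point"       # index extended, remaining fingers bent
    if not any(bent):
        return "release"     # open hand
    return "unknown"         # ambiguous posture, ignored by the system
```

A classifier this coarse illustrates why subtle actions such as tuning a valve fall below the detection threshold: they change finger posture far less than the grip/point/release distinctions do.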
We value this hands-on premise so highly that our design concepts for computer-based learning differ from the usual approaches. In our environment, computer-aided learning offers a seemingly direct correspondence between the physical environment and the representations displayed by the computer. The most obvious of these is the realistic-looking 3D representation, which rebuilds itself automatically. If this were the only computer representation, there would be few didactic advantages; insight is first gained when the student recognises the differences between two representations. A 3D representation becomes interesting when it offers integration into the larger context of real-world applications. Even more important are the generated representations of a more abstract nature. The student is offered schematic representations in the Editor and in the Simulator. The student can use a walk-through diagram to help visualise the behaviour of a system over the course of time. In many cases, graphic representations show animated cut-away views of pneumatic components, allowing the student to quickly grasp how they function and, in some cases, to perform experiments on them.

In order for the system to translate changes into the various other representations, it is not only necessary to react to the physical pneumatic system itself; it is also crucial that the various representations react, in synch, to changes in all the other representations. An interactive interface between the walk-through diagram and the 3D representation provides the student with a high level of clarity. Differing from interaction with electromagnetic circuits, it is not currently possible to cause changes in the physical environment in response to changes in a digital representation. We have, however, devised a method to double-check a newly built physical construction: the system verifies whether there is a direct correspondence with a digital representation within the Editor.

In addition to striving to reach our didactic goals, our learning environment is designed to nurture an open educational setting. As opposed to tutorial systems, it encourages copious experimentation and offers teachers abundant alternatives for organising the learning process. Wishing to prevent media breaks or seams, we experimented with projecting the display of the simulation onto the wall. This enables students to remain close to the workbench, staying focussed on the physical construction, while upholding the concept of group participation: the workbench is treated as a shared workspace. Since the system can utilise several data gloves, working in parallel is possible.

4 References

Bruns, F. W. (1998).
Integrated Real and Virtual Prototyping. In: Proc. 24th Conference of the IEEE Industrial Electronics Society (IECON 98, Aachen). IEEE, Vol. 4, 2137-2142.

Bruns, F. W. (1996). Grasping, Communicating, Understanding: Connecting Reality and Virtuality. AI & Society 10/1. London: Springer, 6-14.

Robben, B.; Hornecker, E.; Bruns, F. W. (1998). Lernen und BeGreifen: Pneumatik unterrichten in einer Real Reality Umgebung. artec paper 62. Bremen.

Schäfer, K.; Brauer, V.; Bruns, F. W. (1997). A New Approach to Human-Computer Interaction: Synchronous Modelling in Real and Virtual Spaces. In: Designing Interactive Systems (DIS 97, Amsterdam). ACM, 335-344.