Interactive Virtual Environments


Interactive Virtual Environments Introduction Emil M. Petriu, Dr. Eng., FIEEE Professor, School of Information Technology and Engineering University of Ottawa, Ottawa, ON, Canada http://www.site.uottawa.ca/~petriu May 2008

(Abstract) Modern virtual environments allow us to capture and manipulate multimodal (visual, auditory, haptic, or other) sensor-based and/or synthetic models of 3D objects and phenomena. The behaviour and evolution of these object and phenomenon models can be driven by physical sensor data or by animation scripts. Human users can interact with and directly manipulate objects within the virtual environment. This technology has already found promising applications in industry, communications, telerobotics, medicine and healthcare, security, and entertainment.

The first part of the course will discuss video and haptic sensor systems for multimodal acquisition and human-computer interfaces, as well as real-time NN modelling of 3D object shape and elastic behaviour. 3D tracking, modelling and animation techniques for the human face, body and gestures will also be presented. The second part of the course will address basic principles and discuss scenarios of symbiotic human-computer interaction for the next evolutionary stage of computing technology. It will be a symbiotic partnership in which humans contribute human-specific capabilities complementing those of the computers. The leader/assistant roles of the human and the computer partner, respectively, will be assigned so as to maximize the overall efficiency of the symbiotic team. Reference [1] G. Burdea and Ph. Coiffet, Virtual Reality Technology, 2nd edition (with CD-ROM), Wiley, New Jersey, 2003 (ISBN 0471360899).

Natural and Virtual Reality Virtual Reality Interactive Virtual Reality Virtualized Reality Augmented Reality

HUMAN PERCEPTION OF REALITY HUMAN (sentient living being capable of sensible reasoning) REAL WORLD / ENVIRONMENT

Human Sensing Real/Material World

Model of the real world perceived by the human Neural Network / brain through the sensory organs. Reality Perception by Humans as a Neural Network Process. Real/Material World

1) Relax and concentrate on the 4 dots in the middle of the picture for approx. 30-40 seconds. 2) Then take a look at a wall near you (or any smooth, single-coloured surface). 3) You will see a circle of light developing. 4) Start blinking your eyes a couple of times and you will see a figure emerging.

What Is Virtual Reality? A Web-Based Introduction Version 4 Draft 1, September, 1998 Jerry Isdale http://www.isdale.com/jerry/vr/whatisvr.html

>>> What Is Virtual Reality? - A Web-Based Introduction, by Jerry Isdale
1. Overview
2. A Taxonomy of Virtual Reality
3. Types of VR Systems
4. VR Hardware
5. Levels of VR Hardware Systems
6. VR System Technical Approaches
7. Available VR Software Systems
8. Aspects of A VR Program
9. Other Senses
10. World Space
11. World Database
12. World Authoring versus Playback
13. World Design
14. Fiction Books Related to VR

Virtual Environments allow humans to visualize, manipulate and interact with computer models and extremely complex data. Computer-generated visual, auditory, force or other sensory outputs to the human user can be mixed with sensor-based models of the real world to generate a virtual world within the computer. This virtual environment (VE) may be a CAD-like model, a scientific simulation, or a view into a database. Users can interact with and directly manipulate objects within the VE. Some virtual environments are animated by other processes, simulations, or simple animation scripts. VE technology has already found applications in industrial design, communications, telerobotics, scientific research, medicine, training & education, and entertainment.

Types of VR Systems according to [Isdale]: Window on World Systems (WoW), or Desktop VR. Video Mapping, a variation of the WoW approach in which the user watches a monitor that shows his body's silhouette interacting with the world. Immersive Systems completely immerse the user's personal viewpoint inside the virtual world. These "immersive" VR systems are equipped with Head Mounted Displays (HMD), or a 'Cave' or room in which the viewer stands. The Holodeck in "Star Trek: The Next Generation" is an extrapolation of this technology.

Types of VR Systems according to [Isdale]: Telepresence links remote sensors in the real world with the senses of a human operator. Applications => remote sensors might be located on teleoperated robots for fire fighting, space or undersea operations; surgeons use instruments on cables with a video camera at the point of operation. Mixed Reality, or Seamless Simulation, mixes the Telepresence and Virtual Reality systems: the computer-generated inputs are merged with telepresence inputs and/or the user's view of the real world. Applications => a surgeon's view during brain surgery is overlaid with images from earlier CAT scans and real-time ultrasound; a fighter pilot sees computer-generated maps and data displays inside his HMD.

Virtualized Reality Prof. Takeo Kanade, Robotics Institute, Carnegie Mellon University, Pittsburgh, PA, USA http://www.cs.cmu.edu/~virtualized-reality/ In contrast to virtual reality, in which synthetic environments are created, Virtualized Reality is based on events taking place in the real world, which are captured and processed by computer manipulation.

>>> Virtualized Reality Prof. Pierre Boulanger, University of Alberta, Edmonton, AB, Canada http://www.cs.ualberta.ca/~pierreb/ Virtualized reality is a generalization of the standard visual simulation paradigm where the model and the actions used in the simulated world are extracted from various sensors and information retrieval systems. The resulting visual simulation aims at an exact representation of the real world allowing for photo realistic rendering, telepresence, remote control, and intuitive information queries.

Augmented Reality & Computer Augmented Environments Prof. Jim Vallino, Departments of Computer Science and Software Engineering, Rochester Institute of Technology, Rochester, USA, http://www.se.rit.edu/~jrv/research/ar/ The basic difference between Augmented Reality and Virtual Reality is the immersiveness of the system. Virtual reality strives for a totally immersive environment: the visual, and in some systems aural and proprioceptive, senses are under control of the system. In contrast, an augmented reality system augments the real world scene, necessitating that the user maintain a sense of presence in that world. The virtual images are merged with the real view to create the augmented display. There must be a mechanism to combine the real and virtual that is not present in other virtual reality work.

>>> Augmented Reality & Computer Augmented Environments SONY Links on Augmented Reality Projects http://www.csl.sony.co.jp/project/ar/ref.html US Department of the Navy, Office of Naval Research, Battlefield Augmented Reality System (BARS) US National Tele-Immersion Initiative http://www.advanced.org/teleimmersion.html Tele-Immersion (National Tele-Immersion Initiative - NTII) will enable users at geographically distributed sites to collaborate in real time in a shared, simulated environment as if they were in the same physical room. This new paradigm for human-computer interaction is the ultimate synthesis of networking and media technologies and, as such, it is the greatest technical challenge for the Internet.

E.M. Petriu and T.E. Whalen, "Computer-Controlled Human Operators," IEEE Instrum. Meas. Mag., Vol. 5, No. 1, pp. 35-38, 2002. HUMAN COMPUTER INTERACTION SENSORS SENSORS ACTUATORS Interfacing virtual environments with the real world and human operators

HUMAN COMPUTER INTERACTION SENSORS SENSORS ACTUATORS Human interaction with the real world and in augmented virtual reality

Commercial Virtual Hand Toolkit for CyberGlove/Grasp, Head Mounted Display, and see-through visual display

Optical see-through display Video Camera Human operator using augmented virtual reality in a structured real world.

Computer vision recognition of the pseudo-random binary code, which is then used as augmented reality information.

Neural Network Classification of Brain-Computer Interface Data for the Telecontrol of Symbiotic Sensor Agents The Brain-Computer Interface (BCI) system is based on the well-known oddball paradigm, in which a positive deflection of the EEG signal at about 300 ms (P300) is evoked after a rare expected stimulus. The advantage is that subjects do not have to be trained to generate the P300 effect, as it occurs naturally in human subjects. We use auditory stimuli to generate the P300 responses and a less computationally intensive feed-forward MLP NN for the classification of the EEG responses. In our experimental setup, a human teleoperator equipped with visual and audio HCI and a BCI controls at the strategic level the movements of an intelligent semi-autonomous RSA equipped with an on-board camera and three IR sensors, which semi-autonomously navigates through a maze using a tactical-level obstacle-avoidance algorithm.
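As a concrete illustration of the classification stage, here is a minimal feed-forward MLP (one hidden layer) trained on synthetic epochs in which "target" trials carry a P300-like deflection near 300 ms. The network size, sampling rate, and data are all hypothetical stand-ins for the actual EEG pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    """Minimal feed-forward MLP for binary target vs. non-target
    classification of EEG epochs (illustrative sketch)."""
    def __init__(self, n_in, n_hidden, lr=0.5):
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        return sigmoid(self.h @ self.W2 + self.b2)[:, 0]

    def train_step(self, X, y):
        p = self.forward(X)
        d2 = ((p - y) / len(X))[:, None]      # cross-entropy + sigmoid delta
        d1 = (d2 @ self.W2.T) * self.h * (1.0 - self.h)
        self.W2 -= self.lr * (self.h.T @ d2)
        self.b2 -= self.lr * d2.sum(0)
        self.W1 -= self.lr * (X.T @ d1)
        self.b1 -= self.lr * d1.sum(0)

# Synthetic epochs: 60 samples each (e.g. 600 ms at 100 Hz); the first
# half are "target" trials carrying a P300-like bump near sample 30.
n, T = 200, 60
X = rng.normal(0.0, 1.0, (n, T))
y = (np.arange(n) < n // 2).astype(float)
X[: n // 2] += 3.0 * np.exp(-0.5 * ((np.arange(T) - 30) / 4.0) ** 2)

net = MLP(T, 8)
for _ in range(300):
    net.train_step(X, y)
accuracy = ((net.forward(X) > 0.5) == (y > 0.5)).mean()
```

An MLP of this size is cheap enough to evaluate per stimulus in real time, which is the point of preferring it over heavier classifiers in a teleoperation loop.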

Virtual Operation Theater OBJ(1) ... OBJ(j) ... OBJ(N). 3D Geometric & Elastic Composite Model of an Object: { (x_p, y_p, z_p, e_p), p = 1,..,P }. AVATAR HAND (k). Application-Specific Interactive Action Scenario: { [3D(j) & F(k,j)], t } is the composite haptic-interaction vector between User (k) and Object (j). Haptic Human Interface USER (k), Haptic Robot Interface ROBOT (k), NETWORK, OBJ(i), CyberGrasp, CyberTouch, Robot Arm Controller, Tactile Sensor Interface. Interactive Model-Based Hapto-Visual Teleoperation - a human operator equipped with a haptic HCI can telemanipulate physical objects with the help of a robot equipped with haptic sensors.
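The composite model pairs each surface point with a local elasticity value, so a contact force can be estimated per point and logged together with a timestamp as one entry of the haptic-interaction vector. A minimal data-structure sketch; the class names and the simple spring-law force are illustrative assumptions, not the course's actual model:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ElasticPoint:
    """One sample (x_p, y_p, z_p, e_p) of the composite model."""
    x: float
    y: float
    z: float
    e: float  # local elasticity coefficient

@dataclass
class ObjectModel:
    points: List[ElasticPoint]

    def contact_force(self, p: int, depth: float) -> float:
        """Penalty-style normal force at point p for a given penetration
        depth (illustrative spring law, F = e_p * depth)."""
        return self.points[p].e * depth

@dataclass
class HapticSample:
    """One timestamped [3D(j) & F(k,j), t] entry of the composite
    haptic-interaction vector between a user and an object."""
    position: Tuple[float, float, float]
    force: float
    t: float
```

A sequence of HapticSample records is what would travel over the network between the haptic human interface and the haptic robot interface.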

SCRIPT-BASED & GESTURE-BASED INTERACTIVE AVATAR ANIMATION Computer Generated Objects Object Interaction Models ANIMATION SCRIPT Object Shape & Behavior Models Sensor Data Fusion & Interpretation Virtual Object Manipulation Motion Tracking Object Recognition Virtual_Environment / Real_World Interfaces Avatar_S1 Visual Feedback(s) Video Sensor(s) Structured Light Audio Feedback(s) Audio Sensor(s) Avatar_Sn Avatar_Hi Tactile Feedback(s) Tactile Sensor(s) VIRTUAL SCENE Force Feedback(s) HUMAN PUPPETEER Force Sensor(s)

GESTURE-BASED INTERACTIVE AVATAR ANIMATION Object Shape & Behavior Models Sensor Data Fusion & Interpretation Virtual Object Manipulation Motion Tracking Object Recognition Virtual_Environment / Real_World Interfaces Visual Feedback(s) Video Sensor(s) Structured Light Audio Feedback(s) Audio Sensor(s) Avatar_Hi Tactile Feedback(s) Tactile Sensor(s) VIRTUAL SCENE Force Feedback(s) HUMAN PUPPETEER Force Sensor(s)

SCRIPT-BASED INTERACTIVE AVATAR ANIMATION Computer Generated Objects ANIMATION SCRIPT Object Interaction Models Object Shape & Behavior Models Virtual Object Manipulation Avatar_S1 Avatar_Sn VIRTUAL SCENE

Avatar / Machine-level Instructions INTERPRETER/COMPILER INVERSE KINEMATIC CONTROL 3-D ARTICULATED AVATAR Face Model (Facial Action Coding) Body Model (Joint Control) Face muscle-activation instructions Joint-activation instructions Story-level Instructions Voice synthesizer ANIMATION SCRIPT
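The two-stage expansion performed by the interpreter/compiler (story-level instruction, to skill-level macros, to joint-activation commands) can be sketched as a pair of translation functions. The verb set, style table, and joint names here are hypothetical placeholders, not the actual instruction set:

```python
def story_to_macros(instruction):
    """Expand one story-level instruction into skill-level (macro)
    instructions.  Only the 'writes' verb is sketched here: writing a
    word becomes one glyph-tracing macro per character."""
    actor, verb, arg = instruction
    if verb == "writes":
        return [(actor, "trace_glyph", ch) for ch in arg]
    raise NotImplementedError(verb)

def macro_to_joints(macro, style):
    """Expand one skill-level macro into joint-activation commands,
    using the actor's per-glyph style table (a stand-in for the NN
    model that captures the actor's writing personality)."""
    _actor, _op, glyph = macro
    wrist, elbow, shoulder = style[glyph]
    return [("wrist", wrist), ("elbow", elbow), ("shoulder", shoulder)]
```

In a full system the style table would be replaced by the learned NN model, and the joint commands would feed the inverse kinematic control of the articulated avatar.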

STORY-LEVEL INSTRUCTIONS.. DanielA sits on chair #4. DanielA writes Hello on stationery. He sees HappyCat under the white table. DanielA starts smiling. HappyCat grins back. SKILL-LEVEL (MACRO) INSTRUCTIONS.. DanielA's right hand moves the pen to follow the trace representing H. DanielA's right hand moves the pen to follow the trace representing e. DanielA's right hand moves the pen to follow the trace representing l. DanielA's right hand moves the pen to follow the trace representing l. DanielA's right hand moves the pen to follow the trace representing o.

SKILL-LEVEL (MACRO) INSTRUCTIONS DanielA's right hand moves the pen to follow the trace representing H. DanielA's specific style of moving his right arm joints to write H (NN model capturing DanielA's writing personality): Rotate Wrist to α_i, Rotate Elbow to β_j, Rotate Shoulder to γ_k. 3-D Model of DanielA's Right Arm: Shoulder, Elbow, Wrist; x, y, z axes.
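For a given pen position along the glyph trace, the shoulder and elbow rotations can be obtained in closed form for a planar two-link arm, with the wrist orientation handled separately. A minimal sketch using textbook inverse kinematics (not the NN-based style model above); the link lengths and the elbow-up branch are illustrative assumptions:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form planar inverse kinematics: returns the shoulder and
    elbow angles that place the wrist at (x, y) for upper-arm and
    forearm lengths l1, l2 (elbow-up branch).  Raises ValueError if the
    target is outside the reachable annulus."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(c2) > 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(c2)                        # beta_j
    shoulder = math.atan2(y, x) - math.atan2(    # angle to target, minus
        l2 * math.sin(elbow),                    # the elbow's contribution
        l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Feeding successive trace points of a glyph through this function yields the stream of Rotate-Shoulder / Rotate-Elbow commands that the interpreter would send to the articulated avatar.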