Cosc VR Interaction. Interaction in Virtual Environments


Cosc 4471 Interaction in Virtual Environments

VR Interaction
In traditional interfaces we rely on interaction metaphors:
- Windows, Icons, Menus, Pointer (WIMP)
- Limited input degrees of freedom imply modality in the interface
- Abstractions of behaviours (sliders, dialog boxes, ...)
- Requires a cognitive model of the user interface and a mapping of goals to user actions

Norman (1986):
- Gulf of execution: the psychological gap that must be crossed to manipulate a user interface appropriately: goals -> plans -> action specification -> operate interface
- Gulf of evaluation: the psychological gap that must be crossed to interpret a user interface display: interface -> interpretation -> evaluation
(www.usability.com/glossary)

VR Interaction Metaphor
In a VE we:
- present immersive displays that the user perceives
- track and represent the participant's actions in the VE
- model environments with natural behaviour and reaction to participant actions

The VR interaction metaphor:
- Natural and intuitive user interaction is believed to promote presence
- A generalization of the WYSIWYG concept
- More effective? Direct interaction ideally narrows the gulfs of execution and evaluation
- Can take advantage of natural, everyday skills, with immediate and natural feedback

VEs enable direct manipulation of and interaction with the environment, as opposed to abstract metaphors. In an immersive VE we can have a different interaction model:
- Interactions centered on the user's body ("egocentric")
- Natural interactions are possible using the affordances provided by real environments
- The need for modality is reduced by:
  - the constraints of the VE (gravity, collision detection)
  - the flexibility of the interface (i.e. no need to separate selection from manipulation)

Limitations of the VR interaction metaphor
The direct metaphor is limited:
- Limitations in 3D tracking, models and displays
- Little API support (standards, tools)
- Physical effort and fatigue
- Restricted by the same constraints as the real world
- Conversely, poor support for real-world constraints
- Increased effort, difficulty with precision, unnatural
- Poor fit or inefficiency when performing some tasks with direct interaction (control and configuration tasks)
Most fielded VEs have little interactivity or are difficult to use.

We can explicitly model and use real-world constraints in interaction:
- MIVE (Stuerzlinger) models physical constraints and natural grouping behaviour
- Surfaces of support
- "Push together" metaphors

We often need "magic" interaction as well as natural interaction:
- Allows interactions not possible in the real world: longer-range interaction, teleporting
- Compensates for limited tracking ability and for limits in how well displays support real-world constraints
- Requires training on the metaphor, so its use in high-fidelity training situations must be limited
- In a non-realistic VE, consider the effectiveness of the interaction

Representing humans in VE interaction
Representation of the participant's body and actions in the VE:
- Feedback for interaction (egocentric representation of hands and limbs), especially for manipulation and selection
- Improved immersion and presence
- Automatic with see-through HMDs and CAVEs; an opaque HMD needs a virtual (graphical) representation
- Representation of others' avatars in a collaborative VE

Direct interaction relies on correspondence between the body and its VE representation:
- Temporal and spatial mapping between human sensors and system displays, and between human effectors and system input devices, is more straightforward in naturalistic VEs
- Need to track the user's head (hand, body, ...) in real time in order to respond to head (hand, body, ...) motion in real time
- Current tracking does not measure all the degrees of freedom possible in human motion (Burns, IEEE VR 2005)

Visually Coupled Systems
- The most fundamental representation is the participant's view (camera control), a special case of locomotion
- 3D graphics are rendered from the point of view and pose of a virtual camera
- Full 6-DOF motion is possible, but mapping it to 2D devices such as mice is complicated
- Typical viewpoint options:
  - virtual camera at the avatar's eye (egocentric)
  - virtual camera fixed in the world (exocentric)
  - tethered virtual camera (trailing the user with an over-the-shoulder view)
- Natural view control involves an egocentric view and head and gaze movements
- It is natural to look around (rotate and translate the head to change the view)
- Reduced learning curve and cognitive demand; hands stay free
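As a concrete illustration of rendering from the tracked viewpoint, the sketch below builds a view matrix from a 6-DOF head pose. The function name and conventions are illustrative, not taken from any particular engine.

```python
import numpy as np

def view_matrix(head_position, head_rotation):
    """Build a 4x4 view matrix from a tracked 6-DOF head pose.

    head_position: 3-vector in world coordinates.
    head_rotation: 3x3 world-space rotation matrix of the head.
    The view transform is the inverse of the head (camera) pose."""
    view = np.eye(4)
    view[:3, :3] = head_rotation.T                              # inverse rotation
    view[:3, 3] = -head_rotation.T @ np.asarray(head_position, float)  # inverse translation
    return view
```

Each frame, the tracker's latest pose would be fed to a function like this and the result passed to the renderer, so that head motion directly controls the virtual camera.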

Head orientation and position
- Need to track the user's head pose
- Ideally the full 6-DOF pose; often 3-DOF orientation is sufficient for distant objects of interest and restricted head motion (e.g. vehicle simulators)
- Render the view appropriate to the viewing direction and vantage point (McDowall and Bolas, IPT 2002)
- Head tracking is a critical component of immersive displays; it supports:
  - structure-from-motion perception
  - maneuvering and locomotion
  - active perception
  - situational awareness

Field of view and field of regard
- The instantaneous field of view (FOV) of a display is the angle subtended by the image at the eye
- The field of regard is the amount of space (in terms of visual angle) filled by the virtual world
- Tracking head motion provides an effective FOV larger than the instantaneous FOV

Calibration and alignment
- Very important in a VCS in order to generate appropriate images
- AR is much more sensitive to relative misregistration between real and synthetic imagery than VR is to absolute misalignment
- Distortion and aberration are more apparent in see-through displays than in VR displays
- Distortion causes misjudgment of shape and depth from perspective and motion

Lag in a VCS
- Delay in interaction in a CAVE or helmet-mounted display (HMD) can lead to perceptual and performance degradation
- Dynamic error: a temporal mismatch between expected and actual response
- Display lag arises from transduction, filtering, transmission, image generation, rendering and display
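The instantaneous FOV of a flat display follows directly from its size and viewing distance. A small sketch (the function name is illustrative):

```python
import math

def instantaneous_fov_deg(image_width, eye_distance):
    """Angle (in degrees) subtended at the eye by a flat image of the
    given width, viewed head-on from the given distance (same units)."""
    return math.degrees(2.0 * math.atan(image_width / (2.0 * eye_distance)))
```

For example, a 2 m wide screen viewed from 1 m subtends 90 degrees; halving the viewing distance widens the instantaneous FOV without changing the field of regard.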

Detrimental effects of delay on interaction
Manual control:
- Large lags (> 300 ms) force the adoption of non-ideal control strategies (move and wait)
- Degraded tracking or pursuit of a target
- Instability (pilot-induced oscillation)
- Lag restrains head movements
- Slows hand-eye and head-eye coordination
- Fitts' law with lag: MT = C1 + C2 (C3 + lag) ID

Visually guided motor behaviour:
- Reduced ability to recalibrate motor behaviour (prism adaptation)
- Difficulties with grasp, placement and precision manoeuvring (e.g. a ring-tracing task degrades with a 48 ms delay)
- Visual search time increases with latency
- Acceptable latency depends on the task; for tightly coupled visual simulation, maximum latencies of 40 or 80 ms are used as rules of thumb (DIS: 100 ms)
- Simulator or cyber sickness and workload
- Perceptual stability/oscillopsia

VE Interactions
- Selection
- Manipulation
- Navigation: wayfinding, travel and locomotion
- System or application control (executive functions)
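The lag-extended Fitts' law above can be evaluated numerically. The constants below are illustrative placeholders, not fitted values from any study:

```python
import math

def movement_time(distance, width, lag, c1=0.1, c2=0.2, c3=0.05):
    """MT = C1 + C2*(C3 + lag)*ID, with ID = log2(distance/width + 1).
    Times in seconds; distance and width in the same length units.
    c1..c3 are placeholder constants for illustration only."""
    index_of_difficulty = math.log2(distance / width + 1.0)
    return c1 + c2 * (c3 + lag) * index_of_difficulty
```

Because lag multiplies the index of difficulty, it scales the slope of the law: even modest latency noticeably slows pointing at small or distant targets.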

Selection
Specifying object(s) in a set or scene:
- Indicate the target of an action
- Make an object active for manipulation and other operations
- Choose a destination for travel
Selection at a distance necessarily involves tool metaphors or "magic". The degree of difficulty is affected by distance, size and density.

Object selection methods:
- Contact/virtual hand
- Pointing (Go-Go, ray casting)
- 3D cursor
- Aperture (occlusion, framing)
- Indirect (menu, context, spoken naming)
- World in miniature

Need to provide feedback:
- Touch, graphical, auditory
- Abstract (e.g. an auditory icon) or realistic (forces, motion)
Bowman and Hodges' taxonomy

Direct Selection
- The virtual hand directly grabs or touches objects to select them
- Direct, no magic, but only limited range
- Avatar feedback is essential in HMD systems

Go-Go and Reeling
- Magic extensions to the virtual hand
- Go-Go: the arm stretches to grab distant objects
- Reeling: fishing-rod metaphor
- Poupyrev et al., Eurographics 98

Ray Casting and Laser Metaphors
- Using the hand or another pointer, cast a ray into the scene
- A natural extension of pointing: action at a distance
- Feedback through an extended pointer or a laser-beam metaphor
- The object touched or lit up is selected
- Intersection is computed in 3D space
- Sensitive to orientation tracking noise and error because of the large moment arm
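Go-Go's arm stretching is usually described as a nonlinear mapping from real to virtual reach: one-to-one near the body, growing quadratically beyond a threshold. A sketch, with threshold and gain chosen for illustration rather than taken from Poupyrev et al.:

```python
def gogo_virtual_reach(real_reach, threshold=0.4, gain=6.0):
    """Map the real hand's distance from the body (metres) to the
    virtual hand's distance. Within `threshold` the mapping is 1:1;
    beyond it the virtual reach grows quadratically, so distant
    objects become reachable. threshold/gain are illustrative values."""
    if real_reach < threshold:
        return real_reach
    return real_reach + gain * (real_reach - threshold) ** 2
```

The smooth transition matters: because the quadratic term is zero at the threshold, near-body manipulation keeps the precision of the plain virtual hand.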

University of Delft, Moldrive; Poupyrev et al., Eurographics 98; Steinicke, WSCG 2005

Direction selection methods (for selection, direction of travel, etc.):
- pointing
- gaze direction (or head or torso orientation)
- crosshair (selection beam)
- valuator
- coordinate specification
- landmarks
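Ray-casting selection, described above, reduces to intersecting the pointing ray with the scene and keeping the nearest hit. A minimal sketch using bounding spheres (the object-tuple format is invented for the example):

```python
import numpy as np

def ray_cast_select(origin, direction, objects):
    """Pick the nearest object whose bounding sphere the pointing ray
    hits. `objects` is a list of (name, centre, radius) tuples; this is
    a sketch of the metaphor, not any specific toolkit's API."""
    direction = np.asarray(direction, float)
    direction = direction / np.linalg.norm(direction)
    best, best_t = None, float("inf")
    for name, centre, radius in objects:
        oc = np.asarray(centre, float) - np.asarray(origin, float)
        t = oc @ direction                 # distance to closest approach
        if t < 0:
            continue                       # object is behind the pointer
        miss_sq = oc @ oc - t * t          # squared ray-to-centre distance
        if miss_sq <= radius ** 2 and t < best_t:
            best, best_t = name, t
    return best
```

The moment-arm sensitivity noted above is visible here: a small angular error in `direction` displaces the ray by an amount proportional to `t`, so distant targets need larger bounding radii or snapping to stay selectable.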

Image-plane selection (Pierce 1997)
- Head crusher
- Sticky finger
- Palm lifting
- Framing
Image-plane techniques require the user to view the VE both as a 3D environment and as a flat "magic picture".

Manipulation
Manipulation operations:
- positioning and sizing objects
- pushing/pulling on objects
- attribute modification (of an object or the simulation)
- controlling travel
Manipulation is tightly coupled to selection in a virtual environment. A related and important issue is collision detection and management.

Gruchalla, IEEE VR 2004

Direct Metaphors
- The hand is modeled with limited degrees of freedom, often as a point: no posture, no grasp
- Adequate for selection, but how do we support manipulation?
- Grasp with a button or a pinch glove
- After the grasp, the object moves with the hand

University of Delft, Project Moldrive
From Slater et al. (2002), Computer Graphics and Virtual Environments

Kim & Fellner (2005)

More complex direct interactions?
- Track the orientation of the hand for pose changes
- Modal manipulation through a secondary input device
- Track finger flexion for grasp
- Orientation changes at a distance are difficult: bring the object to the user and use hand-centred manipulation

Manipulation methods (Mine 1995)
Photos from Sherman & Craig, Understanding Virtual Reality, 2003

Exocentric Manipulation
Manipulation from an external view (exocentric) rather than from the first person (egocentric):
- World in miniature (WIM)
- Scaled-world grab (Mine 1995)
- May break presence and immersion; indirect
- A WIM does not scale easily and is difficult to use for precise positioning

Two-handed interaction
Guiard (1987, The Journal of Motor Behaviour) stresses the asymmetry between the dominant and non-dominant hands:
- Motion of the dominant hand uses the non-dominant hand as a spatial reference
- The dominant hand performs finer and smaller-scale motions
- Non-dominant hand motion typically precedes dominant hand motion
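Guiard's asymmetry suggests implementing two-handed techniques by expressing the dominant hand's pose in the non-dominant hand's reference frame. A minimal sketch (names and conventions are mine):

```python
import numpy as np

def hand_relative_position(dominant_pos, nondominant_pos, nondominant_rot):
    """Express the dominant hand's position in the non-dominant hand's
    coordinate frame, reflecting Guiard's observation that the dominant
    hand works relative to the non-dominant one. nondominant_rot is the
    non-dominant hand's 3x3 world-space rotation matrix."""
    offset = np.asarray(dominant_pos, float) - np.asarray(nondominant_pos, float)
    return nondominant_rot.T @ offset      # rotate world offset into local frame
```

Applying dominant-hand motion in this local frame means the whole gesture moves with the non-dominant hand, which is what makes props and WIM-style techniques feel stable.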

Navigation
Two components to navigation:
- wayfinding
- travel

Wayfinding
- Refers to determining the current location and a path to the desired location
- Need to maneuver through the environment and avoid obstacles when wayfinding (or wandering)
- Need to provide cognitive information to support location determination and wayfinding behaviour (navigational awareness)

People are believed to form cognitive maps or models of the environment: landmark, procedural and survey knowledge. In VR we need to be able to navigate about the VE; a typical goal of wearable computers is to provide navigational awareness. (Pierce & Pausch, "Navigation with place representations and visible landmarks", IEEE VR 2005)

Aids to wayfinding:
- landmarks (natural or artificial)
- path following/leaving trails
- maps
- memorable place names
- compass/instruments
- exocentric views
- coordinate display, grid structure
- constrained travel

With non-immersive displays and small fields of view:
- an exocentric view is preferred for global spatial awareness
- an egocentric view is preferred for local navigation
- a tethered view is a compromise
In immersive applications, egocentric views are the normal mode; exocentric views are used for reorientation and large-scale travel.

Travel
- Current VR systems allow only small tracked work spaces
- To move to other places in the virtual environment (or in the real world) we need to support travel between two places
- Physical travel fits the real world:
  - locomotion on foot is the natural form of travel
  - vehicular travel is also familiar: direct or indirect control and locomotion

- Travel through computer-based worlds using mice and joysticks is familiar to many (computer games etc.)
- Travel through time is also useful
- Travel supports exploration, search and maneuvering
- Travel requires control of the direction and speed of motion: steering
- Locomotion: real-time user control of the virtual camera through the VE
- Real travel is limited in a VE; we need a travel metaphor

Travel Methods and Metaphors: real locomotion
- Modest distances, limited by tracking capabilities
- Some tricks are possible (treadmills, walking in place)
- Typically some real locomotion is supported in addition to other techniques
- Difficulties with terrain following, collision detection and response

Route-planning metaphor
- Ride along a preplanned route: tow-rope or river metaphor (image from Sherman & Craig, Understanding Virtual Reality, 2003)
- The path can be specified in a path-planning navigation task: waypoints, icons, path drawing (image from Doug Bowman's dissertation)

Fly mode
- The most common metaphor; lots of freedom
- Direction and speed control
- A walkthrough is fly mode constrained to follow the terrain
- If no other interaction is possible we have a walk-through VE
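A walkthrough camera can be sketched as fly-mode motion with the view height clamped to the ground, which is the terrain-following constraint described above. `terrain_height` and `eye_height` are assumptions for the example:

```python
def walkthrough_step(position, forward, speed, dt, terrain_height, eye_height=1.7):
    """Advance a walkthrough camera: horizontal fly-mode motion with the
    view height constrained to the terrain. `terrain_height(x, z)` returns
    the ground height at a point; eye_height is an illustrative viewer
    height in metres. Returns the new (x, y, z) position."""
    x = position[0] + forward[0] * speed * dt
    z = position[2] + forward[2] * speed * dt
    y = terrain_height(x, z) + eye_height   # clamp the eye to the ground
    return (x, y, z)
```

Dropping the clamp (keeping `y` free) turns the same update back into unconstrained fly mode.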

Travel Methods and Metaphors: direction control (steering)
- Gaze, head, or torso orientation
- Hand pointing or other gestures
- Tracked devices (wands)
- Two- or three-dimensional input devices: gamepads, joysticks, 3D input devices, wands
- Speed control: constant velocity, gesture, input device, selection from a list
- Can drive vehicles using virtual or physical controls (steering wheels, flight yokes), often with increased control over speed and acceleration
http://www.cs.utah.edu/research/areas/ve
Saito, VR 2001
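Gaze-directed steering, the first option above, simply integrates the viewpoint along the current gaze direction at the chosen speed. A minimal sketch:

```python
import numpy as np

def gaze_steer(position, gaze_direction, speed, dt):
    """Translate the viewpoint along the (normalized) gaze direction.
    position and gaze_direction are 3-vectors; speed in units/second,
    dt in seconds. Returns the new position."""
    gaze = np.asarray(gaze_direction, float)
    gaze = gaze / np.linalg.norm(gaze)
    return np.asarray(position, float) + gaze * speed * dt
```

The known drawback of this metaphor follows directly from the code: because travel direction is tied to the gaze vector, the user cannot look around while moving; substituting a hand-pointing or torso vector decouples the two.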

Komerska and Ware, 2003; Beckhaus 2005

Manipulation methods of viewpoint control
- Eye-in-hand and scene-in-hand viewpoint-control metaphors
- Rotate around a target by hand
- Move or scale the world
- Often used in visualization
Image from Doug Bowman's dissertation

"Put me there" methods
Target-based metaphor: jump to a defined location
- multimodal displays (speech, gesture): ask to go to a named location
- point to a target
- define a location in an exocentric, map, or world-in-miniature display
- select from an application menu list, or enter coordinates
- step into a picture or through an aperture into a different world
Teleporting can be disorienting.

World in miniature (Stoakley, 1995)
- Hand-centered model of the VE for navigation
- Coupled with a handheld scale model (prop)

All methods except real locomotion dissociate real from virtual travel. Problems with this dissociation:
- conflict between virtual and physical motion
- unnatural, and may interfere with wayfinding
- not suitable for AR or wearable applications

Which travel mode is best?
- Real motion is the most natural, but not always desirable in virtual environments (travelling large distances or through outer space, vehicular travel, fatigue)
- Steering, fly mode and natural locomotion are best for search and exploration of a new environment
- Map and manipulation methods are efficient for relative motion

Natural Motion
Interest in extending the range of real physical motion:
- HiBall and other long-range trackers
- vision-based AR trackers and wearable systems
- GPS and landmark-based tracking
- York Trike project

Application or System Control
- Changing the state of the simulation:
  - timeline, replay events
  - modify the environment
  - change behaviours
  - annotate
- Interact with the underlying system: develop and debug
- Communicating with other users or agents
- Controlling the behaviour of the simulation or of agents in the world

System control techniques:
- Traditional 2D menus: floating or embedded (better presence?)
- 3D windows
- 1D ring menus
- Other 3D metaphors
- Can interfere with visualizing the VE (www.vrac.iastate.edu)
- Hot keys, specialised interfaces
- 2D interaction on a surface in the VE (e.g. a graphics tablet represented in the VE)
- Speech recognition
- Gestures
- Implicit controls
Generally these are a less natural fit to the VR interaction model.