COSC 4471: Interaction in Virtual Environments

VR Interaction
- In traditional interfaces we use interaction metaphors: Windows, Icons, Menus, Pointer (WIMP)
- Limited input degrees of freedom imply modality in the interface
- Abstractions of behaviours (sliders, dialog boxes, ...)
- Requires a cognitive model of the user interface and a mapping of goals to user actions

Norman (1986)
- Gulf of execution: the psychological gap that must be crossed to manipulate a user interface appropriately: goals -> plans -> action specification -> operate interface
- Gulf of evaluation: the psychological gap that must be crossed to interpret a user interface display: interface -> interpretation -> evaluation
(www.usability.com/glossary)
VR Interaction Metaphor
In a VE we present immersive displays that the user perceives, track and represent the participant's actions in the VE, and model environments with natural behaviour and reaction to participant actions.
- Natural and intuitive user interaction is believed to promote presence
- A generalization of the WYSIWYG concept
- More effective? Direct interaction ideally narrows the gulf of execution and the gulf of evaluation
- Can take advantage of natural, everyday skills and of immediate, natural feedback
VEs enable direct manipulation of and interaction with the environment, as opposed to abstract metaphors. In an immersive VE we can have a different interaction model:
- Interactions centred on the user's body ("egocentric")
- Natural interactions possible using the affordances provided by real environments
- Need for modality reduced by:
  - constraints of the VE (gravity, collision detection)
  - flexibility of the interface (i.e. no need to separate selection from manipulation)
Limitations of the VR interaction metaphor
- The direct metaphor is limited
  - Limitations on 3D tracking, models, display
  - Little API support (standards, tools)
  - Physical effort and fatigue
- Restricted by the same constraints as the real world
  - Conversely, poor support for real-world constraints
  - Increased effort, difficulty with precision, unnatural
- Poor fit or inefficiency performing some tasks with direct interaction (control and configuration tasks)
- Most fielded VEs have little interactivity or are difficult to use
- Can explicitly model and use real-world constraints in interaction
  - MIVE (Stuerzlinger) models physical constraints and natural grouping behaviour
  - Surfaces of support
  - "Push together" metaphors
- Often need "magic" interaction as well as natural interaction
  - Allows interactions not possible in the real world: longer-range interaction, teleporting
  - Allows compensation for limited tracking ability and for limited ability of displays to support real-world constraints
  - Requires training on the metaphor
  - Use in high-fidelity training situations must be limited
  - In a non-realistic VE, consider the effectiveness of the interaction
Representing humans in VE interaction
- Representation of the participant's body and actions in the VE
  - Feedback for interaction (egocentric representation of hands and limbs), especially manipulation and selection
  - Improved immersion and presence
  - Automatic with see-through HMDs and CAVEs; need a virtual (graphical) representation in an opaque HMD
- Representation of others' avatars in a collaborative VE
- Direct interaction relies on correspondence between the body and its VE representation
  - Temporal and spatial mapping between human sensors and displays, and between human effectors and system input devices; more straightforward in naturalistic VEs
  - Need to track the user's head (hand, body, ...) in real time in order to respond to head (hand, body, ...) motion in real time
  - Current tracking does not measure all the degrees of freedom possible in human motion (Burns 2005, IEEE VR 2005)
Visually Coupled Systems
- The most fundamental representation is the participant's view (camera control), a special case of locomotion
- 3D graphics are rendered from the point of view and pose of a virtual camera
  - Full 6-DOF motion possible
  - Complicated mapping to 2D devices such as mice
- Typical viewpoint options:
  - Virtual camera at the avatar's eye (egocentric)
  - Virtual camera fixed in the world (exocentric)
  - Tethered virtual camera (trailing the user with an over-the-shoulder view)
- Natural view control involves an egocentric view driven by head and gaze movements
  - Natural to look around (rotate and translate the head to change the view)
  - Reduced learning curve and cognitive demand; hands free
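The egocentric camera coupling above can be sketched as a function from tracked head pose to a look-at camera. The axis convention (yaw and pitch of zero facing down -Z) is an assumption for this sketch, not something the slides specify:

```python
import math

def forward_vector(yaw, pitch):
    """Gaze direction from tracked head yaw/pitch (radians).
    Convention (assumed): yaw 0, pitch 0 looks down -Z; positive pitch looks up."""
    cp = math.cos(pitch)
    return (-math.sin(yaw) * cp, math.sin(pitch), -math.cos(yaw) * cp)

def egocentric_camera(head_pos, yaw, pitch):
    """Return (eye, target) for a look-at camera slaved to the tracked head:
    the eye sits at the head position and looks along the gaze direction."""
    f = forward_vector(yaw, pitch)
    target = tuple(p + d for p, d in zip(head_pos, f))
    return head_pos, target
```

Each rendered frame would rebuild the view transform from the latest tracker sample, which is why tracking latency feeds directly into display lag.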
Head orientation and position
- Need to track the user's head pose
  - Ideally the full 6-DOF pose; often 3D orientation is sufficient for distant objects of interest and restricted head motion (e.g. vehicle simulators)
- Render the view appropriate for the viewing direction and vantage point (McDowall and Bolas, IPT 2002)
- A critical component for immersive displays; supports:
  - structure-from-motion perception
  - maneuvering and locomotion
  - active perception
  - situational awareness
Field of view and field of regard
- The instantaneous field of view (FOV) of a display is the angle subtended by the image at the eye
- Field of regard: the amount of space (in terms of visual angle) filled by the virtual world
  - An effective FOV larger than the instantaneous FOV is provided by tracking head motion
- Calibration and alignment are very important in a visually coupled system (VCS) in order to generate appropriate images
  - Much more sensitive to relative misregistration between real and synthetic imagery in AR than to absolute misalignment in VR
  - Distortion and aberration are more apparent in see-through displays than in VR displays
  - Distortion causes misjudgment of shape and depth from perspective and motion
- Lag ("exponential lag") in VCS
  - Delay in interaction in a CAVE or helmet-mounted display (HMD) can lead to perceptual and performance degradation
  - Dynamic error: temporal mismatch between expected and actual response
  - Display lag arises from transduction, filtering, transmission, image generation, rendering and display
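The instantaneous FOV defined above follows directly from the subtended-angle geometry: for a flat image of width w viewed head-on at distance d, FOV = 2·atan(w / 2d). A minimal sketch:

```python
import math

def fov_degrees(image_width, viewing_distance):
    """Instantaneous FOV: the angle subtended at the eye by a flat image of
    the given width viewed head-on at the given distance (same length units)."""
    return math.degrees(2.0 * math.atan((image_width / 2.0) / viewing_distance))

# e.g. a 0.5 m wide screen at 0.5 m subtends about 53 degrees
```

Head tracking then extends the field of regard beyond this fixed instantaneous angle, since turning the head brings new parts of the virtual world into the same physical display.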
Detrimental effects of delay on interaction
- Manual control
  - Large lags (> 300 ms) force adoption of non-ideal control strategies (move and wait)
  - Degraded tracking or pursuit of a target
  - Instability (pilot-induced oscillation)
- Lag restrains head movements and slows hand-eye or head-eye coordination
  - Fitts' law with lag: MT = C1 + C2 (C3 + lag) ID
- Visually guided motor behaviour
  - Reduced ability to recalibrate motor behaviour (prism adaptation)
  - Difficulties with grasp, placement and precision manoeuvring (e.g. a ring-tracing task is degraded with 48 ms delay)
  - Visual search time increases with latency
- Acceptable latency depends on the task; for tightly coupled visual simulation, maximum latencies of 40 or 80 ms are used as rules of thumb (DIS: 100 ms)
- Simulator or cyber sickness and workload
- Perceptual stability / oscillopsia

VE Interactions
- Selection
- Manipulation
- Navigation
  - Wayfinding
  - Travel and locomotion
- System or application control (executive functions)
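The lag-modified Fitts' law quoted above can be sketched numerically. The index of difficulty is taken as ID = log2(D/W + 1); the constants C1-C3 below are illustrative placeholders, not measured values from any study:

```python
import math

def movement_time(distance, width, lag, c1=0.1, c2=0.1, c3=0.05):
    """Lag-modified Fitts' law: MT = C1 + C2 * (C3 + lag) * ID,
    with index of difficulty ID = log2(distance / width + 1).
    All times in seconds; c1..c3 are illustrative constants."""
    index_of_difficulty = math.log2(distance / width + 1.0)
    return c1 + c2 * (c3 + lag) * index_of_difficulty
```

The key property is that lag multiplies the difficulty term: the harder the target (larger ID), the more each millisecond of latency costs, which matches the observation that lag hurts precision tasks most.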
Selection
- Specifying object(s) in a set or scene
  - Indicate the target of an action
  - Make active for manipulation and other operations
  - Destination for travel
- At a distance, necessarily involves tool metaphors or "magic"
- Degree of difficulty affected by distance, size, density
- Object selection methods:
  - Contact / virtual hand
  - Pointing (Go-Go, ray casting)
  - 3D cursor
  - Aperture (occlusion, framing)
  - Indirect (menu, context, spoken naming)
  - World in miniature
- Need to provide feedback
  - Touch, graphical, auditory
  - Abstract (e.g. an auditory icon) or realistic (forces, motion)
- Bowman and Hodges' taxonomy
Direct Selection
- The virtual hand directly grabs or touches objects to select them
  - Direct, no magic, but only limited range
  - Avatar feedback essential in HMD systems
- Go-Go and reeling: magic extensions of the virtual hand
  - Go-Go: the arm stretches to grab distant objects (Poupyrev et al., Eurographics 98)
  - Reeling: fishing-rod metaphor

Ray Casting and Laser Metaphors
- Using the hand or another pointer, cast a ray into the scene
  - A natural extension of pointing; action at a distance
  - Feedback through an extended-pointer or laser-beam metaphor
  - The object touched or lit up is selected (intersection in 3D space)
  - Sensitive to orientation tracking noise and error due to the large moment arm
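The Go-Go arm extension can be sketched as a nonlinear mapping from the real hand's distance from the body to the virtual hand's distance: one-to-one within a threshold, growing quadratically beyond it. The threshold and gain values below are illustrative tuning constants:

```python
def gogo_virtual_distance(r, threshold=0.4, k=6.0):
    """Go-Go style nonlinear arm extension: within `threshold` metres the
    virtual hand tracks the real hand 1:1; beyond it, reach grows
    quadratically so distant objects become grabbable.
    `threshold` and `k` are illustrative tuning constants."""
    if r <= threshold:
        return r
    return r + k * (r - threshold) ** 2
```

Because the mapping is identity near the body, close-range manipulation stays natural; the "magic" only appears when the user reaches out.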
University of Delft, Moldrive; Poupyrev et al., Eurographics 98; Steinicke, WSCG 2005

Direction selection methods (for selection, direction of travel, etc.)
- by pointing
- by gaze direction (or head or torso orientation)
- by crosshair (selection beam)
- by valuator
- by coordinate specification
- by landmarks
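Ray-casting selection by pointing can be sketched as a nearest-hit test along the pointing ray. Representing selectable objects by bounding spheres is an assumption made for this sketch:

```python
import math

def pick_along_ray(origin, direction, spheres):
    """Ray-casting selection sketch: return the index of the nearest sphere
    (centre, radius) hit by the ray from `origin` along unit `direction`,
    or None if the ray misses everything.
    Objects are approximated by bounding spheres (an assumption)."""
    best, best_t = None, math.inf
    for i, (centre, radius) in enumerate(spheres):
        oc = [o - c for o, c in zip(origin, centre)]
        b = 2.0 * sum(d * v for d, v in zip(direction, oc))
        c = sum(v * v for v in oc) - radius * radius
        disc = b * b - 4.0 * c
        if disc < 0.0:
            continue                    # ray misses this sphere
        t = (-b - math.sqrt(disc)) / 2.0
        if 0.0 <= t < best_t:           # nearest hit in front of the pointer
            best, best_t = i, t
    return best
```

The slides' note about the "large moment arm" shows up here directly: a small angular error in `direction` sweeps the ray across a large distance at long range, so distant small targets are hard to hit.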
Image-Plane Selection (Pierce 1997)
- Head Crusher
- Sticky Finger
- Palm lifting
- Framing
- Image-plane techniques require the user to view the VE as both a 3D environment and a 2D image ("magic picture")

Manipulation
- Manipulation operations:
  - positioning and sizing objects
  - pushing/pulling on objects
  - attribute modification (of object/simulation)
  - controlling travel
- Tightly coupled to selection in a virtual environment
- A related and important issue is collision detection and management
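Image-plane techniques like Sticky Finger reduce 3D selection to a 2D problem: project candidate objects into the image plane and pick whichever lands closest to a 2D cue (the projected fingertip). The pinhole model looking down -Z is an assumption for this sketch:

```python
def project(point, eye=(0.0, 0.0, 0.0), focal=1.0):
    """Perspective projection of a 3D point onto the image plane.
    Simplified pinhole camera looking down -Z (assumed for this sketch)."""
    x, y, z = (p - e for p, e in zip(point, eye))
    if z >= 0:
        return None                 # behind the eye, not visible
    s = focal / -z
    return (x * s, y * s)

def image_plane_pick(cursor2d, objects, eye=(0.0, 0.0, 0.0)):
    """Sticky-Finger-style pick: choose the object whose projected centre
    lies closest to a 2D cue such as the projected fingertip position."""
    best, best_d2 = None, float("inf")
    for i, centre in enumerate(objects):
        p = project(centre, eye)
        if p is None:
            continue
        d2 = (p[0] - cursor2d[0]) ** 2 + (p[1] - cursor2d[1]) ** 2
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best
```

Note that depth drops out entirely: a far object that happens to line up with the fingertip is just as selectable as a near one, which is exactly what makes these techniques work at a distance.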
Gruchalla, IEEE VR 2004

Direct Metaphors
- The hand is modeled with limited degrees of freedom, often as a point: no posture, no grasp
- Okay for selection, but how to support manipulation?
  - Grasp with a button or pinch glove
  - After the grasp, the object moves with the hand
University of Delft, Project Moldrive
From Slater et al. (2002), Computer Graphics and Virtual Environments
Kim & Fellner (2005)

More complex direct interactions?
- Track the orientation of the hand for pose changes
- Modal manipulation through a secondary input device
- Tracking of finger flexion for grasp
- Orientation changes at a distance are difficult: bring the object to the user and use hand-centred manipulation
- Manipulation methods (Mine 1995)

Photos from Sherman & Craig, Understanding Virtual Reality, 2003
Exocentric Manipulation
- Manipulation from an external view (exocentric) rather than first person (egocentric)
  - World in miniature (WIM)
  - Scaled-world grab (Mark Mine, 1995)
- May break presence/immersion; indirect
- WIM does not scale easily and is difficult for precise positioning

Two-handed interaction
- Guiard, Y. (1987, The Journal of Motor Behaviour) stresses the asymmetry between the dominant and non-dominant hand:
  - Motion of the dominant hand uses the non-dominant hand as a spatial reference
  - The dominant hand performs finer and smaller-scale motions
  - Non-dominant hand motion typically precedes dominant hand motion
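The scaled-world grab can be sketched as a uniform scaling of the world about the eye: on grab, the world is shrunk (or grown) so the selected object arrives at the user's hand. Because the scaling is centred on the eye, every object keeps its visual direction and angular size, so the change is largely unnoticed. The formula below follows that geometric idea; the details are a sketch, not Mine's exact implementation:

```python
import math

def scaled_world_grab_factor(eye, hand, obj):
    """On grab, return the uniform scale (about the eye) that brings the
    selected object to the hand: hand distance / object distance."""
    return math.dist(eye, hand) / math.dist(eye, obj)

def scale_about_eye(point, eye, s):
    """Apply the uniform scale `s` to a world point about the eye position."""
    return tuple(e + s * (p - e) for p, e in zip(point, eye))
```

After the grab, the whole (scaled) world moves with the hand, giving hand-centred manipulation of arbitrarily distant objects; releasing restores the original scale.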
Navigation
- Two components to navigating:
  - wayfinding
  - travel

Wayfinding
- Refers to determining the current location and a path to the desired location
- Need to maneuver through the environment and avoid obstacles
- When wayfinding (or wandering), need to provide cognitive information to support location determination and wayfinding behaviour (navigational awareness)
- People are believed to form cognitive maps or models of the environment:
  - landmark
  - procedural
  - survey
- In VR we need to be able to navigate about the VE; a typical goal of wearable computers is to provide navigational awareness
- Pierce & Pausch, "Navigation with place representations and visible landmarks", IEEE VR 2005

Aids to wayfinding
- landmarks (natural or artificial)
- path following / leaving trails
- maps
- memorable place names
- compass / instruments
- exocentric views
- coordinate display, grid structure
- constrained travel
- With non-immersive displays and small fields of view:
  - Exocentric view preferred for global spatial awareness
  - Egocentric view preferred for local navigation
  - Tethered view as a compromise
- In immersive applications, egocentric views are the normal mode; exocentric views are used for reorientation and large-scale travel

Travel
- Current VR systems allow only small tracked work spaces
- To move to other places in the virtual environment (or in the real world) we need to support travel between two places
- Physical travel fits the real world
  - Locomotion on foot is the natural form of travel
  - Vehicular travel is also familiar: direct or indirect control and locomotion
- Travel through computer-based worlds using mice and joysticks is familiar to many (computer games, etc.)
- Travel through time is also useful
- Travel supports:
  - Exploration
  - Search
  - Maneuvering
- Travel requires control of the direction and speed of motion (steering)
- Locomotion: real-time user control of the virtual camera through the VE
- Real travel is limited in a VE; we need a travel metaphor

Travel Methods and Metaphors
- Real locomotion
  - modest distances, limited by tracking capabilities
  - some tricks possible (treadmills, walking in place)
  - typically some real locomotion is supported in addition to other techniques
  - difficulties with terrain following, collision detection and response
Route-planning metaphor
- Ride along a preplanned route (tow-rope or river metaphor)
- The path can be specified in a path-planning navigation task:
  - Waypoints
  - Icons
  - Path drawing
- Image from Sherman & Craig, Understanding Virtual Reality, 2003; image from Doug Bowman's dissertation

Fly mode
- The most common metaphor; lots of freedom; direction and speed control
- A walk-through is fly mode constrained to follow the terrain
- If no other interaction is possible, we have a walk-through VE
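The walk-through constraint (fly mode locked to the terrain) amounts to ignoring the vertical component of the requested motion and clamping the camera to ground height plus eye height. A minimal sketch, where `terrain_height` is an assumed callback returning ground elevation:

```python
def walkthrough_step(pos, move, terrain_height, eye_height=1.7):
    """One step of walk-through travel: apply the horizontal part of the
    requested move, then clamp the camera to terrain height + eye height.
    `terrain_height(x, z)` is assumed to return ground elevation at (x, z)."""
    x = pos[0] + move[0]
    z = pos[2] + move[2]            # vertical component of `move` is discarded
    return (x, terrain_height(x, z) + eye_height, z)
```

This is also where the slides' "terrain following" difficulty lives: a real system must query the terrain efficiently and handle steps, slopes, and collisions, not just a height lookup.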
Travel Methods and Metaphors: Direction Control (Steering)
- Gaze, head, or torso orientation
- Hand pointing or other gesture
- Tracked devices (wands)
- Two- or three-dimensional input devices: gamepads and joysticks, 3D input devices, wands
- Speed control: constant velocity, gesture, input device, select from a list
- Can drive vehicles using virtual or physical controls (steering wheels, flight yokes), often with increased control over speed and acceleration

http://www.cs.utah.edu/research/areas/ve
Saito, VR 2001
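Gaze-directed steering, the first option above, can be sketched as integrating the viewpoint along the tracked head's gaze direction each frame. The yaw/pitch axis convention (zero faces -Z) is assumed for this sketch:

```python
import math

def steer_step(pos, yaw, pitch, speed, dt):
    """Gaze-directed steering: advance the viewpoint along the direction given
    by tracked head yaw/pitch (radians) at `speed` m/s for a frame of `dt` s.
    Axis convention (assumed): yaw 0, pitch 0 faces -Z; positive pitch looks up."""
    cp = math.cos(pitch)
    direction = (-math.sin(yaw) * cp, math.sin(pitch), -math.cos(yaw) * cp)
    return tuple(p + speed * dt * d for p, d in zip(pos, direction))
```

The design trade-off is built into the code: because travel direction is tied to gaze, the user cannot look sideways while moving forward, which is why hand-pointing or torso-directed steering is often preferred for exploration.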
Komerska and Ware, 2003; Beckhaus, 2005

Manipulation methods
- Eye-in-hand and scene-in-hand viewpoint-control metaphors
- By hand: rotate around a target; move or scale the world
- Often used in visualization

Image from Doug Bowman's dissertation
"Put me there" methods
- Target-based metaphor: jump to a defined location
  - multimodal displays (speech, gesture): ask to go to a named location
  - point to a target
  - define a location in an exocentric, map, or world-in-miniature display
  - select from an application menu list; enter coordinates
  - step into a picture or through an aperture into a different world
- Teleporting can be disorienting
- World in miniature (Stoakley, 1995)
  - Hand-centred model of the VE for navigation
  - Coupled with a handheld scale model (prop)
- All methods except real locomotion dissociate real from virtual travel
- Problems with this dissociation:
  - conflict between real and virtual motion
  - unnatural, may interfere with wayfinding
  - not suitable for AR or wearable applications
Which travel mode is best?
- Real motion is the most natural but not always desirable in virtual environments
  - travelling large distances or through outer space, etc.
  - vehicular travel
  - fatigue
- Steering, fly, and natural locomotion are best for search and exploration of a new environment
- Map and manipulation methods are efficient for relative motion

Natural Motion
- Interest in extending the range of real physical motion
  - HiBall and other long-range trackers
  - vision-based AR trackers and wearable systems
  - GPS, landmark-based tracking
  - York Trike project

Application or System Control
- Changing the state of the simulation
  - Timeline, replay events
  - Modify the environment
  - Change behaviours
  - Annotate
- Interact with the underlying system
  - Develop and debug
- Communicating with other users or agents
- Controlling the behaviour of the simulation or of agents in the world
- Traditional 2D menus
  - Floating
  - Embedded (better presence?)
- 3D windows
- 1D ring menus
- Other 3D metaphors
- Can interfere with visualizing the VE (www.vrac.iastate.edu)
- Hot keys, specialised interfaces
- 2D interaction on a surface in the VE (e.g. a graphics tablet represented in the VE)
- Speech recognition
- Gestures
- Implicit controls
- Generally a less natural fit to the VR interaction model