Interaction in VR: Manipulation

Part 8: Interaction in VR: Manipulation
Virtuelle Realität, Wintersemester 2007/08, Prof. Bernhard Jung

Overview
Control Methods
Selection Techniques
Manipulation Techniques
Taxonomy

Further reading:
D. A. Bowman, E. Kruijff, J. J. LaViola, I. Poupyrev. 3D User Interfaces. Addison-Wesley Professional. 2004. Chapter 5.
W. R. Sherman & A. Craig. Understanding Virtual Reality: Interface, Application, and Design. Morgan Kaufmann. 2002. Chapter 6.
D. A. Bowman (2002): Principles for the Design of Performance-oriented Interaction Techniques. In K. M. Stanney (Ed.): Handbook of Virtual Environments. Lawrence Erlbaum Associates. 2002.

Why Manipulation?
Major method of interaction with physical environments: touching, picking up, and manipulating objects with the hands is a main way for humans to affect their surroundings.
Major method of interaction with virtual environments; alternatives: voice, gaze, gesture.
Affects the quality of the entire 3D interface: if the user cannot efficiently manipulate objects in the virtual environment, then other high-level tasks cannot be accomplished.
Design of 3D manipulation techniques is difficult.

Manipulation Techniques: Design Objectives
Design 3D manipulation techniques that
  conform to the input and output devices used
  are effective in the desired task conditions: object distance (within/outside reach), object size and shape (small, large, flat objects), object density, required accuracy
  allow high user performance and comfort independent of user characteristics such as right- or left-handedness, level of expertise, age, ...
  are easy to learn
  conform to external constraints, e.g. can the user move or not (i.e. how much physical space is available)? e.g. price.

Control Methods for Manipulation Tasks
Direct User Control
Indirect Control: Physical Control, Virtual Control, Agent Control

Direct User Control
Participant interacts with virtual objects as they would with real objects (e.g. grabbing).
Generally uses gesture recognition to interpret the user's actions; needs VR gloves.
Virtual hand model with collision sensors.

Physical Control
Real buttons, switches, etc.
By putting the interface in the real world, the user can receive some haptic feedback.
Controls mounted on a prop can act independently (e.g. menus) or in concert (e.g. point + click) with the prop's position.

Virtual Controls
Controls that are manifested entirely in the virtual world, although physical controls may be used to manipulate the virtual controls.
E.g. virtual representations of buttons, sliders, steering wheels, ...
Allows a limited number of physical controls to be used to interact with a large number of virtual controls, just as a mouse can control sliders, dials, etc. in 2D GUIs.
May lose haptic feedback.
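As a concrete illustration of a virtual control driven by a tracked physical input, here is a minimal sketch (function and variable names are illustrative assumptions, not from the slides) that maps a controller position onto a one-dimensional virtual slider:

import numpy as np

def update_virtual_slider(controller_pos, slider_start, slider_end):
    # Project the tracked controller position onto the slider axis and
    # return a normalized value in [0, 1]. Assumes the slider has nonzero length.
    axis = slider_end - slider_start
    t = np.dot(controller_pos - slider_start, axis) / float(np.dot(axis, axis))
    return float(np.clip(t, 0.0, 1.0))

# Example: a 30 cm slider along x; the controller hovers a bit past its middle.
value = update_virtual_slider(np.array([0.17, 1.0, -0.5]),
                              np.array([0.0, 1.0, -0.5]),
                              np.array([0.3, 1.0, -0.5]))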

Agent Control
Tell another entity to do your bidding.
Uses voice recognition and possibly gesture recognition: "move <pointing> that table over <pointing> there".

Manipulation Tasks
Specific, complex manipulation tasks: e.g. sculpting, operating a specific virtual machine, ...
Canonical 3D manipulation tasks: select object; position, rotate, scale object; (change object or environment attributes)
Variables in manipulation tasks: distance to object, object size, required object translation (amount, depth manipulation), amount of rotation

Selection and Manipulation
Selection tasks: selecting direction, selecting items, selecting values

Direction Selection
Useful for travel control (navigation), for item selection (select an object in a direction), and for object manipulation (e.g. specify a desired object position).
Methods: pointer-directed, gaze-directed, crosshair-directed, torso-directed, device-directed, coordinate-directed, landmark-directed selection

Direction Selection
Pointer-directed selection
  Familiar to most users from real-world communication.
  Requires tracking the hand.
  Put-that-there, R. Bolt, MIT, 1980.
Gaze-directed selection
  What I'm looking at is what I'm interested in.
  Also familiar from real-world interaction.
  Requires only that the head be tracked.
  Forces the user to look in the direction they are selecting.
Crosshair-directed selection
  Vaguely familiar to many people.
  Requires tracking of both the head and the hand.
  Requires use of both the head and the hand.
Torso-directed selection
  Good for indicating the direction of travel (very common for self-locomotion in the real world).
  Requires tracking of the torso.
Landmark-directed selection
  Direction can be specified relative to some object in the environment, e.g. "toward the water tower".
  Requires some means of indicating the landmark object (e.g. voice).

Direction Selection
Device-directed selection
  Uses a physical control, e.g. joystick, flystick, steering wheel.
  May be relative to an absolute reference (e.g. north) or to the current direction.
Coordinate-directed selection
  May be relative to an absolute reference (e.g. "30 degrees east of north") or to the current direction (e.g. "turn left 45 degrees").
  Coordinates must be input by a value-selection technique (e.g. voice).

Item Selection
Techniques: contact-select, 3D-cursor-select, point-to-select, aperture-select, image-plane techniques, select-in-mini-world, name-to-select, menu-select
[Figures: virtual hand technique (from Poupyrev et al., 1996); virtual pointer technique (from Bowman et al., 1997)]

Item Selection
Selection by contact ("Virtual Hand")
  Part of the user's avatar (or a prop held by the user) must come in contact with the desired object.
  May require a trigger event to select, may require a trigger event to manipulate, or the contact itself may trigger an event.
  Feedback is good; may be provided visually, aurally, haptically.
  + Most natural selection technique
  - Limited area of manipulation

Go-Go (extension of simple contact selection with the "Virtual Hand")
  Extends the virtual hand's reaching distance.
  If the user's hand is close to the body, the mapping between the physical and virtual hand positions is 1-to-1.
  If the hand is extended beyond a threshold, the mapping becomes nonlinear.
  [Figure: mapping function for Go-Go (Poupyrev et al., 1996)]
  + Seamless 6DOF manipulation over a large range of distances
  - Manipulation range is still limited; overshoot at large distances
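A minimal sketch of the Go-Go arm-extension mapping described above: linear within a threshold distance of the torso, quadratic beyond it. The constants D and k are illustrative choices, not values from Poupyrev et al.

import numpy as np

def gogo_virtual_hand(physical_hand_pos, torso_pos, D=0.5, k=1.0):
    # Go-Go mapping: 1-to-1 within distance D of the torso,
    # nonlinear (quadratic) extension beyond it.
    offset = physical_hand_pos - torso_pos
    r_real = np.linalg.norm(offset)
    if r_real < D:
        r_virtual = r_real                        # linear region
    else:
        r_virtual = r_real + k * (r_real - D)**2  # nonlinear extension
    direction = offset / r_real if r_real > 0 else offset
    return torso_pos + r_virtual * direction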

Item Selection
Selection by 3D cursor
  The user controls a 3D cursor by some means (e.g. flown by joystick).
  Selection is generated when the cursor comes in contact with an object.
  If the cursor is attached to a hand-held prop, this is very similar to selection by contact.

Selection by pointing (Ray-Casting, Virtual Pointer)
  Uses the direction of pointing to indicate an object.
  Similar to contact selection, but the object can be out of reach.
  Familiar to most users from real-world experience.
  User only needs to control 2 DOFs.
  Empirically proven to perform well.
  Shape of beam: ray (finite, infinite) or cone ("flashlight", "aperture"); may be visualized or not.
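A minimal ray-casting sketch, under the simplifying assumption that selectable objects are approximated by bounding spheres (names and the sphere approximation are illustrative):

import numpy as np

def raycast_select(origin, direction, objects):
    # Return the nearest object hit by the pointing ray.
    # `objects` is a list of (object_id, center, radius) bounding spheres;
    # `direction` is assumed to be normalized.
    best, best_t = None, np.inf
    for obj_id, center, radius in objects:
        oc = center - origin
        t = np.dot(oc, direction)        # distance along the ray to the closest approach
        if t < 0:
            continue                     # object lies behind the user
        d2 = np.dot(oc, oc) - t * t      # squared distance from the ray to the sphere center
        if d2 <= radius * radius and t < best_t:
            best, best_t = obj_id, t
    return best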

Item Selection
Selection by aperture
  Improvement of simple ray-casting (the diameter of the selection cone can be controlled).
  The "aperture" is usually indicated with the user's fingers.
  Requires tracking of the "aperture" (e.g. fingers).
  Requires tracking of the eye location (i.e. head), and knowing which eye the user is using.
  Similar to direction selection using crosshairs.
  + Interactive and intuitive object disambiguation
  - Inefficient positioning / rotation
  [Figure: aperture technique (Forsberg et al., 1996)]

Image-Plane Techniques
  User selects an object by touching its position on an image plane (e.g. table-top of the responsive workbench, a wall of a CAVE, ...).
  Select the object underneath the user's finger (occlusion, "sticky finger") or the object framed between thumb and index finger.
  Manipulation: manipulate the 2D projection, or scale the object down and bring it within the user's reach ("Scaled-World Grab"; Mine, 1997).
  [Figure: sticky finger (Pierce et al., 1997)]
  + Easy, intuitive selection
  - Remote object manipulation difficult
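The "sticky finger" image-plane selection can be reduced to casting a ray from the tracked eye point through the fingertip. A sketch under that assumption, reusing the raycast_select helper from the previous sketch:

import numpy as np

def sticky_finger_select(eye_pos, fingertip_pos, objects):
    # Occlusion-based selection: the object visually covered by the
    # fingertip is the first object hit by the eye-through-finger ray.
    direction = fingertip_pos - eye_pos
    direction = direction / np.linalg.norm(direction)
    return raycast_select(eye_pos, direction, objects)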

Item Selection
Selection by menu
  Familiar to most computer users.
  Requires a list of all possible items; items don't have to be objects in the world.
  Requires some means of input to indicate that a menu choice is selected.
  Menu positioning: fixed, head-centered, body-centered, hand-centered.

Selection in a Miniature World (World-in-Miniature, WIM)
  A subset of the real world is represented as a small model on a palette.
  Exocentric frame of reference, i.e. not egocentric like virtual hand or virtual pointer.
  + Allows 6DOF manipulation at any distance
  - Difficult to precisely manipulate small objects

Selection by naming
  Very familiar to users.
  Requires voice recognition.
  Requires that the user know the name of the object; possibly more generic descriptions, e.g. "the yellow object in the back", which might be ambiguous.

Evaluation of Item Selection Techniques: Ray-Casting vs. Image-Plane vs. Go-Go
Two experimental evaluations: Poupyrev et al., 1998, and Bowman et al., 1999.
Ray-casting and image-plane techniques are generally more effective than Go-Go.
Exception: high-precision selection, e.g. small or far-away objects (about 4 degrees of field of view), can be easier with Go-Go; cf. Fitts' law (Bowman et al., 1999).

Alphanumeric Value Selection
Avoided in most virtual reality interactions, but sometimes necessary.
Typically uses some form of physical or virtual control (e.g. dials, keyboard, tablet).
May use a menu of pre-selected values.
May use voice input (S. Conrad, 2007).

Guidelines for Designing Selection Techniques (Bowman, 2002)
Use the natural virtual hand technique if all selection is within arm's reach.
Use ray-casting techniques if speed of remote selection is a requirement.
Ensure that the chosen selection technique integrates well with the manipulation technique to be used.
Consider multimodal input for combined selection and command tasks.
If possible, design the environment to maximize the perceived size of objects.

Manipulation
Once we've selected an object, we may want to manipulate it.
A variety of operations we may wish to perform: object positioning and scaling, exerting force on a virtual object, object attribute modification, altering the state of virtual controls, global attribute modification, travel controls (see the next lecture on navigation in VEs).

Manipulation
Object positioning and scaling
  Changing the shape and position of an object without regard to physics.
  Can use any of the control methods (direct, physical, virtual, agent), but the direct method is the most common.
  Scale and rotation operations also require specification of some origin or axis to operate about.
  Constrained positioning: snap-to-grid, lock-to-surface, snap-to-object, ... (see the sketch below).
Exerting force on a virtual object
  Uses a world model with physics: pushing, pulling, supporting, hitting objects, etc.
  Generally used for worlds that attempt to be more "realistic": moving objects is done by pushing them, or picking them up and setting them down.
  Manipulation involves much more than just interaction.
Example: constrained manipulation in virtual design / assembly simulation.
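Constrained positioning such as snap-to-grid can be expressed as a post-processing step on the manipulated position; a minimal sketch (the 10 cm cell size is an arbitrary example):

import numpy as np

def snap_to_grid(position, cell_size=0.1):
    # Quantize the manipulated position to the nearest grid point.
    return np.round(position / cell_size) * cell_size

# Example: a dragged object at (0.34, 1.02, -0.49) snaps to (0.3, 1.0, -0.5).
snapped = snap_to_grid(np.array([0.34, 1.02, -0.49]))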

Manipulation
Object attribute modification
  Changing the parameters that control how an object is rendered or behaves, adding constraints, ...
  These operations typically do not mimic the real world (e.g. setting the color of a fence vs. painting it).
Altering the state of virtual controls
  Depending on how the virtual control is designed, this can be viewed as a subset of "object positioning" or of "object attribute modification."
  The changed value of the virtual control may then be used to select, modify or steer some other object in the world.
Global attribute modification
  Similar to specific object attribute modification, but without requiring the object to be selected.
  E.g. adjusting the overall volume of the world; adjusting the time of day and therefore the ambient light in the world.

Common manipulation techniques: simple virtual hand, HOMER, scaled-world grab, world-in-miniature

Simple virtual hand technique
Attach the object to the virtual hand by making the object a child of the hand node; on release, reattach the object to the world (see the reparenting sketch below).
Also applies to Go-Go (and other arm-extension techniques) and to ray-casting.
[Figure: scene graph before and after grabbing (Root, head, hand, building); the building node is reparented under the hand]

HOMER technique (Hand-Centered Object Manipulation Extending Ray-Casting), Bowman, D., & Hodges, L. (1997)
Hybrid method: select by ray-casting, manipulate with the hand.
[Figure: scaled mapping over time between the physical hand's distance from the torso (0.3 m, 0.6 m) and the virtual hand's distance (1.0 m, 2.0 m)]
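The reparenting step of the simple virtual hand technique, sketched with a hypothetical minimal scene-graph node class (not from the slides): attaching keeps the object's world transform by recomputing its local matrix relative to the new parent.

import numpy as np

class Node:
    # Minimal scene-graph node: local 4x4 transform plus parent link.
    def __init__(self, local=None, parent=None):
        self.local = np.eye(4) if local is None else local
        self.parent = parent

    def world(self):
        # World transform = parent's world transform times own local transform.
        return self.local if self.parent is None else self.parent.world() @ self.local

def reparent(obj, new_parent):
    # Attach `obj` under `new_parent` without changing its world pose.
    obj.local = np.linalg.inv(new_parent.world()) @ obj.world()
    obj.parent = new_parent

# On grab: reparent(building, hand); on release: reparent(building, root).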

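A sketch of HOMER's manipulation phase under the usual linear scaled-mapping reading of the slide: at selection the virtual hand jumps to the object, and afterwards its distance from the torso is the physical hand's distance scaled by the ratio measured at selection time (class and variable names are illustrative assumptions).

import numpy as np

class Homer:
    def __init__(self, torso_pos, physical_hand_pos, object_pos):
        # Scaling factor fixed at selection time: object distance / hand distance.
        self.torso = torso_pos
        d_hand = np.linalg.norm(physical_hand_pos - self.torso)
        d_obj = np.linalg.norm(object_pos - self.torso)
        self.scale = d_obj / d_hand

    def virtual_hand(self, physical_hand_pos):
        # Each frame: place the virtual hand (and the attached object) along the
        # torso-to-hand direction, at the scaled distance.
        offset = physical_hand_pos - self.torso
        return self.torso + self.scale * offset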
Scaled-world grab technique (Mine, M., Brooks, F., & Sequin, C., 1997)
Often used with occlusion (image-plane item selection).
At selection: scale the user up (or the world down) so that the virtual hand is actually touching the selected object.
The user doesn't notice a change in the image until he moves.
At release: re-attach the object to the world, scale the user down to the original size, and ensure that the eye remains in the same position.
[Figure: select / manipulate phases with scaling of the user]

Discussion of hybrid manipulation techniques: HOMER (Bowman et al., 1997), scaled-world grab (Mine et al., 1997)
Advantages:
  Easy selection: ray-casting or image plane.
  6DOF manipulation over a wide range of distances.
  Mine: manipulation within the normal area of reach.
Disadvantages:
  Moving objects from within reach to far away is problematic.
  Inconsistency in the mappings between physical and virtual hand movements.
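Scaled-world grab can be summarized as a uniform scale of the world (or, equivalently, of the user) about the eye point, by the ratio of the eye-to-hand and eye-to-object distances; because the scale is centered on the eye, the rendered image does not change until the user moves. A sketch of that scaling step (helper names are illustrative):

import numpy as np

def scaled_world_grab_factor(eye_pos, hand_pos, object_pos):
    # Scale factor that brings the selected object to the virtual hand
    # when the world is scaled uniformly about the eye point.
    return np.linalg.norm(hand_pos - eye_pos) / np.linalg.norm(object_pos - eye_pos)

def scale_about_point(point, s):
    # 4x4 matrix for a uniform scale by s about `point` (applied to the world root).
    T = np.eye(4); T[:3, 3] = point
    S = np.diag([s, s, s, 1.0])
    T_inv = np.eye(4); T_inv[:3, 3] = -point
    return T @ S @ T_inv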

World-in-miniature (WIM) technique (Stoakley, R., Conway, M., & Pausch, R., 1995)
http://www.cs.cmu.edu/~stage3/publications/95/conferences/chi/paper.html
A "dollhouse" world held in the user's hand.
Miniature objects can be manipulated directly; moving miniature objects affects the full-scale objects.
Can also be used for navigation.
On selection: determine which full-scale object corresponds to the selected miniature object; attach the miniature object to the virtual hand (without moving the object).
Each frame: copy the local position matrix of the miniature object to the corresponding full-scale object (see the sketch after this slide).

Guidelines for Designing Manipulation Techniques (Bowman, 2002)
Reduce the number of degrees of freedom to be manipulated if the application allows it.
Provide general or application-specific constraints or manipulation aids.
Allow direct manipulation with the virtual hand instead of using a tool (e.g. a virtual light ray).
Avoid repeated, frequent scaling of the user or the environment.
Use indirect depth manipulation for increased efficiency and accuracy.
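The per-frame step of the WIM technique, copying the miniature object's local matrix to its full-scale counterpart, could look like the sketch below (reusing the hypothetical Node class from the earlier sketch; the correspondence mapping and the assumption that miniature and full-scale objects sit at matching depths of their subtrees are illustrative):

def update_wim(correspondence):
    # Each frame, mirror manipulation of miniatures onto the full-scale world.
    # `correspondence` maps miniature Node -> full-scale Node; copying the
    # local matrix reproduces the miniature's motion at full scale.
    for miniature, full_scale in correspondence.items():
        full_scale.local = miniature.local.copy()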

Taxonomy of Selection and Manipulation Techniques (1/3): Selection Techniques (Bowman & Hodges, 1999)
Selection
  Feedback: graphical, force / tactile, audio
  Indication of Object:
    object touching
    pointing: 2D, 3D hand, 3D gaze
    occlusion / framing
    indirect selection: from list, voice selection, iconic objects
  Indication to Select: gesture, button, voice command, no explicit command

Taxonomy of Selection and Manipulation Techniques (2/3): Manipulation Techniques (Bowman & Hodges, 1999)
Manipulation
  Object Attachment: attach to hand, attach to gaze, hand moves to object, object moves to hand, user / object scaling
  Object Position: no control, 1-to-N hand-to-object motion, maintain body-hand relation, other hand mappings, indirect control
  Object Orientation: no control, 1-to-N hand-to-object rotation, other hand mappings, indirect control
  Feedback: graphical, force / tactile, audio

Taxonomy of Selection and Manipulation Techniques (3/3): Release Techniques (Bowman & Hodges, 1999)
Release
  Indication to drop: gesture, button, voice command
  Object final location: remain in current location, adjust position, adjust orientation

Two-Handed Interfaces
Many natural manipulation tasks are two-handed.
Use of bimanual interfaces is not common in VR applications: extra cost for a second VR glove; conventional desktop interfaces commonly use only a single input device (mouse).
Many bimanual tasks are asymmetric (Guiard, 1987): the non-dominant hand defines a frame of reference, i.e. a spatial context, while the dominant hand performs fine manipulations.
Bimanual interfaces allow the simultaneous specification of multiple parameters, e.g. grasping an object with two hands implies a rotation axis; in single-handed interfaces, such actions often require several steps, e.g. specify the rotation axis, then specify the rotation amount (see the sketch below).
[Figures: Voodoo Dolls technique for manipulation at a distance (Pierce et al., 1999); tangible interaction with props (Hinckley et al., 1994)]
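As a small illustration of how two tracked hands specify several parameters at once, the sketch below derives translation, rotation axis/angle, and scale from the hand-to-hand segment across two frames (an illustrative sketch, not a technique from the slides; twist about the segment itself is not captured):

import numpy as np

def two_handed_transform(prev_left, prev_right, left, right):
    # The hand-to-hand segment simultaneously specifies position (midpoint),
    # orientation (segment direction), and scale (segment length).
    v0, v1 = prev_right - prev_left, right - left
    translation = (left + right) / 2 - (prev_left + prev_right) / 2
    scale = np.linalg.norm(v1) / np.linalg.norm(v0)
    axis = np.cross(v0, v1)
    norm = np.linalg.norm(axis)
    angle = np.arctan2(norm, np.dot(v0, v1))      # angle between the two segments
    axis = axis / norm if norm > 1e-9 else np.array([0.0, 0.0, 1.0])
    return translation, axis, angle, scale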

Grasp Taxonomy by Cutkosky & Wright
[Figure: Cutkosky & Wright grasp taxonomy]

Grasp Analysis: Low- and Mid-Level Grasp Features (Weber, Heumer & Jung, 2005)
Track finger and hand joint angle values with a CyberGlove.
Modified by virtual collision sensors and joint constraints into a realistic grasp posture.
Contact points of the hand with the object (see the Contact Web of Kang/Ikeuchi).
[Figure: contact web]

Grasp Analysis: High-Level Grasp Features (Weber, Heumer & Jung, 2005)
Based on low-level and medium-level features, the grasp can be classified with respect to a taxonomy.
The grasp category yields further features: prehensile / non-prehensile, volar / non-volar, power / precision grasp.
Further reasoning can be performed based on object type and grasp features, for instance grasp purpose: displacement or use, etc.
Basis for AI methods (object-specific reasoning, ...).

Grasp Taxonomy by Schlesinger, 1919
[Figure: cylindrical, hook, spherical, palmar, lateral, and tip grasps]
G. Schlesinger, Der Mechanische Aufbau der künstlichen Glieder, 1919.
Summarized by C. L. Taylor and R. J. Schwarz in The Anatomy and Mechanics of the Human Hand, 1955.
6 grasp types: cylindrical, tip, hook, palmar, spherical, and lateral.
Few, but elementary classes.

Grasp Analysis: A Machine Learning Approach (Heumer, Ben Amor, Weber & Jung, IEEE VR 2007)
Direct classification of raw data-glove sensor readings with respect to the Schlesinger taxonomy.
No calibration of the data glove!
Evaluation of more than 30 classifiers from data mining (WEKA data mining software, Waikato University, New Zealand); a sketch of this setup follows below.
6 classifier categories:
  Bayes: probabilistic reasoning
  Function approximators: learning of functions, e.g. neural nets
  Lazy learners
  Decision trees
  Rules: induction of logical rules
  Meta: combination of simple classifiers into hierarchies or cascades

Grasp Analysis: A Machine Learning Approach (Heumer, Ben Amor, Weber & Jung, IEEE VR 2007) (continued)
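A minimal sketch of the classification setup described above, with scikit-learn standing in for WEKA; the feature layout, file names, and classifier choice are illustrative assumptions, not details from the paper.

import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# X: raw (uncalibrated) data-glove sensor readings, one row per recorded grasp,
#    e.g. 22 bend-sensor values for a CyberGlove.
# y: Schlesinger grasp classes: cylindrical, tip, hook, palmar, spherical, lateral.
X = np.load("glove_samples.npy")   # shape (n_samples, n_sensors); placeholder file name
y = np.load("grasp_labels.npy")    # shape (n_samples,); placeholder file name

clf = DecisionTreeClassifier()               # one of the compared classifier families (decision trees)
scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation accuracy
print(scores.mean())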

Myths and Reality (Poupyrev, SIGGRAPH 2000)
Myth: Manipulation techniques should strictly imitate real-world manipulation.
Reality: Most manipulation techniques depart from real-world manipulation to a greater or lesser degree.
Myth: We should develop universal manipulation techniques.
Reality: There is no one best technique for every condition of immersive manipulation.
Myth: Manipulation techniques should be 6DOF.
Reality: Constraining the DOF of manipulation can be an efficient method of making interaction easier.
Myth: To improve interaction, we should design better devices and interaction techniques.
Reality: We can design the VE so that existing techniques allow for maximum performance.