VE Input Devices. Doug Bowman, Virginia Tech
2 Goals and Motivation
- Provide a practical introduction to the input devices used in VEs
- Examine common and state-of-the-art input devices: look for general trends, spark creativity
- Discuss advantages and disadvantages
- Discuss how different input devices affect interface design
(C) 2006 Doug Bowman, Virginia Tech
In this lecture we will discuss the various input and output devices that are used in 3D user interfaces and virtual environment applications.
3 Input devices
- Hardware that allows the user to communicate with the system
- Input device vs. interaction technique: a single device can implement many ITs

4 Human-computer interface
[Diagram: User <-> input devices / output devices <-> ITs <-> user interface software <-> system software]

5 Human-VE interface
[Diagram: User <-> input device(s) / display(s); environment model; tracking system]
Simulation loop:
- render
- check for events
- respond to events
- iterate simulation
- get new tracker data
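The simulation loop on this slide can be sketched in code. The sketch below is a minimal, self-contained illustration of the loop's order of operations; all class names (StubTracker, StubDisplay, Environment) are hypothetical stand-ins, not a real VE toolkit API.

```python
# Sketch of the slide's simulation loop: render, check for events,
# respond, iterate the simulation, get new tracker data.
# Every class here is a hypothetical stub, not a real VE toolkit.

class StubTracker:
    def __init__(self):
        self.head_pose = (0.0, 0.0, 0.0)
        self.frame = 0
    def poll_events(self):
        return []            # no button/gesture events in this stub
    def update(self):
        self.frame += 1      # pretend we read a new tracker sample

class StubDisplay:
    def render(self, env, head_pose):
        pass                 # a real display would draw env from head_pose

class Environment:
    def __init__(self):
        self.time = 0
    def handle(self, event):
        pass                 # respond to an input event
    def update(self):
        self.time += 1       # iterate simulation state (physics, behaviors)

def simulation_loop(tracker, displays, env, n_frames):
    for _ in range(n_frames):
        for d in displays:                # 1. render
            d.render(env, tracker.head_pose)
        for ev in tracker.poll_events():  # 2-3. check for / respond to events
            env.handle(ev)
        env.update()                      # 4. iterate simulation
        tracker.update()                  # 5. get new tracker data

env = Environment()
tracker = StubTracker()
simulation_loop(tracker, [StubDisplay()], env, n_frames=60)
```

Note that tracker data is read once per frame at the end of the loop, so the next frame renders from the freshest pose available.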
6 Input device characteristics
- Degrees of freedom (DOFs) and DOF composition (integral vs. separable)
- Type of electronics: digital vs. analog
- Range of reported values: discrete, continuous, or hybrid
- Data type of reported values: Boolean vs. integer vs. floating point

7 More input device characteristics
- User action required: active, passive, or hybrid
- Method of providing information: push vs. pull
- Intended use: locator, valuator, choice, ...
- Frame of reference: relative vs. absolute
- Properties sensed: position, motion, force, ...
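The characteristics on these two slides form a taxonomy, and one way to make the taxonomy concrete is to encode it as a small data structure so devices can be described and compared field by field. A sketch, with field names taken from the slides; the two example profiles are illustrative assumptions, not vendor specifications.

```python
# Encoding the slides' device-characteristic taxonomy as data.
# The example values for the two devices are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum

class ValueRange(Enum):
    DISCRETE = "discrete"
    CONTINUOUS = "continuous"
    HYBRID = "hybrid"

class FrameOfReference(Enum):
    RELATIVE = "relative"
    ABSOLUTE = "absolute"

@dataclass
class DeviceProfile:
    name: str
    dofs: int
    integral_dofs: bool        # integral vs. separable DOF composition
    analog: bool               # analog vs. digital electronics
    value_range: ValueRange
    frame: FrameOfReference
    properties_sensed: tuple   # e.g. ("position", "orientation", "force")

# An isometric desktop 6-DOF device (senses force/torque, reports relative motion)
spaceball = DeviceProfile(
    name="SpaceBall", dofs=6, integral_dofs=True, analog=True,
    value_range=ValueRange.CONTINUOUS, frame=FrameOfReference.RELATIVE,
    properties_sensed=("force", "torque"))

# An electromagnetic position tracker (absolute 6-DOF pose)
em_tracker = DeviceProfile(
    name="Polhemus Fastrak", dofs=6, integral_dofs=True, analog=True,
    value_range=ValueRange.CONTINUOUS, frame=FrameOfReference.ABSOLUTE,
    properties_sensed=("position", "orientation"))
```

Profiles like these make the relative-vs-absolute distinction explicit: the SpaceBall reports rate-like relative values, while the tracker reports an absolute pose.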
8 Practical classification system
- Desktop devices
- Tracking devices
- 3D mice
- Special-purpose devices
- Direct human input

9 Desktop devices: keyboards
- Chord keyboards [1]
- Arm-mounted keyboards [2]
- Soft keyboards (logical devices)

10 Desktop devices: 6-DOF devices
- 6 DOFs without tracking
- Often isometric
- Examples: SpaceBall, SpaceMouse, SpaceOrb
11 Tracking devices: position trackers
- Measure position and/or orientation of a sensor
- Degrees of freedom (DOFs)
- Most VEs track the head: motion parallax, natural viewing

12 Other uses for trackers
- Track hands, feet, etc.: whole-body interaction, motion-capture applications
- Correspondence between physical/virtual objects
- Props [5,6], spatial input devices
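The motion-parallax benefit of head tracking is easy to demonstrate numerically: when the view transform follows the tracked head, near objects shift across the image more than far ones. A minimal pinhole-projection sketch, assuming a fixed gaze direction; real head-tracked displays use the full tracked orientation and off-axis projection.

```python
# Why head tracking gives motion parallax: moving the viewpoint shifts the
# image of a near point much more than a far point. Simplified sketch with
# a fixed gaze down -z; not a full off-axis VE projection.
import numpy as np

def view_matrix(head_pos):
    """World-to-eye transform: just the translation part in this sketch."""
    m = np.eye(4)
    m[:3, 3] = -np.asarray(head_pos, dtype=float)
    return m

def project(point, head_pos, focal=1.0):
    """Pinhole projection of a world point as seen from head_pos."""
    p = view_matrix(head_pos) @ np.array([*point, 1.0])
    return (focal * p[0] / -p[2], focal * p[1] / -p[2])

near = (0.0, 0.0, -1.0)    # 1 m in front of the origin
far = (0.0, 0.0, -10.0)    # 10 m in front

# Move the tracked head 0.1 m to the right and compare image shifts:
dx_near = project(near, (0.1, 0, 0))[0] - project(near, (0, 0, 0))[0]
dx_far = project(far, (0.1, 0, 0))[0] - project(far, (0, 0, 0))[0]
```

The near point shifts ten times as much as the point ten times farther away, which is exactly the depth cue motion parallax provides.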
13 Tracking physical objects (props)
14 Electromagnetic trackers
- Examples: Polhemus Fastrak, Ascension Flock of Birds
- Most common (?)
- Transmitter and receiver(s)
- Noisy
- Affected by metal
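Because electromagnetic tracker samples are noisy, applications commonly smooth them before use. One cheap, generic remedy is an exponentially weighted moving average per axis; this is a standard smoothing sketch, not any vendor's actual filtering.

```python
# Exponentially weighted moving average over 3D tracker samples.
# alpha in (0, 1]: smaller alpha = smoother output but more lag,
# which is the classic noise/lag trade-off for tracked input.

def smooth(samples, alpha=0.2):
    filtered = []
    est = None
    for s in samples:
        if est is None:
            est = s
        else:
            est = tuple(alpha * si + (1 - alpha) * ei
                        for si, ei in zip(s, est))
        filtered.append(est)
    return filtered

# Alternating readings (pure jitter) settle toward a value in between:
noisy = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
result = smooth(noisy)
```

The lag this introduces matters for head tracking (it adds to end-to-end latency), which is one reason hybrid trackers combine a low-noise inertial sensor with a drift-free absolute sensor instead of filtering harder.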
15 Optical/vision-based trackers
- Examples: Vicon, HiBall, ARToolKit
- Advantages: accurate, can capture a large volume, allow untethered tracking
- Disadvantages: require image-processing techniques, occlusion problem

16 Inertial trackers
- Examples: InterSense IS-300, InterTrax2
- Less noise and lag
- Drift problem
- Only 3 DOFs (orientation)

17 Hybrid tracking
- Example: InterSense IS-600 / IS-900: inertial (orientation) + acoustic (position)
- Additional complexity and cost
18 Tracking devices: eye tracking
Eye tracking systems provide applications with knowledge of the user's gaze direction. This information opens the door to a number of interesting interaction techniques, such as eye-directed selection and manipulation. The figure on the left shows the Eyegaze system, a non-intrusive approach developed by LC Technologies that uses an infrared source reflected off the pupil. The figure on the right shows iView, a head-mounted eye tracking device developed by SensoMotoric Instruments.
19 Tracking devices: bend-sensing gloves
- Examples: CyberGlove [7], 5DT
- Reports hand posture
- Gesture: a single posture, a series of postures, or posture(s) + location or motion

20 Tracking devices: pinch gloves
- Conductive cloth at the fingertips
- Any gesture of 2 to 10 fingers, plus combinations of gestures
- More than 115,000 gestures
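The "> 115,000" figure can be reproduced by a counting argument. Assuming a gesture is any way of partitioning the ten fingertips into electrically connected contact groups (fingers pinched together short the same circuit), the count is the 10th Bell number minus the one trivial partition in which nothing touches. This is one plausible model behind the slide's number, not a derivation taken from the Pinch Glove documentation.

```python
# Counting pinch gestures as set partitions of the 10 fingertips into
# contact groups. Bell(10) = 115,975 partitions; subtracting the
# all-singletons partition (no contact at all) gives 115,974, which
# matches the slide's "> 115,000".

def bell(n):
    """n-th Bell number via the Bell triangle."""
    row = [1]
    for _ in range(n):
        nxt = [row[-1]]          # each row starts with the previous row's end
        for v in row:
            nxt.append(nxt[-1] + v)
        row = nxt
    return row[0]

gestures = bell(10) - 1
print(gestures)  # 115974
```

The same function gives the familiar small cases Bell(3) = 5 and Bell(4) = 15, which is a quick sanity check on the triangle construction.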
21 Case study: Pinch Gloves
- Pinch Gloves are designed to be a combination device (add a position tracker)
- Very little has been done with Pinch Gloves in VEs: usually 1 or 2 gestures for object selection, tool selection, or travel

22 Characteristics of Pinch Gloves
- Relatively low cost
- Very light
- User's hand becomes the device
- User's hand posture can change
- Allow two-handed interaction
- Huge number of possible gestures

23 Characteristics of Pinch Gloves II
- Much more reliable than data gloves
- Support eyes-off input
- Can diminish the Heisenberg effect (the tracked pose shifting at the moment a button is pressed)
- Support context-sensitive gesture interpretation
24 Pinch Gloves in SmartScene [13]
- Lots of two-handed gestures: scale world, rotate world, travel by grabbing the air, menu selection

25 Pinch Gloves for menus
- TULIP system [14]: non-dominant (ND) hand selects a menu, dominant (D) hand selects an item within that menu
- Limited to comfortable gestures
- Visual feedback on the virtual hands
- rapMenu
26 Pinch Gloves for text input
- Pinch Keyboard [14]: emulates QWERTY
- Pinch finger to thumb to type the letter under that finger
- Move/rotate hands to change the active letters
- Visual feedback
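The pinch-to-type idea can be sketched as a lookup: each non-thumb finger owns a keyboard column, a hand movement selects the active row, and a pinch emits the letter at that (row, column). The layout below is a hypothetical simplification for illustration, not the published Pinch Keyboard mapping; in particular it leaves the two center columns (t/y, g/h, b/n) unreachable, which the real technique handles by translating the hands.

```python
# Hypothetical Pinch-Keyboard-style mapping: 8 non-thumb fingers cover
# 8 QWERTY columns; wrist rotation picks the active row (0=top..2=bottom);
# pinching finger-to-thumb emits ROWS[row][column].

ROWS = ["qwertyuiop", "asdfghjkl;", "zxcvbnm,./"]

# finger index 0-3 = left pinky..left index, 4-7 = right index..right pinky
FINGER_TO_COLUMN = {0: 0, 1: 1, 2: 2, 3: 3, 4: 6, 5: 7, 6: 8, 7: 9}

def key_for(finger, wrist_row):
    """Letter typed by pinching `finger` (0-7) with the hands held so that
    `wrist_row` (0-2) is the active row."""
    return ROWS[wrist_row][FINGER_TO_COLUMN[finger]]

assert key_for(3, 1) == "f"   # left index, middle row
assert key_for(4, 0) == "u"   # right index, top row
```

Visual feedback (highlighting the currently active letters on the virtual hands) is what makes a mapping like this usable, since the user cannot see physical keycaps.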
27 3D mice
- Ring Mouse, Fly Mouse, Wand, Cubic Mouse, Dragonfly
The Ring Mouse (top picture) is a small device worn on the user's finger that uses ultrasonic tracking. It also has two buttons for generating discrete events. The main advantages of this device are that it is wireless and inexpensive. The Fly Mouse is a 3D mouse that also uses ultrasonic tracking; it has five buttons instead of two and can also be used as a microphone. The Cubic Mouse [12] (shown in the figure on the right) is an input device developed at GMD that allows users to intuitively specify three-dimensional coordinates in graphics applications. The device consists of a box with three perpendicular rods passing through its center and buttons for additional input.
28 Special-purpose devices: using conductive cloth
- Virtual toolbelt: used to select virtual tools; good use of proprioceptive cues
- Interaction Slippers [3]: step on displayed options; click heels to go home

29 Special-purpose devices: Painting Table [4]
The Painting Table is another example of a special-purpose input device, used in the CavePainting application, a system for painting 3D scenes in a virtual environment. The device uses a set of conductive-cloth contacts as well as traditional buttons and digital sliders. Users can dip the paintbrush prop into the colored cups to change brush strokes. The bucket is used to throw paint around the virtual canvas.

30 Special-purpose devices: ShapeTape [11]
ShapeTape is a continuous bend- and twist-sensitive strip that encourages two-handed manipulation. A bat (a tracked button device [9]) is attached, and the tool (shown in the figure on the right) is used for creating and editing curves and surfaces, along with camera control and command access. ShapeTape senses bend and twist with two fiber-optic sensors at 6 cm intervals.
31 Human input: speech
- Frees the hands
- Allows multimodal input
- No special hardware, but specialized software
- Issues: recognition accuracy, ambient noise, training, false positives, ...

32 Human input: bioelectric control
A recent development at NASA Ames Research Center is a bioelectric input device that reads muscle nerve signals emanating from the forearm. These nerve signals are captured by a dry electrode array on the arm, analyzed using pattern-recognition software, and then routed through a computer to issue the relevant interface commands. The figure on the left shows a user entering numbers on a virtual numeric keypad; the figure on the right shows a user controlling a virtual 757 aircraft.
Reference: Jorgensen, C., Wheeler, K., & Stepniewski, S. Bioelectric Control of a 757 Class High Fidelity Aircraft Simulation.

33 Human input: body-sensing devices
The MIT Media Lab's Affective Computing group has developed a prototype physiological sensing system that includes a galvanic skin response sensor, a blood volume pulse sensor, a respiration sensor, and an electromyogram. Using this prototype, interface developers can monitor a user's emotional state and dynamically modify an application's interface to better fit the user's needs.

34 More human input
- Breathing device (OSMOSE)
- Brain-body actuated control: muscle movements, thoughts!
35 Locomotion devices
- Treadmills
- Stationary cycles
- VMC / magic carpet
- Walking/flying simulations (use trackers)
36 UNIPORT
- First locomotion device for the U.S. Army (1994)
- Proof-of-concept demonstration, developed in six weeks
- Difficult to change direction of travel
- Small motions such as side-stepping are impossible

37 Treadport
- Developed in 1995
- Based on a standard treadmill, with the user monitored and constrained by a mechanical attachment at the waist
- User actually walks or jogs instead of pedaling
- Physical movement is constrained to one direction

38 Individual Soldier Mobility Simulator (Biport)
- Most sophisticated locomotion device, designed for the conduct of locomotion studies
- Hydraulic-based locomotion driven with force sensors at the feet
- Safety constraints limited its responsiveness
- Too awkward to operate

39 Omni-Directional Treadmill [15,16]
- Most recently developed locomotion device for the U.S. Army
- Enables bipedal locomotion in any direction of travel
- Consists of two perpendicular treadmills
- Two fundamental types of movement: user-initiated and system-initiated
40 Torus treadmill

41 ODT video
42 Virtual Motion Controller [17]
- Weight sensors in the platform sense the user's position over the platform
- Step in a direction to move in that direction
- Step further to go faster
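The control law on this slide can be sketched directly: the weight sensors yield a center of pressure, its direction from the platform center sets the travel direction, and its distance sets the speed. The square four-corner sensor layout, dead zone, and gain below are illustrative assumptions, not the VMC's actual calibration.

```python
# Sketch of a VMC-style control law: four corner weight sensors -> center
# of pressure -> travel velocity. Layout, dead zone, and gain are assumed.
import math

SENSORS = [(-1, -1), (1, -1), (1, 1), (-1, 1)]  # corner positions (m)

def center_of_pressure(weights):
    total = sum(weights)
    x = sum(w * p[0] for w, p in zip(weights, SENSORS)) / total
    y = sum(w * p[1] for w, p in zip(weights, SENSORS)) / total
    return x, y

def travel_velocity(weights, dead_zone=0.1, gain=2.0):
    """(vx, vy) in virtual m/s: direction from COP offset, speed from its
    magnitude, with a dead zone so standing near center means standing still."""
    x, y = center_of_pressure(weights)
    r = math.hypot(x, y)
    if r < dead_zone:
        return (0.0, 0.0)
    speed = gain * (r - dead_zone)   # step further -> go faster
    return (speed * x / r, speed * y / r)

# User steps toward the +x edge: more weight on the two +x corner sensors.
vx, vy = travel_velocity([10, 30, 30, 10])
```

The dead zone is the design choice that makes this a "sufficient-motion" interface: small postural sway is ignored, while a deliberate step produces motion.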
43 Walking in place [18,19]
- Analyze tracker information from the head, body, and feet
- Neural network (Slater); GAITER project (Templeman)
- Shown to be better than purely virtual movement, but worse than real walking [20]
44 Classification of locomotion devices/techniques

                  Virtual turning                               Real turning
Virtual motion    Desktop VEs, vehicle simulators, CAVE wand    Most HMD systems, walking in place, VMC
Real motion       Stationary cycles, Treadport, Biport          Wide-area tracking, UNIPORT, ODT
45 Input and output with a single device
- Classic example: the touch screen
- LCD tablets or PDAs with pen-based input
- Phantom haptic device
- FEELEX haptic device [21]

46 PDA as ideal VE device? [22]
- Offers both input and output
- Has on-board memory
- Wireless communication
- Portable, light, robust
- Allows text / number input
- Can be tracked to allow spatial input

47 Conclusions
When choosing a device, consider:
- Cost
- Generality
- DOFs
- Ergonomics / human factors
- Typical scenarios of use
- Output devices
- Interaction techniques
48 Acknowledgments
- Joe LaViola, Brown University, for slides and discussions
- Ron Spencer, for a presentation on locomotion devices used by the Army
49 References
[1] Matias, E., MacKenzie, I., & Buxton, W. (1993). Half-QWERTY: A One-handed Keyboard Facilitating Skill Transfer from QWERTY. Proceedings of ACM INTERCHI.
[2] Thomas, B., Tyerman, S., & Grimmer, K. (1998). Evaluation of Text Input Mechanisms for Wearable Computers. Virtual Reality: Research, Development, and Applications, 3.
[3] LaViola, J., Acevedo, D., Keefe, D., & Zeleznik, R. (2001). Hands-Free Multi-Scale Navigation in Virtual Environments. Proceedings of ACM Symposium on Interactive 3D Graphics, Research Triangle Park, North Carolina.
[4] Keefe, D., Feliz, D., Moscovich, T., Laidlaw, D., & LaViola, J. (2001). CavePainting: A Fully Immersive 3D Artistic Medium and Interactive Experience. Proceedings of ACM Symposium on Interactive 3D Graphics, Research Triangle Park, North Carolina.
[5] Bowman, D., Wineman, J., Hodges, L., & Allison, D. (1998). Designing Animal Habitats Within an Immersive VE. IEEE Computer Graphics & Applications, 18(5).
[6] Hinckley, K., Pausch, R., Goble, J., & Kassell, N. (1994). Passive Real-World Interface Props for Neurosurgical Visualization. Proceedings of CHI: Human Factors in Computing Systems.
[7] Kessler, G., Hodges, L., & Walker, N. (1995). Evaluation of the CyberGlove(TM) as a Whole Hand Input Device. ACM Transactions on Computer-Human Interaction, 2(4).
[8] LaViola, J., & Zeleznik, R. (1999). Flex and Pinch: A Case Study of Whole-Hand Input Design for Virtual Environment Interaction. Proceedings of the International Conference on Computer Graphics and Imaging.
[9] Ware, C., & Jessome, D. (1988). Using the Bat: a Six-Dimensional Mouse for Object Placement. IEEE Computer Graphics and Applications, 8(6).
[10] Zeleznik, R. C., Herndon, K. P., Robbins, D. C., Huang, N., Meyer, T., Parker, N., & Hughes, J. F. (1993). An Interactive 3D Toolkit for Constructing 3D Widgets. Proceedings of ACM SIGGRAPH, Anaheim, CA, USA.
50 References (2)
[11] Balakrishnan, R., Fitzmaurice, G., Kurtenbach, G., & Singh, K. (1999). Exploring Interactive Curve and Surface Manipulation Using a Bend and Twist Sensitive Input Strip. Proceedings of the ACM Symposium on Interactive 3D Graphics.
[12] Froehlich, B., & Plate, J. (2000). The Cubic Mouse: A New Device for Three-Dimensional Input. Proceedings of ACM CHI.
[13] Mapes, D., & Moshell, J. (1995). A Two-Handed Interface for Object Manipulation in Virtual Environments. Presence: Teleoperators and Virtual Environments, 4(4).
[14] Bowman, D., Wingrave, C., Campbell, J., & Ly, V. (2001). Using Pinch Gloves for both Natural and Abstract Interaction Techniques in Virtual Environments. Proceedings of HCI International, New Orleans, Louisiana.
[15] Darken, R., Cockayne, W., & Carmein, D. (1997). The Omni-directional Treadmill: A Locomotion Device for Virtual Worlds. Proceedings of ACM Symposium on User Interface Software and Technology.
[16] Iwata, H. (1999). Walking About Virtual Environments on an Infinite Floor. Proceedings of IEEE Virtual Reality, Houston, Texas.
[17] Wells, M., Peterson, B., & Aten, J. (1996). The Virtual Motion Controller: A Sufficient-Motion Walking Simulator. Proceedings of IEEE Virtual Reality Annual International Symposium, 1-8.
[18] Slater, M., Usoh, M., & Steed, A. (1995). Taking Steps: The Influence of a Walking Technique on Presence in Virtual Reality. ACM Transactions on Computer-Human Interaction, 2(3).
[19] Slater, M., Steed, A., & Usoh, M. (1995). The Virtual Treadmill: A Naturalistic Metaphor for Navigation in Immersive Virtual Environments. In Virtual Environments '95: Selected Papers of the Eurographics Workshops. New York: Springer-Wien.
[20] Usoh, M., Arthur, K., Whitton, M., Bastos, R., Steed, A., Slater, M., & Brooks, F. (1999). Walking > Walking-in-Place > Flying, in Virtual Environments. Proceedings of ACM SIGGRAPH.
51 References (3)
[21] Iwata, H., Yano, H., Nakaizumi, F., & Kawamura, R. (2001). Project FEELEX: Adding Haptic Surface to Graphics. Proceedings of ACM SIGGRAPH, Los Angeles.
[22] Watsen, K., Darken, R., & Capps, M. (1999). A Handheld Computer as an Interaction Device to a Virtual Environment. Proceedings of the Third Immersive Projection Technology Workshop.
[23] Zhai, S. (1998). User Performance in Relation to 3D Input Device Design. Computer Graphics, 32(4).
More informationCOMS W4172 Travel 2 Steven Feiner Department of Computer Science Columbia University New York, NY 10027 www.cs.columbia.edu/graphics/courses/csw4172 April 3, 2018 1 Physical Locomotion Walking Simulators
More informationA Brief Survey of HCI Technology. Lecture #3
A Brief Survey of HCI Technology Lecture #3 Agenda Evolution of HCI Technology Computer side Human side Scope of HCI 2 HCI: Historical Perspective Primitive age Charles Babbage s computer Punch card Command
More informationPractical Data Visualization and Virtual Reality. Virtual Reality VR Display Systems. Karljohan Lundin Palmerius
Practical Data Visualization and Virtual Reality Virtual Reality VR Display Systems Karljohan Lundin Palmerius Synopsis Virtual Reality basics Common display systems Visual modality Sound modality Interaction
More informationPanel: Lessons from IEEE Virtual Reality
Panel: Lessons from IEEE Virtual Reality Doug Bowman, PhD Professor. Virginia Tech, USA Anthony Steed, PhD Professor. University College London, UK Evan Suma, PhD Research Assistant Professor. University
More informationVR/AR Concepts in Architecture And Available Tools
VR/AR Concepts in Architecture And Available Tools Peter Kán Interactive Media Systems Group Institute of Software Technology and Interactive Systems TU Wien Outline 1. What can you do with virtual reality
More informationJane Li. Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute
Jane Li Assistant Professor Mechanical Engineering Department, Robotic Engineering Program Worcester Polytechnic Institute Use an example to explain what is admittance control? You may refer to exoskeleton
More informationProprioception & force sensing
Proprioception & force sensing Roope Raisamo Tampere Unit for Computer-Human Interaction (TAUCHI) School of Information Sciences University of Tampere, Finland Based on material by Jussi Rantala, Jukka
More informationNew Directions in 3D User Interfaces
New Directions in 3D User Interfaces Doug A. Bowman 1, Jian Chen, Chadwick A. Wingrave, John Lucas, Andrew Ray, Nicholas F. Polys, Qing Li, Yonca Haciahmetoglu, Ji-Sun Kim, Seonho Kim, Robert Boehringer,
More informationThe Effects of Finger-Walking in Place (FWIP) for Spatial Knowledge Acquisition in Virtual Environments
The Effects of Finger-Walking in Place (FWIP) for Spatial Knowledge Acquisition in Virtual Environments Ji-Sun Kim 1,,DenisGračanin 1,,Krešimir Matković 2,, and Francis Quek 1, 1 Virginia Tech, Blacksburg,
More informationWhat was the first gestural interface?
stanford hci group / cs247 Human-Computer Interaction Design Studio What was the first gestural interface? 15 January 2013 http://cs247.stanford.edu Theremin Myron Krueger 1 Myron Krueger There were things
More informationHaptic Camera Manipulation: Extending the Camera In Hand Metaphor
Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Joan De Boeck, Karin Coninx Expertise Center for Digital Media Limburgs Universitair Centrum Wetenschapspark 2, B-3590 Diepenbeek, Belgium
More informationEvaluating Visual/Motor Co-location in Fish-Tank Virtual Reality
Evaluating Visual/Motor Co-location in Fish-Tank Virtual Reality Robert J. Teather, Robert S. Allison, Wolfgang Stuerzlinger Department of Computer Science & Engineering York University Toronto, Canada
More informationInteraction in VR: Manipulation
Part 8: Interaction in VR: Manipulation Virtuelle Realität Wintersemester 2007/08 Prof. Bernhard Jung Overview Control Methods Selection Techniques Manipulation Techniques Taxonomy Further reading: D.
More informationTestbed Evaluation of Virtual Environment Interaction Techniques
Testbed Evaluation of Virtual Environment Interaction Techniques Doug A. Bowman Department of Computer Science (0106) Virginia Polytechnic & State University Blacksburg, VA 24061 USA (540) 231-7537 bowman@vt.edu
More information3D Sketching Using Interactive Fabric for Tangible and Bimanual Input
3D Sketching Using Interactive Fabric for Tangible and Bimanual Input Anamary Leal Center for Human-Computer Interaction, Virginia Tech Doug Bowman Center for Human-Computer Interaction, Virginia Tech
More informationThe architectural walkthrough one of the earliest
Editors: Michael R. Macedonia and Lawrence J. Rosenblum Designing Animal Habitats within an Immersive VE The architectural walkthrough one of the earliest virtual environment (VE) applications is still
More informationAUTOMATIC SPEED CONTROL FOR NAVIGATION IN 3D VIRTUAL ENVIRONMENT
AUTOMATIC SPEED CONTROL FOR NAVIGATION IN 3D VIRTUAL ENVIRONMENT DOMOKOS M. PAPOI A THESIS SUBMITTED TO THE FACULTY OF GRADUATE STUDIES IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER
More informationInteraction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application
Interaction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application Doug A. Bowman Graphics, Visualization, and Usability Center College of Computing Georgia Institute of Technology
More informationEffects of Handling Real Objects and Self-Avatar Fidelity on Cognitive. Task Performance and Sense of Presence in Virtual Environments
Effects of Handling Real Objects and Self-Avatar Fidelity on Cognitive Task Performance and Sense of Presence in Virtual Environments Benjamin Lok, University of Florida Samir Naik, Disney Imagineering
More informationImmersive Real Acting Space with Gesture Tracking Sensors
, pp.1-6 http://dx.doi.org/10.14257/astl.2013.39.01 Immersive Real Acting Space with Gesture Tracking Sensors Yoon-Seok Choi 1, Soonchul Jung 2, Jin-Sung Choi 3, Bon-Ki Koo 4 and Won-Hyung Lee 1* 1,2,3,4
More informationEffects of Handling Real Objects and Self-Avatar Fidelity On Cognitive Task Performance in Virtual Environments
Effects of Handling Real Objects and Self-Avatar Fidelity On Cognitive Task Performance in Virtual Environments Benjamin Lok University of North Carolina at Charlotte bclok@cs.uncc.edu Samir Naik, Mary
More informationFly Over, a 3D Interaction Technique for Navigation in Virtual Environments Independent from Tracking Devices
Author manuscript, published in "10th International Conference on Virtual Reality (VRIC 2008), Laval : France (2008)" Fly Over, a 3D Interaction Technique for Navigation in Virtual Environments Independent
More informationA Comparison of the Accuracy of an Electromagnetic and a Hybrid Ultrasound-Inertia Position Tracking System
FOR U M Short Papers A Comparison of the Accuracy of an Electromagnetic and a Hybrid Ultrasound-Inertia Position Tracking System Abstract Results of a comparison study of the tracking accuracy of two commercially
More informationRéalité Virtuelle et Interactions. Interaction 3D. Année / 5 Info à Polytech Paris-Sud. Cédric Fleury
Réalité Virtuelle et Interactions Interaction 3D Année 2016-2017 / 5 Info à Polytech Paris-Sud Cédric Fleury (cedric.fleury@lri.fr) Virtual Reality Virtual environment (VE) 3D virtual world Simulated by
More informationTRAVEL IN SMILE : A STUDY OF TWO IMMERSIVE MOTION CONTROL TECHNIQUES
IADIS International Conference Computer Graphics and Visualization 27 TRAVEL IN SMILE : A STUDY OF TWO IMMERSIVE MOTION CONTROL TECHNIQUES Nicoletta Adamo-Villani Purdue University, Department of Computer
More informationI R UNDERGRADUATE REPORT. Hardware and Design Factors for the Implementation of Virtual Reality as a Training Tool. by Walter Miranda Advisor:
UNDERGRADUATE REPORT Hardware and Design Factors for the Implementation of Virtual Reality as a Training Tool by Walter Miranda Advisor: UG 2006-10 I R INSTITUTE FOR SYSTEMS RESEARCH ISR develops, applies
More informationUser Interface Constraints for Immersive Virtual Environment Applications
User Interface Constraints for Immersive Virtual Environment Applications Doug A. Bowman and Larry F. Hodges {bowman, hodges}@cc.gatech.edu Graphics, Visualization, and Usability Center College of Computing
More informationIntegration of Hand Gesture and Multi Touch Gesture with Glove Type Device
2016 4th Intl Conf on Applied Computing and Information Technology/3rd Intl Conf on Computational Science/Intelligence and Applied Informatics/1st Intl Conf on Big Data, Cloud Computing, Data Science &
More informationSimultaneous Object Manipulation in Cooperative Virtual Environments
1 Simultaneous Object Manipulation in Cooperative Virtual Environments Abstract Cooperative manipulation refers to the simultaneous manipulation of a virtual object by multiple users in an immersive virtual
More informationEliminating Design and Execute Modes from Virtual Environment Authoring Systems
Eliminating Design and Execute Modes from Virtual Environment Authoring Systems Gary Marsden & Shih-min Yang Department of Computer Science, University of Cape Town, Cape Town, South Africa Email: gaz@cs.uct.ac.za,
More informationIssues and Challenges of 3D User Interfaces: Effects of Distraction
Issues and Challenges of 3D User Interfaces: Effects of Distraction Leslie Klein kleinl@in.tum.de In time critical tasks like when driving a car or in emergency management, 3D user interfaces provide an
More informationBuilding a bimanual gesture based 3D user interface for Blender
Modeling by Hand Building a bimanual gesture based 3D user interface for Blender Tatu Harviainen Helsinki University of Technology Telecommunications Software and Multimedia Laboratory Content 1. Background
More informationTracking. Alireza Bahmanpour, Emma Byrne, Jozef Doboš, Victor Mendoza and Pan Ye
Tracking Alireza Bahmanpour, Emma Byrne, Jozef Doboš, Victor Mendoza and Pan Ye Outline of this talk Introduction: what makes a good tracking system? Example hardware and their tradeoffs Taxonomy of tasks:
More informationA Method for Quantifying the Benefits of Immersion Using the CAVE
A Method for Quantifying the Benefits of Immersion Using the CAVE Abstract Immersive virtual environments (VEs) have often been described as a technology looking for an application. Part of the reluctance
More informationAural and Haptic Displays
Teil 5: Aural and Haptic Displays Virtuelle Realität Wintersemester 2007/08 Prof. Bernhard Jung Overview Aural Displays Haptic Displays Further information: The Haptics Community Web Site: http://haptic.mech.northwestern.edu/
More informationMagic Lenses and Two-Handed Interaction
Magic Lenses and Two-Handed Interaction Spot the difference between these examples and GUIs A student turns a page of a book while taking notes A driver changes gears while steering a car A recording engineer
More informationShort Course on Computational Illumination
Short Course on Computational Illumination University of Tampere August 9/10, 2012 Matthew Turk Computer Science Department and Media Arts and Technology Program University of California, Santa Barbara
More informationChapter 1 - Introduction
1 "We all agree that your theory is crazy, but is it crazy enough?" Niels Bohr (1885-1962) Chapter 1 - Introduction Augmented reality (AR) is the registration of projected computer-generated images over
More informationEnabling Cursor Control Using on Pinch Gesture Recognition
Enabling Cursor Control Using on Pinch Gesture Recognition Benjamin Baldus Debra Lauterbach Juan Lizarraga October 5, 2007 Abstract In this project we expect to develop a machine-user interface based on
More informationMOVING COWS IN SPACE: EXPLOITING PROPRIOCEPTION AS A FRAMEWORK FOR VIRTUAL ENVIRONMENT INTERACTION
1 MOVING COWS IN SPACE: EXPLOITING PROPRIOCEPTION AS A FRAMEWORK FOR VIRTUAL ENVIRONMENT INTERACTION Category: Research Format: Traditional Print Paper ABSTRACT Manipulation in immersive virtual environments
More information20th Century 3DUI Bib: Annotated Bibliography of 3D User Interfaces of the 20th Century
20th Century 3DUI Bib: Annotated Bibliography of 3D User Interfaces of the 20th Century Compiled by Ivan Poupyrev and Ernst Kruijff, 1999, 2000, 3 rd revision Contributors: Bowman, D., Billinghurst, M.,
More information