Using Pinch Gloves for both Natural and Abstract Interaction Techniques in Virtual Environments


Doug A. Bowman, Chadwick A. Wingrave, Joshua M. Campbell, and Vinh Q. Ly
Department of Computer Science (0106), Virginia Tech, Blacksburg, VA 24061, USA
{bowman, cwingrav, jocampbe, vly}@vt.edu

Abstract

Usable three-dimensional (3D) interaction techniques are difficult to design, implement, and evaluate. One reason for this is a poor understanding of the advantages and disadvantages of the wide range of 3D input devices, and of the mapping between input devices and interaction techniques. We present an analysis of Pinch Gloves and their use as input devices for virtual environments (VEs). We have developed a number of novel and usable interaction techniques for VEs using the gloves, including a menu system, a technique for text input, and a two-handed navigation technique. User studies have indicated the usability and utility of these techniques.

1 Introduction

The number of input devices available for use in three-dimensional (3D) virtual environments (VEs) is large and still growing. Despite the proliferation of specialized 3D input devices, it is often difficult for the designers of VE applications to choose an appropriate device for the tasks users must perform, and to develop usable interaction techniques using those devices. The difficulties of 3D interaction (Herndon, van Dam, & Gleicher, 1994) have led to guidelines for the design of 3D interfaces (Gabbard, 1997; Kaur, 1999) and techniques for their evaluation (Bowman, Johnson, & Hodges, 1999; Hix et al., 1999), but there is little work on the design and evaluation of 3D input devices, and on the mappings between input devices, interaction techniques, and applications. In our research, we have been designing and evaluating 3D interaction techniques using Fakespace Pinch Gloves, combined with six-DOF trackers, as 3D input devices. We had a dual motivation for this work.
First, we were dissatisfied with the standard VE input devices (tracked wands, pens, and tablets) because of their poor accuracy and ergonomic problems. Second, we observed that although Pinch Gloves were relatively inexpensive and capable of a very large number of discrete inputs, they were not often used in VE applications, except for a few extremely simple techniques. Our goal is to determine whether the gloves can be used in more complex ways for a variety of interaction tasks in VEs, and to develop guidelines for the appropriate and effective use of these input devices. We begin by summarizing related work, then discuss the characteristics of Pinch Gloves that enable their use for novel 3D interaction techniques. Three specific techniques are presented, using the gloves for menus, travel, and text input. We conclude with some generalizations about interaction using the gloves.

2 Related work

There have been a few attempts to organize and understand the design space of input devices for 3D applications, most notably the work of Shumin Zhai (Zhai & Milgram, 1993; Zhai, Milgram, & Buxton, 1996). However, there is still work to be done before we understand how to design interaction techniques that take advantage of the properties of a particular device, and how to choose an appropriate device for a specific application. We are not aware of any empirical studies involving Pinch Gloves, but they have been used for various interaction techniques (Mapes & Moshell, 1995; Pierce, Stearns, & Pausch, 1999) and applications (Cutler, Frohlich, & Hanrahan, 1997). Many types of menus have been developed for use in VEs (Angus & Sowizral, 1995; Jacoby & Ellis, 1992; Mine, 1997; Mine, Brooks, & Sequin, 1997), but none of them simultaneously address the issues of efficient, comfortable, and precise selection of a large number of menu items. Similarly, there are many techniques for 3D travel (viewpoint movement) (Bowman, Koller, & Hodges, 1997; Mine, 1995).
We present a technique that takes advantage of two-handed interaction and allows users to control velocity. There is little prior work on text input for VEs; this problem has been largely avoided because of its difficulty. Poupyrev developed a system to allow handwritten input in VEs (Poupyrev, Tomokazu, & Weghorst, 1998), but the input was saved only as digital ink and not interpreted to allow true text input. Our technique allows intuitive and precise input of actual text, without requiring the user to learn any special key chords.

3 Pinch Gloves as a VE input device

Fakespace Pinch Gloves are commercial input devices designed for use in VEs. They consist of flexible cloth gloves augmented with conductive cloth sewn into the tips of each of the fingers (figure 1). When two or more pieces of conductive cloth come into contact with one another, a signal is sent back to the host computer indicating which fingers are being pinched. The gloves also have Velcro on the back of the hand so that a position tracker can be mounted there. They are distinct from whole-hand glove input devices, such as the CyberGlove, which report continuous joint-angle information used for gesture or posture recognition. Pinch Gloves can be viewed as a choice device with a very large number of possible choices. Practically, however, the gloves have some interesting characteristics that differentiate them from a set of buttons or switches. The most obvious difference between Pinch Gloves and other choice devices is the huge number of possible pinch gestures allowed by the gloves: gestures involving any combination of two to ten fingers all touching one another, plus gestures involving multiple separate but simultaneous pinches (for example, left thumb and index finger and right thumb and index finger pinched separately, but at the same time). This large gesture space alone, however, is not sufficient for usability. Arbitrary combinations of fingers can be mapped to any command, but those gestures will not necessarily provide affordances or memorability. The official web site for the gloves recognizes this fact: "Users can program as many functions as can be remembered" (Fakespace, 2001). In addition, many of the possible gestures are physically implausible.
In order to take advantage of the large gesture space, we need to use comfortable pinch gestures that either have natural affordances or whose function the user can easily determine. The gloves also have desirable ergonomic characteristics. They are very light and flexible, so they do not cause fatigue or discomfort with extended use. Moreover, the user's hand becomes the input device; it is not necessary to hold another device in the hand. Thus, users can change the posture of their hand at any time, unlike with pens or joysticks, which might force the user to hold the hand in the same posture continuously. They also support eyes-off interaction: the user's proprioceptive senses allow him/her to pinch fingers together easily without looking at them, whereas some other devices may force the user to search for the buttons or controls. As noted, the gloves can be combined with tracking devices for spatial input. This allows context-sensitive interpretation of pinch gestures. For example, pinching the thumb and index finger while touching a virtual object may mean "select," while the same pinch in empty space may mean "OK." The use of trackers also means the gloves can support two-handed interaction, known to be a natural way to provide a body-centric frame of reference in 3D space (Hinckley, Pausch, Profitt, Patten, & Kassell, 1997). Of course, two trackers combined with a button device can also allow two-handed interaction, but the gloves give the user flexibility in deciding which hand will initiate the action. One consequence of this is that users can avoid what we have called the "Heisenberg effect" of spatial interaction: the phenomenon that, on a tracked device, a discrete input (e.g., a button press) will often disturb the position of the tracker. For example, a user wants to select an object using ray-casting. She orients the ray so that it intersects the object, but when she presses the button, the force of the button press displaces the ray so that the object is not selected. With the gloves, one hand can be used to set the desired position/orientation, and the other hand can be used to signal the action, avoiding the Heisenberg effect.

Figure 1. User wearing Pinch Gloves
Figure 2. TULIP menu system
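The discrete, context-sensitive pinch interpretation described above can be captured in a small table-driven mapping. The sketch below is our own illustration, not the authors' implementation; finger labels, function names, and the contact-group representation are all hypothetical. It shows how the same thumb-to-index pinch can mean "select" when touching an object and "OK" in empty space, and how two separate but simultaneous pinches form a distinct gesture:

```python
# Hypothetical sketch of context-sensitive pinch interpretation.
# Fingers are labeled "L0".."L4" (left thumb..pinky) and "R0".."R4" (right).
# The gloves report groups of mutually touching fingertips; two separate
# simultaneous pinches arrive as two distinct contact groups.

def interpret_pinch(contact_groups, touching_object):
    """Map a set of fingertip contact groups to a command name, or None."""
    groups = frozenset(frozenset(g) for g in contact_groups)
    thumb_index_l = frozenset({"L0", "L1"})
    thumb_index_r = frozenset({"R0", "R1"})
    if groups == {thumb_index_l} or groups == {thumb_index_r}:
        # Same gesture, two meanings depending on spatial context.
        return "select" if touching_object else "ok"
    if groups == {thumb_index_l, thumb_index_r}:
        # Separate but simultaneous pinches, one on each hand.
        return "two_handed_action"
    return None

print(interpret_pinch([{"R0", "R1"}], touching_object=True))   # select
print(interpret_pinch([{"R0", "R1"}], touching_object=False))  # ok
```

Because the device is purely discrete, no calibration or recognition thresholds appear anywhere in such a mapping; that is the precision advantage over whole-hand gesture recognition noted in the text.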

Finally, Pinch Gloves, unlike whole-hand devices, produce discrete input: fingers are either touching or not. Although whole-hand devices can allow the user more natural gestures, it is notoriously difficult to calibrate them and recognize gestures properly (Kessler, Hodges, & Walker, 1995). Pinch Gloves, because of their discrete nature, can be much more precise.

4 Three-dimensional interaction techniques using Pinch Gloves

We developed novel 3D interaction techniques based on these properties. Our goal was to push the envelope and to go beyond the simple pinch-to-grab metaphor that represents the most common use of the gloves. We also wanted to design techniques that would use natural gestures as well as those where the gesture was more abstract.

4.1 TULIP menu system

Our design of a menu system using the gloves was based on two concepts. First, menu items would be associated with fingers, and would be selected by pinching the finger to the thumb on the same hand (a comfortable and simple gesture). Second, top-level menu items (menu titles) would be associated with the user's non-dominant hand, and second-level items (commands within a menu) would be associated with the dominant hand. This design is simple to understand, requires no more than two pinches to access a particular menu item, and takes advantage of the natural asymmetry of the two hands (Hinckley et al., 1997). Unfortunately, it also creates an unreasonable limitation of four menus with four items each. Our solution to this problem is called the TULIP menu system (Bowman & Wingrave, 2001). TULIP stands for "Three-Up, Labels In Palm," meaning that three items are active at one time, while the rest of the menu items are arranged in columns of three along the palm of the user's hand. To access an active item, the user simply pinches the thumb to the appropriate finger.
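The Three-Up selection scheme can be sketched as a small state machine; this is our own illustrative code, not the published TULIP implementation, and the class and method names are hypothetical. Three items are active on the index, middle, and ring fingers, and a thumb-to-pinky "more" pinch rotates the next column of three into the active position:

```python
# Hypothetical sketch of TULIP ("Three-Up, Labels In Palm") menu state.
# Three items are active at a time; the pinky acts as the "more" item,
# rotating the next column of three into the active position.

class TulipMenu:
    FINGER_SLOT = {"index": 0, "middle": 1, "ring": 2}

    def __init__(self, items):
        self.items = list(items)
        self.top = 0  # index of the first currently active item

    def active_items(self):
        return self.items[self.top:self.top + 3]

    def pinch(self, finger):
        """Thumb-to-`finger` pinch: select an active item, or 'pinky' = more."""
        if finger == "pinky":          # the "more" pinch
            self.top += 3
            if self.top >= len(self.items):
                self.top = 0           # wrap around to the first column
            return None
        slot = self.FINGER_SLOT[finger]
        active = self.active_items()
        return active[slot] if slot < len(active) else None

menu = TulipMenu(["red", "green", "blue", "yellow", "cyan"])
print(menu.pinch("index"))  # red
menu.pinch("pinky")         # bring the next column of items up
print(menu.pinch("index"))  # yellow
```

Counting pinches in this sketch reproduces the paper's bound: reaching any of N items never takes more than N/3 + 1 pinches.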
To access other items, the user pinches the thumb to the pinky (the "more" item) until the desired item appears on one of the fingers (figure 2). This system nicely balances direct selection and visibility of menu items. Given a maximum menu size of N items, it takes no more than N/3 + 1 pinches to select any item. Visual cues (highlighting, rotation of the active items, and a clear link between the "more" item and the next column of items) make it easy for the user to determine which menu is selected, which items are active, and how to make the desired item active. A usability study showed that this system was more difficult to learn than two other common VE menu implementations, but also that users became efficient relatively quickly, and that the system was significantly more comfortable than the others. To further address the issue of efficiency, we have developed a modified TULIP system for expert users that requires no more than two pinches to select any item, provided that no menu contains more than fifteen items (a reasonable assumption based on interface guidelines). In this technique, to select the fourth item in a menu (such as the item "yellow" in figure 2), the user pinches the left ring finger to the right index finger. The left-hand finger determines which column of items the user desires, and the right-hand finger determines which item within that column should be selected. This technique requires more practice, but does not require the user to memorize any information, since all items are visible on the palm.

4.2 Virtual keyboard for text input

For the most part, VE applications have avoided the task of entering text information, because no usable text input techniques existed (beyond speech recognition and one-handed chord keyboards, which both require significant user training and are therefore inappropriate for all but the most frequently used VEs).
There are many situations, however, where accurate and efficient text input would increase the utility of VE systems. For example, a user might need to leave an annotation for the designer in an architectural walkthrough application, or enter a filename to which work can be saved after a collaborative design session. Based on these examples, a 3D text input technique would require users neither to type long sentences or paragraphs, nor to approach the speed of typing at a standard keyboard. Our goal was to allow immersed users to enter text without leaving the VE and without a significant amount of training. We decided to emulate as closely as possible a standard QWERTY keyboard, with which most users are intimately familiar. The basic concept of our virtual keyboard is that a simple pinch between a thumb and finger on the same hand represents a key press by that finger. Thus, on the home row of the keyboard, the left pinky represents "a," the left ring finger represents "s," and so on. We also needed the ability to use the inner keys such as "g" and "h," and the ability to change rows of the keyboard. We accomplish this through the use of trackers mounted on the gloves. Inner keys are selected by rotating the hand inward. The active row can be changed by moving the hands closer to the body (bottom row) or farther away (top row). Users calibrate the location of the rows before using the system. Still, because the trackers have limited accuracy, fairly large-scale motions are required to change rows or select the inner keys, reducing efficiency. Special gestures are provided for space (thumb to thumb), backspace (ring to ring), delete all (pinky), and enter (index to index). This set of arbitrary gestures is small enough to be memorized by the user. Visual feedback is extremely important to make up for the lack of the visual and haptic feedback provided by a physical keyboard. We hypothesized that users would not want to look at their hands while typing, so we attached the feedback objects to the view. These show, for each hand, the location of each character, and the currently active characters (based on hand distance and rotation) via highlighting (figure 3). In a user study, five users each typed three sentences (six to eight words) using the virtual keyboard. As expected, efficiency was low: users took about three minutes per sentence on average, after five to ten minutes of practice. Most users improved their performance during the study. Much of their time was spent searching for letters (although four users were touch typists) and recovering from errors. However, our goal of minimal training was met, as all users commented on the ease of learning the system. Moreover, one of the designers could type the sentences in an average of 45 seconds each, indicating that the system could be much more efficient with further experience.

Figure 3. Virtual keyboard technique
Figure 4. Two-handed navigation technique

4.3 Two-handed navigation

The task of navigation in VEs has been widely studied, including a large number of interaction techniques for viewpoint movement, or travel.
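The steering computation at the heart of the two-handed travel technique presented in this section is small: the direction of motion is the vector between the two tracked hands, oriented toward the hand that made the pinch, with speed proportional to the separation between the hands. A minimal sketch under those assumptions (the function name, hand labels, and `speed_scale` parameter are hypothetical, not the authors' code):

```python
# Hypothetical sketch of two-handed steering: the motion direction is the
# vector between the hand trackers, pointing toward the pinching hand;
# speed grows with the distance between the hands (velocity control).

import math

def travel_velocity(hand_a, hand_b, pinching_hand, speed_scale=1.0):
    """Return an (x, y, z) velocity for one frame of two-handed steering."""
    # Forward is from the non-pinching hand toward the pinching hand.
    src, dst = (hand_b, hand_a) if pinching_hand == "a" else (hand_a, hand_b)
    v = tuple(d - s for s, d in zip(src, dst))
    dist = math.sqrt(sum(c * c for c in v))
    if dist == 0.0:
        return (0.0, 0.0, 0.0)
    speed = speed_scale * dist          # wider separation -> faster travel
    return tuple(c / dist * speed for c in v)

# Hands held one above the other: pinching with the top hand flies upward;
# pinching with the bottom hand flies back down without repositioning.
print(travel_velocity((0, 1, 0), (0, 0, 0), pinching_hand="a"))  # (0.0, 1.0, 0.0)
print(travel_velocity((0, 1, 0), (0, 0, 0), pinching_hand="b"))  # (0.0, -1.0, 0.0)
```

Reversing direction requires only pinching with the other hand, not moving either hand, which is the property the technique exploits.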
We wanted to design a travel technique using the gloves that would take advantage of their unique properties, and decided to focus on the gloves' support for two-handed interaction. Steering techniques for VE travel (Bowman et al., 1997) let the user provide a direction of motion, usually based on the head or hand tracker. With two tracked gloves, we can define the direction of motion based on the vector between the hands. Since both hands are acting as input devices, we can define the forward direction as being toward the hand that produced the pinch gesture (figure 4). Thus, to fly upwards, the user holds his hands one above the other and pinches thumb to index finger on the top hand. Flying back downwards does not require changing the hand positions; rather, the user simply pinches with the lower hand instead. We enhanced this technique with a velocity control feature (based on the distance between the hands), and a nudge feature, allowing the user to move only slightly in a particular direction by using a different pinch gesture. This technique is quite flexible. In a user study, we found that novice users tend to simulate torso-directed steering by holding one hand close to the body and the other a short distance in front of the body. Expert users have the option to increase their speed for large-scale movements, use both hands to quickly change direction without moving their bodies, and make small movements when required by the task at hand. We also found that users' performance on 2D navigation tasks improved significantly after a few minutes of practice, but that it was difficult for users to control direction and remain spatially oriented in 3D navigation.

5 Conclusions and future work

This research has explored the use of Pinch Gloves for novel three-dimensional interaction techniques. Our techniques confirm that the gloves can be used for natural gestures, but also show that abstract techniques can take advantage of the gloves' characteristics for more efficient, usable, or comfortable interaction. In general, we found that abstract techniques require a high degree of visual affordances and feedback, including virtual representations of the hands and indication of the command that will be activated when a pinch gesture is made. We also found that the gloves provide increased comfort over other common devices, and that the large number of possible gestures allows the same technique to be customized for both novice and expert users. We are continuing to refine the techniques and to develop new ones. For example, we are working on several approaches to numeric input using the gloves, and on lightweight navigation constraints based on glove input. The techniques are also being used in VE applications under development in our laboratory, which will allow us to evaluate them in a setting of realistic use.

Acknowledgements

The authors would like to acknowledge the technical assistance of Drew Kessler and the time and efforts of the subjects in our usability evaluations.

References

Angus, I., & Sowizral, H. (1995). Embedding the 2D Interaction Metaphor in a Real 3D Virtual Environment. Proceedings of SPIE, Stereoscopic Displays and Virtual Reality Systems, 282-293.

Bowman, D., Johnson, D., & Hodges, L. (1999). Testbed Evaluation of VE Interaction Techniques. Proceedings of the ACM Symposium on Virtual Reality Software and Technology, 26-33.

Bowman, D., Koller, D., & Hodges, L. (1997). Travel in Immersive Virtual Environments: An Evaluation of Viewpoint Motion Control Techniques. Proceedings of the Virtual Reality Annual International Symposium, 45-52.

Bowman, D., & Wingrave, C. (2001). Design and Evaluation of Menu Systems for Immersive Virtual Environments. Proceedings of IEEE Virtual Reality, 149-156.

Cutler, L., Frohlich, B., & Hanrahan, P. (1997). Two-Handed Direct Manipulation on the Responsive Workbench. Proceedings of the ACM Symposium on Interactive 3D Graphics, 107-114.

Fakespace, Inc. (2001). PINCH Gloves. http://www.fakespacelabs.com/products/pinch.html

Gabbard, J. (1997). Taxonomy of Usability Characteristics in Virtual Environments. Unpublished Master's Thesis, Virginia Polytechnic Institute and State University.

Herndon, K., van Dam, A., & Gleicher, M. (1994). The Challenges of 3D Interaction. SIGCHI Bulletin, 26(4), 36-43.

Hinckley, K., Pausch, R., Profitt, D., Patten, J., & Kassell, N. (1997). Cooperative Bimanual Action. Proceedings of CHI: Human Factors in Computing Systems, 27-34.

Hix, D., Swan, J., Gabbard, J., McGee, M., Durbin, J., & King, T. (1999). User-Centered Design and Evaluation of a Real-Time Battlefield Visualization Virtual Environment. Proceedings of IEEE Virtual Reality, 96-103.

Jacoby, R., & Ellis, S. (1992). Using Virtual Menus in a Virtual Environment. Proceedings of SPIE: Visual Data Interpretation, 39-48.

Kaur, K. (1999). Designing Virtual Environments for Usability. Doctoral Dissertation, University College London.

Kessler, G., Hodges, L., & Walker, N. (1995). Evaluation of the CyberGlove as a Whole-Hand Input Device. ACM Transactions on Computer-Human Interaction, 2(4), 263-283.

Mapes, D., & Moshell, J. (1995). A Two-Handed Interface for Object Manipulation in Virtual Environments. Presence: Teleoperators and Virtual Environments, 4(4), 403-416.

Mine, M. (1995). Virtual Environment Interaction Techniques (Technical Report TR95-018). UNC Chapel Hill CS Dept.

Mine, M. (1997). ISAAC: A Meta-CAD System for Virtual Environments. Computer-Aided Design, 29(8), 547-553.

Mine, M., Brooks, F., & Sequin, C. (1997). Moving Objects in Space: Exploiting Proprioception in Virtual Environment Interaction. Proceedings of ACM SIGGRAPH, 19-26.

Pierce, J., Stearns, B., & Pausch, R. (1999). Voodoo Dolls: Seamless Interaction at Multiple Scales in Virtual Environments. Proceedings of the ACM Symposium on Interactive 3D Graphics, 141-146.

Poupyrev, I., Tomokazu, N., & Weghorst, S. (1998). Virtual Notepad: Handwriting in Immersive VR. Proceedings of the IEEE Virtual Reality Annual International Symposium, 126-132.

Zhai, S., & Milgram, P. (1993). Human Performance Evaluation of Manipulation Schemes in Virtual Environments. Proceedings of the Virtual Reality Annual International Symposium, 155-161.

Zhai, S., Milgram, P., & Buxton, W. (1996). The Influence of Muscle Groups on Performance of Multiple Degree-of-Freedom Input. Proceedings of CHI: Human Factors in Computing Systems, 308-315.