Haptic Camera Manipulation: Extending the Camera In Hand Metaphor


Joan De Boeck, Karin Coninx
Expertise Center for Digital Media, Limburgs Universitair Centrum
Wetenschapspark 2, B-3590 Diepenbeek, Belgium
{joan.deboeck, karin.coninx}@luc.ac.be

Abstract

Many applications that support force feedback use a haptic device (such as the PHANToM) for pointing operations, combined with a second device, such as a SpaceMouse, mainly used for navigation. In an earlier research project we introduced the Camera In Hand metaphor, in which the PHANToM device itself is used for camera manipulation. This eliminates the second device and thus frees the user from the mental load of driving two different devices. Eliminating the second device also leaves the user's second hand available for another task, such as steering a second PHANToM device. In this paper we report on an improvement of the Camera In Hand metaphor that better fits the needs and expectations of experienced users. The improvements have been assessed in a formal user experiment.

1. Introduction

In our previous work, we introduced the Camera In Hand metaphor as an experiment in using the PHANToM as a camera manipulation device. The metaphor was designed to free the user from the mental load of driving two different devices. Freeing the user's second hand also allows it to be used for other tasks in the future. In the earlier experiment, the Camera In Hand metaphor turned out to be much more efficient for novice users compared with the standard metaphors using the 3D mouse. Experienced users, however, still preferred the classical navigation device, although no objective difference could be measured. In this paper, we describe how we have extended the former Camera In Hand metaphor (CiH) in such a way that the disadvantages noted by experienced users are avoided. We call our extension the Extended Camera In Hand metaphor (eCiH).
In this document, we will first place our work in the scope of related work. We will then briefly describe the setup and results of the earlier Camera In Hand experiment, as the presented work builds upon those findings. Next, we elaborate on the extension of the metaphor and motivate the proposed ideas. We end this contribution by stating our conclusions, based on a formal user experiment.

2. Overall context

2.1. Navigation metaphors

To our knowledge, little can be found in the literature about the integration of force feedback in camera manipulation metaphors. Navigation and camera control in general 3D environments, however, have been investigated thoroughly; we mention just a few examples here. In the early 1990s, C. Ware described three camera metaphors [1] for general use in 3D or virtual environments, without particular attention to haptics. Flying vehicle and Scene in hand are the most commonly known. In addition, he describes the Eyeball in hand metaphor, in which the user holds a tracker in his hand and thereby holds and manipulates his virtual eyeball. This turned out to be a very confusing metaphor, and it is therefore less commonly applied. Other work has been done on improving navigation and wayfinding methods in virtual environments [2]. In some research systems, hand-held miniatures [3] or Speed-coupled Flying [4] are presented to facilitate the user's interaction. The work of T.G. Anderson [5][6] motivated us to consider navigation and camera control specifically in the haptics context. He conducted a usability test providing evidence that navigation with SensAble's PHANToM device results in better performance than the 2D navigation interface of CosmoPlayer.

2.2. The Camera In Hand metaphor

As the current contribution builds upon our earlier findings, this paragraph briefly describes the aim and results of our former solution: the Camera In Hand metaphor. To eliminate the need for a second input device for camera manipulation, such as the LogiCad SpaceMouse used in many classical haptic setups, we extended the Eyeball in hand metaphor [1] with force feedback and the PHANToM device. In our solution, the user holds the PHANToM's stylus, which represents the viewing direction of the virtual camera. By moving and rotating the stylus, the user can look around in the virtual scene. We called this solution Camera In Hand [8]. Although Eyeball in hand seemed confusing to users, our metaphor turned out quite the contrary. We conducted a formal usability test in which experienced and non-experienced users, both male and female, had to complete a navigation task in a virtual arena (Fig. 1) with different metaphors. We could conclude that the Camera In Hand metaphor dramatically improved the performance of novice users who had no experience with 3D environments. On the other hand, even though experienced users did not perform better in either condition, most of them preferred the classical interaction devices. This group mostly complained about the limited workspace of the navigation as a drawback of our solution.

Fig 1. Virtual Arena

The interested reader can find details concerning our former experiment and the Camera In Hand metaphor in [8].
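The core of the metaphor described above is a direct mapping from the stylus pose to the camera pose. The paper does not spell out the implementation, so the following is only a minimal sketch of that idea; the function name and the scale parameter are hypothetical:

```python
import numpy as np

def cih_camera_pose(stylus_pos, stylus_rot, scale=1.0):
    """Direct Camera-In-Hand mapping: the stylus pose becomes the camera pose.

    stylus_pos: (3,) stylus position in device space.
    stylus_rot: (3, 3) stylus rotation matrix; its forward axis is taken
                as the viewing direction of the virtual camera.
    scale:      workspace scale factor (device space -> scene space).
    Returns a 4x4 world-from-camera transform.
    """
    pose = np.eye(4)
    pose[:3, :3] = np.asarray(stylus_rot, dtype=float)
    pose[:3, 3] = scale * np.asarray(stylus_pos, dtype=float)
    return pose
```

With this direct coupling, the reachable camera positions are bounded by the PHANToM's physical workspace, which is exactly the limitation the extension in Section 3 addresses.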
The remainder of this document describes the solutions we propose to counter the aforementioned criticisms of our experienced group of test subjects.

3. Navigation metaphor extensions

Based on his usability test in [5], T.G. Anderson incorporated a craft metaphor in the E-touch framework [6][7]. In this metaphor, the virtual camera stands on a craft (flying vehicle metaphor). By pushing the PHANToM's stylus against the bounds of a virtual box, the craft moves in the appropriate direction. To step out of the limited workspace of our Camera In Hand metaphor (reported by our subjects in the first usability test), we have combined the ideas of Anderson's craft solution with our former CiH solution. This solution allows the user to directly manipulate the camera position, and hence quickly look around in the scene, by pointing the stylus in any direction within his limited workspace (as in [8]). By pushing against the bounds of a virtual box, the craft on which the camera stands moves in the appropriate direction (as in [5]). The magnitude of the force with which the user pushes against the PHANToM's force feedback controls the velocity of the craft (Fig. 2). The generated force feedback helps the user to distinguish between the two different modes. To draw the user's attention to the change of navigation method, auditory feedback has been added as an extra modality: pushing harder against the wall increases the craft's velocity, and with it the frequency and volume of the sound of a driving vehicle.

The same solution has been used for rotations. When the rotation exceeds a certain threshold (Fig. 3), the craft automatically starts rotating, while auditory feedback in the form of a rotating gearwheel is played. Most PHANToM models (except for the 6DOF) do not generate torque feedback; auditory feedback is therefore the only modality available for rotational feedback, and is consequently all the more important. Since in our everyday real world we mostly rotate around just one axis (the Y-axis), the metaphor extension has only been added to this axis. All other rotations keep the original CiH metaphor until the end of the PHANToM's range.

Finally, as can be seen in Fig. 4, we have kept the virtual guiding plane of the original Camera In Hand. Since our physical movements are mostly limited to one horizontal plane, this offers the user more stability when walking around in that plane. The user has to overcome a small resistance force to change his altitude.

Fig 2. Bounding Box   Fig 3. Rotation Threshold   Fig 4. Haptic Plane

4. Assessment of the extension

To validate our extension, 10 experienced users were asked to participate in a user experiment. Our subjects, all right-handed males with an average age of 30, had to perform exactly the same test as in [8], while the same dependent variables were measured: all participants had to navigate in a virtual arena to locate and read a digit on a red-and-white coloured object (see Fig. 1). This test had to be performed under the different conditions; each condition consisted of 15 trials. During each trial the elapsed time was logged. Finally, at the end of the test, the subjects filled in a comparative questionnaire. Two reference conditions (SpaceMouse and CiH) had to be performed, to allow comparison of the results of both experiments. Additionally, this last experiment also measured performance with the eCiH metaphor with and without auditory feedback.
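The coupling in Section 3 between penetration of the bounding box, the restoring force felt by the user, and the craft's velocity can be sketched as one control tick. All constants and the function name below are hypothetical, since the paper reports no numeric parameters:

```python
import numpy as np

# Hypothetical tuning constants; the paper gives no numeric values.
BOX_HALF = 0.1        # half-extent of the virtual bounding box
STIFFNESS = 400.0     # spring stiffness of the box walls
VEL_GAIN = 0.01       # craft speed per unit of wall force
YAW_THRESHOLD = 0.5   # Y-axis rotation threshold (radians)

def craft_update(stylus_pos, stylus_yaw):
    """One control tick of the craft extension.

    Inside the box the camera tracks the stylus directly (plain CiH).
    Pushing past a wall produces a restoring force on the stylus; the
    harder the user pushes, the faster the craft translates in that
    direction. Exceeding the yaw threshold makes the craft itself rotate.
    Returns (force_on_stylus, craft_velocity, craft_yaw_rate).
    """
    p = np.asarray(stylus_pos, dtype=float)
    # Penetration depth beyond each wall (zero while inside the box).
    penetration = np.sign(p) * np.maximum(np.abs(p) - BOX_HALF, 0.0)
    force = -STIFFNESS * penetration   # force feedback felt by the user
    velocity = -VEL_GAIN * force       # speed proportional to push force
    excess = abs(stylus_yaw) - YAW_THRESHOLD
    yaw_rate = np.sign(stylus_yaw) * max(excess, 0.0)
    return force, velocity, yaw_rate
```

In the same tick, the magnitude of `velocity` would also drive the pitch and volume of the vehicle sound, and a nonzero `yaw_rate` would trigger the gearwheel sound.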
Table 1. Average results of the test

  Condition          Average (ms)
  SpaceMouse         8014
  CiH                10333
  eCiH (Sound)       8274
  eCiH (No Sound)    8302

Table 2. P-values using ANOVA

  Comparison                    P-value
  SpaceMouse - eCiH             0.78
  SpaceMouse - Camera In Hand   0.17
  CiH - eCiH                    0.25

Although Tables 1 and 2 do not show any significant difference (probably due to a smaller test set than in our previous experiment), we can clearly distinguish a trend. The eCiH metaphor turns out to be a valuable alternative to the SpaceMouse. We see an improvement of about 2 seconds in completion time between the old CiH and the eCiH version. There seems to be no difference at all between the Extended Camera In Hand condition with and without sound. In the next section, we compare these results against the values collected in our earlier test.

5. Comparing Results

Because this work is a continuation of our earlier work on camera and navigation metaphors, it is important that the similarities between the experimental setups are maximized. Since the users in our new experiment had to perform the same task as in the earlier experiment, and received the experiment's explanation from the same (written) document, we assume the two result sets can be compared. To verify this assumption, subjects in the latter experiment had to perform the same conditions as those of the first experiment (SpaceMouse and CiH). Table 3 below shows the reference values.

Table 3. Comparison of values between the two experiments

  Condition    Old Values (ms)   New Values (ms)   p-value
  SpaceMouse   9760              8014              0.05
  CiH          11059             10333             0.75
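Pairwise p-values such as those reported in Tables 2 and 3 follow from a one-way ANOVA over the logged per-trial completion times of the two conditions being compared. The per-trial data is not published, so the sketch below uses synthetic completion times purely to illustrate the procedure (SciPy's `f_oneway`):

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)

# Synthetic completion times (ms) standing in for the 15 logged trials
# per condition; means taken from Table 1, spread is an assumption.
spacemouse = rng.normal(8014, 1500, size=15)
cih        = rng.normal(10333, 1500, size=15)
ecih       = rng.normal(8274, 1500, size=15)

# Pairwise one-way ANOVA between two conditions at a time.
for name, a, b in [("SpaceMouse - eCiH", spacemouse, ecih),
                   ("SpaceMouse - CiH", spacemouse, cih),
                   ("CiH - eCiH", cih, ecih)]:
    stat, p = f_oneway(a, b)
    print(f"{name}: p = {p:.2f}")
```

With only two groups per test, this is equivalent to an unpaired t-test; the resulting p-values depend entirely on the synthetic spread and are not the paper's values.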

We see a strong correlation between the two result sets in the old Camera In Hand condition, but, surprisingly, we also notice a significant difference between the values of the SpaceMouse condition. Asking the subjects of both experiments about their experience with both devices, we found that almost all test subjects in the recent experiment had some experience with the SpaceMouse, while those in the first experiment did not. This can explain the significant difference between the two test sets. We assume this bias in the SpaceMouse condition has little or no effect on the results of the Camera In Hand conditions, so we will compare the current values against the values of the former experiment, keeping in mind that our results have to be put into perspective (our latter subjects may simply be faster in general). If we compare the results of the Extended Camera In Hand to the old results of the old Camera In Hand, we do see a significant improvement (p-value = 0.048). We do not find a significant difference between the old values of the SpaceMouse condition and the Extended Camera In Hand (p-value = 0.11).

6. Discussion

In our former experiment we could not distinguish a significant difference between the SpaceMouse and the Camera In Hand conditions, although experienced users consistently preferred the SpaceMouse for navigation. In the latest experiment, probably because of their experience, the values of the SpaceMouse condition turn out to be significantly lower. Our new Extended Camera In Hand metaphor shows no significant difference from these lowest SpaceMouse values. A questionnaire, in which subjects had to state their preference, resulted in an equal distribution between the SpaceMouse and the Extended Camera In Hand metaphor. This allows us to conclude that the extension of our navigation metaphor is an improvement over the former CiH metaphor.
It is also shown that eCiH is a valuable alternative to the SpaceMouse (based on the lowest test set), which confirms our efforts to eliminate the second input device, allowing the user to use his second hand for other tasks (such as driving a second PHANToM device). As mentioned before, force feedback allows the user to distinguish between the two modes of the Extended Camera In Hand. Auditory feedback was added as a second modality in one condition. Objectively speaking, auditory feedback has no measurable benefit; however, 80% of the users who preferred the new metaphor appreciated the sound. Observing the users while they performed the test, we could see that users more easily discovered the possibilities of the camera metaphor when auditory feedback was present. Users also tended to push the PHANToM into extreme positions less often.

7. Conclusion and Ongoing Work

In this paper we presented the Extended Camera In Hand metaphor (eCiH), an extension of the Camera In Hand metaphor [8] such that it can be an alternative to the SpaceMouse for experienced users. From a user experiment, we can conclude that this new extension is a significant improvement over the original metaphor. Moreover, the extended version turns out to perform equally to the SpaceMouse, while users' subjective preferences are split between both conditions. We have to interpret these results with care, however, as a significant bias exists in the results of the SpaceMouse condition. We believe the long-term result of this research lies in the elimination of the SpaceMouse for navigation. This is an important step towards our goal: multimodal interaction in virtual (haptic) environments. Currently we are investigating the value of allowing the user to interact simultaneously with two PHANToM devices, which is enabled by the results presented in this paper. Finally, voice commands may turn out to be constructive in this context (e.g. to switch between different operation modes).

8. Acknowledgements

We want to thank our test subjects for taking the time to participate in this user experiment. In particular, we want to thank our colleague Chris Raymaekers for assisting in interpreting the results of the test.

9. References

[1] C. Ware, S. Osborne. Exploration and Virtual Camera Control in Virtual Three Dimensional Environments. Computer Graphics, Vol. 24, No. 2, 1990.
[2] G.A. Satalich. Navigation and Wayfinding in Virtual Reality: Finding Proper Tools and Cues to Enhance Navigation Awareness. http://www.hitl.washington.edu/publications/satalich/home.html
[3] R. Pausch, T. Burnette. Navigation and Locomotion in Virtual Worlds via Flight into Hand-Held Miniatures. Computer Graphics 1995, Annual Conference Series, pp. 399-400.

[4] D.S. Tan, G.G. Robertson, M. Czerwinski. Exploring 3D Navigation: Combining Speed-coupled Flying with Orbiting. CHI 2001 Conference Proceedings, pp. 418-425, Seattle, Washington, USA.
[5] T.G. Anderson. FLIGHT: An Advanced Human-Computer Interface and Application Development Environment. Master of Science thesis, University of Washington, 1998.
[6] T.G. Anderson. FLIGHT: A 3D Human-Computer Interface and Application Development Environment. Proceedings of the Second PHANToM Users Group Workshop, October 19-22, 1997, Dedham, MA, USA.
[7] Novint, 2001. e-Touch Programmer's Guide.
[8] J. De Boeck, C. Raymaekers, K. Coninx. Expanding the Haptic Experience by Using the PHANToM Device as a Camera Metaphor. PHANToM Users Group 2001, October 27-30, 2001, Aspen, Colorado, USA.