
The Visual Computer manuscript No. (will be inserted by the editor)

Haptic Feedback in Mixed-Reality Environment

Renaud Ott, Daniel Thalmann, Frédéric Vexo
Virtual Reality Laboratory (VRLab)
École Polytechnique Fédérale de Lausanne (EPFL)
CH-1015 Lausanne, Switzerland
e-mail: {renaud.ott, daniel.thalmann, frederic.vexo}@epfl.ch

Received: date / Revised version: date

Abstract  The training process in industry is often assisted by computer solutions to reduce costs. Computer systems created to simulate assembly or machine manipulation are normally implemented with traditional human-computer interfaces (keyboard, mouse, etc.), which usually leads to systems that are far from the real procedures and therefore inefficient for training. Two techniques could improve this situation: mixed reality and haptic feedback. In this paper we investigate the integration of both within a single framework. We present the hardware used to design our training system, and a feasibility study that allows us to establish a testing protocol. The results of these tests convince us that such a system should not try to realistically simulate the interaction between real and virtual objects as if they were all real objects.

1 Introduction

In industry, the traditional training of workers to use special equipment is normally carried out using part or all of the real equipment, provided either by the industry itself or by specialized training centers. This brings many drawbacks: the cost of equipment dedicated solely to training is too high; machines evolve, so training equipment must change accordingly; new products or improvements to the production line imply new training; training must sometimes be outsourced to specialized centers; etc. Besides this kind of training, there are also more specialized domains, such as aviation or surgery, where it is not always possible to use the real equipment or to cover all the cases the trainee could face. For these reasons, computer solutions have been considered: they offer lower cost and more adaptability.

The simulation of a working environment with computers is done by means of Virtual Reality (VR). In these applications we are able to build any kind of scenario, tool, and equipment. However, a complete and detailed simulation of some scenarios can be very complex to develop, and it remains difficult to produce truly convincing results. Thus, to reduce the programming effort and to better approximate reality, Mixed Reality (MR) provides a good solution [?]. It consists in superimposing real images (pictures or video) onto a virtual world, or vice versa. It can provide a complete real scene with virtual elements that help with the training process, as shown in figure 1, achieved in the framework of the STAR European project.

Fig. 1  A Mixed-Reality industrial training environment [?]

These technologies are affordable and good enough to simulate working situations; they can show the proper way to play a role inside a given context. Normally, however, they are limited to keyboard or mouse interaction; in some cases other user interfaces are used, like large screens or touch screens. This is still far from real, and far from the benefits of the traditional training process with real equipment. Thus, we propose to improve the interaction in such mixed-reality training environments using haptic technologies, in order to give the user the possibility to manipulate 3D objects with both hands.
The benefit of manipulating objects is to teach the user, in a practical manner, the proper way of performing tasks: in an assembly process, for example, the user can manipulate virtual objects and position them. In this paper, we propose a generic assembly training system that takes advantage of mixed-reality techniques combined with haptic feedback.

To illustrate, we describe an application of table assembly with virtual parts (the feet) and real parts (the board). The next section presents an overview of mixed-reality techniques and applications, and observations about haptic rendering for manipulation tasks. The rest of the article deals with the system we created to test haptic feedback in a mixed-reality environment. First, we present the hardware used: haptic device, tracking system, and head-mounted display. Then, we present the testing protocol. Finally, the paper ends with the general recommendations we have extracted from our experience.

2 Related Works

In this section we present some systems that use haptic interfaces and virtual/mixed reality to simulate assembly or manipulation tasks for training purposes. Concerning mixed reality, the work of Azuma [?] gives an overview of recent advances in the field; in his article, haptic user interfaces are discussed as a new approach. The VTT project [?] presents a virtual technical trainer for milling machines. Its authors prototyped three kinds of force-feedback devices: the Phantom, a home-made 2-DOF haptic device, and a pseudo-haptic technique. They present, in [?], an evaluation of these devices against the efficiency criteria of the industry. Assembly training has also been addressed for aeronautic purposes in [?], where the authors use a Phantom to simulate mounting/unmounting operations on different parts of an aircraft. These works present virtual environments that simulate machines or scenarios, and use generic or specific haptic interfaces. However, such devices, like the Phantom® [?], only provide force feedback at a single point, which makes them limited, because people are not able to use their whole hands to interact with the training system.

The use of mixed reality has also been considered in the assembly process. In [?], Zauner et al. propose a virtual assembly instructor based on mixed reality: the user wears a see-through head-mounted display that overlays useful information to help him assemble furniture. Here, the user interacts with real objects using his hands, but the system is limited to the manipulation of real objects. Another example of interaction with real objects, which moreover provides haptic feedback, is given in [?]. The authors use sensors to perceive the real environment and transmit the sensor information to a 6-DOF haptic display with augmented force feedback. This is a truly augmented haptic system, because the user is able to feel haptic textures of objects that he could not feel with his real hand (like the bumps of a sheet of paper).

An approach to hand interaction with virtual objects is addressed by Walairacht et al. in [?]. They present a manipulation system for virtual objects in which four fingers of each of the user's hands are placed inside a string-based haptic device, allowing him to feel the virtual objects. It is moreover a mixed-reality system, because video of the hands is overlaid on the virtual world to give a better visualization of the hand posture. But in this system the user can only manipulate virtual objects. Recently, in [?], Bianchi et al. presented a study on the calibration of an augmented reality system that uses a Phantom; the calibration method chosen in our paper is similar to theirs.

Fig. 2  General scheme of the four hardware modules of our application
In this paper we provide the possibility to interact with real and virtual objects at the same time. The user is able to use both hands by means of a Haptic Workstation™, a generic haptic device. We present a sample application that uses virtual and real parts: the assembly process of a mixed-reality table. The next section provides a complete system description of the framework.

3 System Architecture

In a training context, haptic and visual, real and virtual, should be brought together within a single application. The feasibility application that we elaborate consists in building an MR table at 1/4 scale. It is made of a 55 cm long and 22 cm wide piece of wood containing four holes into which the feet are driven. Four virtual objects, modeled as 25 cm long cylinders, represent the feet. In this section, we present the devices and software used to create this application: a haptic system, a tracking system, and a see-through head-mounted display (HMD). They are combined as illustrated in figure 2. The Haptic Workstation™ device is described in the first subsection. Then, we discuss the tracking system for the real objects. Finally, we present some important facts about the assembly training system.
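As a concrete illustration, the scene just described could be stored in a small data structure such as the C++ sketch below. The board and foot dimensions come from the description above; the hole layout and the foot radius are not specified in the paper, so the corner inset and radius used here are purely illustrative assumptions.

// Illustrative sketch of the 1/4-scale MR table scene (not the actual
// application's data structures). All dimensions are in meters.
struct Cylinder { double lengthM, radiusM; };

struct MRTableScene {
    // Real part: the tracked wooden board.
    double boardLengthM = 0.55;
    double boardWidthM  = 0.22;
    double holes[4][2];  // hole centers, in board coordinates
    // Virtual parts: the four feet, modeled as 25 cm cylinders
    // (the radius is an assumption; the paper does not give it).
    Cylinder feet[4] = {{0.25, 0.02}, {0.25, 0.02}, {0.25, 0.02}, {0.25, 0.02}};

    MRTableScene() {
        const double inset = 0.03;  // assumed distance of holes from the corners
        const double xs[2] = {inset, boardLengthM - inset};
        const double ys[2] = {inset, boardWidthM - inset};
        for (int i = 0; i < 4; ++i) {
            holes[i][0] = xs[i / 2];  // one hole near each corner
            holes[i][1] = ys[i % 2];
        }
    }
};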

3.1 Haptic Interface

The Haptic Workstation™ is composed of four usual virtual-reality devices. A pair of CyberGloves acquires hand posture; it is used to build a mesh representing each hand. This mesh is only used by the collision detection system since, in this mixed-reality framework, we do not display the hands. A pair of CyberGrasp units adds force feedback to each finger; this is a one-directional force feedback, specially designed for grasping simulation. Force feedback on the arms is provided by a pair of CyberForce™ exoskeletons, which convey a 3D force at the wrist. This device cannot change the orientation of the hand; in our framework we use it to simulate the weight of grasped objects and the collisions with virtual objects, and to provide a haptic guidance mechanism. Finally, a pair of CyberTrack™ sensors, encapsulated in the CyberForce devices, provides the position and orientation of the user's hands. The refresh rate of these sensors is very high (nearly 800 Hz), and they are accurate: they detect a 0.1 mm movement and an orientation change of 1/10°. In the next subsection, we present the haptic rendering software that manages this Haptic Workstation™.

Fig. 3  The Immersion Haptic Workstation™

3.2 Haptic Rendering Software

The Haptic Workstation™ is not a usual device: the user interacts mainly with his hands. Compared to a Phantom®, where the user interacts through a single point (the fingertip or a pencil), the computation of collision detection and force-feedback response is more complex. Existing libraries (Chai3D, OpenHaptics, ReachIn) do not really address this problem (except Virtual Hand, but that one has other drawbacks: static scenes, usability, etc.). Thus we have created a new framework for interacting with the hands and computing appropriate force feedback. It is internally called MHAPTIC, by analogy with MVISIO [?], a pedagogic multi-device visual rendering engine developed in our laboratory. We will not go into an exhaustive description of the library; we can mention that it runs three concurrent threads, as presented in figure 4.

Fig. 4  The three main threads running with MHAPTIC.

It is commonly stated that correct haptic feedback should be refreshed at close to 1000 Hz, and visual feedback at close to 60 Hz. The physics thread also embeds a collision detection system and a dynamics engine, built using the AGEIA Novodex library.
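To make this multi-rate structure concrete, the sketch below shows one simple way to run three loops at different frequencies in C++. It is a minimal sketch under our own naming and rate assumptions, not the actual MHAPTIC code; in particular, the physics rate shown is illustrative, and the real library must additionally synchronize the state shared between threads.

#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

// Calls fn repeatedly at the requested frequency until the simulation stops.
void rateLoop(double hz, void (*fn)()) {
    const auto period = std::chrono::duration<double>(1.0 / hz);
    auto next = std::chrono::steady_clock::now();
    while (running) {
        fn();
        next += std::chrono::duration_cast<std::chrono::steady_clock::duration>(period);
        std::this_thread::sleep_until(next);
    }
}

void hapticStep()  { /* read gloves/trackers, send forces to the exoskeleton */ }
void physicsStep() { /* collision detection + dynamics step */ }
void visualStep()  { /* render the virtual feet into the see-through HMD */ }

int main() {
    std::thread haptic(rateLoop, 1000.0, hapticStep);   // ~1000 Hz force update
    std::thread physics(rateLoop, 100.0, physicsStep);  // illustrative physics rate
    std::thread visual(rateLoop, 60.0, visualStep);     // ~60 Hz display update
    std::this_thread::sleep_for(std::chrono::seconds(10));  // run for a while
    running = false;
    haptic.join(); physics.join(); visual.join();
}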

3.3 See-Through Head-Mounted Display

In a mixed-reality system, virtual and real should be visually blended. Usually, two kinds of devices allow this: video head-mounted displays and see-through head-mounted displays (HMD). Our implementation uses the Sony Glasstron PLM-S700 see-through HMD. The advantage of such an HMD over video HMDs is the quality of the real-environment display: reality is not pixelized. However, there are also drawbacks: these displays are usually semi-transparent, so a virtual object cannot completely occlude reality. Moreover, the Glasstron HMD has tinted lenses (varying from opaque to tinted, like standard sunglasses), so the color of the real environment is altered; in a bright room, however, this does not really affect the user experience. This HMD is calibrated using the SPAAM method [?]. It displays only the virtual feet, because they are the only virtual objects (see figure 5).

Fig. 5  Photo taken from the user's point of view, augmented with what is displayed in the HMD

3.4 Tracking Device

Under mixed-reality conditions, real and virtual have to be well aligned to avoid confusing the user. Moreover, in a haptically enhanced framework, real and virtual objects must be able to collide with each other, and the user should be able to interact with virtual objects as well as with real ones. This implies knowing the shape and the position of every object in the system in real time. This is not really a problem for the virtual objects, but it is of course unknown for the real elements. As we have restricted our system to rigid objects, the shapes of the real objects can be stored statically; their positions and orientations, however, are dynamic and have to be estimated during the simulation. In our feasibility study, three objects have to be tracked: the user's head (the HMD, in fact), the board of the mixed-reality table, and the table on which all the objects are placed (see the photo and schema in figures 6 and 2).

We used two different tracking methods. The first can be considered a software solution, since it is based on the ARToolkit library and uses only a standard webcam; we track the board with this method because it is truly wireless. The second is a dedicated hardware system provided by PhaseSpace Inc., consisting of linear high-resolution cameras that track LEDs. The LEDs are connected to a little box (the size of a PDA) that communicates wirelessly with the main controller. In our case, the workspace is located around the Haptic Workstation™ (it measures 1.5 m × 1.0 m × 1.0 m). Inside it, the position of each LED is estimated with 1 mm accuracy. Combining at least three LEDs on a rigid object allows its orientation to be extrapolated: this is the method we chose to track the HMD and the support of the MR table.
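To make that last step concrete, the sketch below shows one standard way to recover an orientation from three non-collinear tracked points: build an orthonormal frame from the vectors between the LEDs. This is a minimal illustration of the principle, not PhaseSpace's actual algorithm; a production system would rather fit the pose of a calibrated LED model in a least-squares sense.

#include <cmath>

struct Vec3 {
    double x, y, z;
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
};

Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

Vec3 normalize(const Vec3& v) {
    const double n = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / n, v.y / n, v.z / n};
}

// Fills R with the three axes (columns) of the frame defined by three
// non-collinear LED positions; p0 can serve as the object's position.
void ledFrame(const Vec3& p0, const Vec3& p1, const Vec3& p2, Vec3 R[3]) {
    const Vec3 ex = normalize(p1 - p0);            // axis along LED0 -> LED1
    const Vec3 ez = normalize(cross(ex, p2 - p0)); // normal to the LED plane
    const Vec3 ey = cross(ez, ex);                 // completes a right-handed frame
    R[0] = ex; R[1] = ey; R[2] = ez;
}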
3.5 Assembly Training System

The hardware and software described in the previous sections meet the requirements for creating a mixed-reality application: the real objects can interact with the virtual ones, and the user is able to grasp a virtual foot. This is managed by the MHAPTIC library. A haptic guidance system then tries to move the user's hand to the location of the nearest board hole. This is achieved by applying a force to the hand whose direction is given by the vector from the foot's extremity to the board's hole, and whose magnitude diminishes with the distance. When a virtual foot collides with one of the holes and the foot is perpendicular to the board, the force-feedback response simulates the feeling of driving it in.
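A minimal sketch of such a guidance force is given below. The force points from the grasped foot's extremity toward the hole, and its magnitude decreases as the distance grows; the exponential falloff law, the maximum force, and the length constant are our own assumptions, as the paper does not specify them.

#include <cmath>

struct Vec3 { double x, y, z; };

// Guidance force applied to the wrist while a foot is grasped.
Vec3 guidanceForce(const Vec3& footTip, const Vec3& hole,
                   double maxForceN = 5.0,    // assumed peak force (newtons)
                   double falloffM = 0.15) {  // assumed length constant (meters)
    const Vec3 d = {hole.x - footTip.x, hole.y - footTip.y, hole.z - footTip.z};
    const double dist = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (dist < 1e-6) return {0.0, 0.0, 0.0};  // already at the hole
    // Magnitude shrinks as the hand moves away from the hole.
    const double mag = maxForceN * std::exp(-dist / falloffM);
    return {mag * d.x / dist, mag * d.y / dist, mag * d.z / dist};
}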

4 Results and Evaluation of the System

In this section, we first present the testing protocol, and then we give a general evaluation of the complete system. Finally, we elaborate recommendations, based on our experience, for building an efficient mixed-reality system that includes force feedback.

Fig. 6  Photo of the devices used to build our Mixed-Reality system

4.1 Experiments

The described system integrates complex and heterogeneous VR devices that were not designed to work together. These devices need calibration procedures (we created one for the Haptic Workstation™ [?], and we used SPAAM [?] for the HMD). These calibration procedures can introduce errors, and the sum of these errors could lead to an unusable system. This subsection presents tests that are useful for evaluating these errors objectively. When dealing with mixed-reality and haptic applications, it is important to have an efficient mix between real and virtual. This is achieved by two components: the tracking of the real dynamic objects, and the projection of the virtual objects in the HMD.

This leads to the first test, which consists in measuring the difference between the virtual and real environments: we ask a user to grasp a virtual foot and try to place it visually inside a hole of the table's board. Under perfect conditions, the system would detect that the foot is inside a hole and apply the driving-in force feedback. However, two approximations are involved: first, the board position is estimated by the tracking system; second, the virtual foot displayed in the HMD does not superimpose perfectly on reality. Thus, by measuring the distance between the virtual foot and the board's hole as they are stored in the system at the moment they should be aligned, we approximate the sum of these two errors. We performed this test many times, moving the head and the board inside the workspace, and we present the results in figure 7.

Fig. 7  Distance between real and virtual environments measured by the first test (35 measures).

The second test quantifies how much the user is perturbed by this difference: is he able to assemble the table under these conditions? In the normal condition, the user sees only the real table board and the virtual feet. Thus, we compare the time taken to assemble this mixed-real table with the time taken to assemble a completely virtual table (without see-through). Finally, we also ran a test including the haptic guidance system: when the user grasps a virtual foot, he feels a force guiding his hand to the position where he can assemble the foot to the board. In this last situation, we can also evaluate whether the user is perturbed by being guided to a place where, visually, he is not supposed to assemble the table.

To perform these tests, we asked six persons to try the system. Usually, we ask people that do not have a particular background in haptics and VR. In this case, however, we considered both the fact that the devices are complex, and the fact that even if this system were applied in industry, the trainee would have a period of accommodation with the devices. Thus, we chose people familiar with VR devices (and especially the tracked HMD). Three challenges were created:

1. To build the table in a completely virtual environment. The table's board is then virtual, and not tracked by ARToolkit.
2. To build the mixed-reality table.
3. To build the mixed-reality table, with the haptic guidance system.

The order was randomized for each tester in order to cancel any accommodation effect when computing the mean times. We measured the time taken to perform these actions, and we also gathered oral feedback from the users after their tests. The times are presented in table 1.

Test       1      2           3
Tester A   1m05   4m30        1m30
Tester B   0m55   2m00        1m25
Tester C   1m30   5m00 (Max)  1m50
Tester D   1m00   1m30        1m30
Tester E   0m45   2m10        1m15
Tester F   1m45   5m00 (Max)  2m10
Mean Time  1m10   3m02        1m37
Rank       1      3           2

Table 1  Times to build the virtual and mixed-reality table by each user.

4.2 Evaluation and Recommendations

The previous subsection described the testing protocol of our system. In this part, we extract results from it in order to elaborate recommendations for creating applications that combine mixed reality and haptic feedback.

The first test reveals an important fact: despite all the calibration procedures, the matching difference between the real and virtual worlds is still high. The mean is around 3.4 cm, and the standard deviation is high (0.95 cm), because the individual errors sometimes accumulate and sometimes cancel out. Moreover, these results give only the norm of the difference; we observed that the difference vectors point in every direction of space. Thus, it seems difficult to find a correction improving the matching with the hardware we have. After more detailed investigation, the main errors in the calibration procedure are located at the display level: with the optical see-through HMD calibrated by the SPAAM procedure, a displacement of the HMD on the user's face during the manipulation is difficult to avoid. In [?], the authors used a video see-through HMD, a device that avoids this difficult HMD calibration.
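For clarity, the aggregate statistics reported for the first test (mean and standard deviation of the 35 measured distances) amount to the following small computation over the logged alignment errors. This is an illustrative sketch, not the paper's analysis code; the variable names are our own.

#include <cmath>
#include <vector>

// Mean and (population) standard deviation of the logged distances between
// the virtual foot and the board's hole at the moment of reported alignment.
// Assumes a non-empty sample set (35 measures in our test).
void meanStd(const std::vector<double>& distancesM, double& mean, double& stdev) {
    mean = 0.0;
    for (double d : distancesM) mean += d;
    mean /= distancesM.size();
    double var = 0.0;
    for (double d : distancesM) var += (d - mean) * (d - mean);
    stdev = std::sqrt(var / distancesM.size());
}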
The second test shows that the assembly procedure is easier when having only virtual objects, and that our mixed-reality system is not able to be as fast and efficient as an entirely virtual one. However, as mentioned in the introduction, it is sometimes impossible to have a completely virtual environment, for many reasons (cost, complexity), and sometimes the goal of a training system is precisely to teach using the real equipment itself. Under these conditions, with a simple feasibility study, we have shown that it is difficult to manage haptic assembly with mixed reality; this is mainly because the visual sense is not truly convinced. Fortunately, we have also shown that some haptic techniques can help: haptic feedback guidance, for example, is very efficient in these conditions. The testers understood well that the virtual and real visual environments were not perfectly superimposed, and that they would apprehend the mixed-reality world better with the help of the haptic guidance.

Now, the main question is to evaluate how much the differences between virtual and real, visual and haptic, perturb the learning curve of the trainee. According to the discussions with the testers, we believe that, in the assembly/manipulation context, the important point is the order of the actions and movements. In such cases, haptic feedback and guidance are a good tool, because they provide the enactive knowledge that the trainee should acquire. Finally, we remark that these tests provide good indications on how to build a haptic system under mixed-reality conditions.

As explained in the previous paragraphs, perfect visual matching is difficult to reach. Some studies on pseudo-haptic feedback have shown that the visual channel influences the haptic perception [?]. Thus, a realistic haptic feedback is not mandatory, since it will in any case be perturbed by the haptic/visual misalignment. However, augmented haptic feedback, like the haptic guidance mechanism, provides a good solution for building an efficient system. This is the main result of this paper.

5 Conclusion

In this paper, we have presented a system that allows training for manipulation and assembly tasks. It is based on a Haptic Workstation™, a device that lends itself to bimanual assembly because of its dual exoskeleton. We also integrated a mixed-reality environment that allows interaction with real and virtual objects at the same time. In addition to the Haptic Workstation™, we used an optical see-through HMD and a powerful tracking system. The assembly task is improved by haptic guidance. We also elaborated a testing protocol that allowed us to advance some recommendations for dealing with mixed reality and haptic force feedback.

Even with efficient tracking systems, mixed-reality techniques using an optical see-through HMD are not precise enough to correctly superimpose the virtual world on the real one. The problem is that a small misalignment is acceptable when only the visual sense is stimulated; when combined with haptic force feedback, however, the mixed-reality world becomes much more difficult to apprehend, because of a kind of ghost effect: the user feels something that he does not see, or the opposite. This is comparable to the mechanism of pseudo-haptic techniques, where the visual channel can create haptic sensations. Thus, trying to realistically reproduce an assembly situation in a mixed-reality context with haptic feedback will inevitably lead to a system that is difficult to use. Conversely, applying augmented haptic feedback will improve the system's usability.

Acknowledgment

This work has been supported by the Swiss National Science Foundation (FNS), and partially funded by the European Network of Excellence Intuition (NoE Intuition).