Phantom-Based Haptic Interaction


Aimee Potts
University of Minnesota, Morris
801 Nevada Ave. Apt. 7
Morris, MN 56267
(320) 589-0170
pottsal@cda.mrs.umn.edu

ABSTRACT

Haptic interaction is a new field of research that adds the sense of touch to virtual environments. Users are given the illusion that they are touching or manipulating a real physical object. Haptic devices include joysticks, touch mice, and gloves with embedded sensors. The Phantom haptic interface was created by J. Kenneth Salisbury and Thomas Massie at the Massachusetts Institute of Technology. It has a pen (stylus) or a fingertip thimble, which can be replaced by other tools depending on the application. Most Phantom haptic interfaces use three degrees of freedom, which makes the force-feedback system more precise and gives it advantages over other haptic devices. Collision detection is an important issue for the interaction between the thimble or stylus and the object. Applications include medical simulation for training in surgical procedures and three-dimensional (3D) painting or 3D clay modeling for designers.

Keywords

Haptics, haptic interface, haptic interaction, force feedback.

1. INTRODUCTION

The word haptic comes from the Greek haptesthai, meaning to touch. The international scientific vocabulary defines haptic as "relating to or based on the sense of touch" or "characterized by a predilection for the sense of touch." Human haptics is defined as human touch perception [8]. Haptic devices such as joysticks, mice and gloves all provide the user with some kind of tactile sensation. The addition of haptics to virtual environments gives computer users the ability to interact in multiple dimensions. Touch is important in the physical world, but in the virtual world it has been noticeably absent: before haptics, computer users could only visualize virtual objects.
For example, if a ball was hit in the virtual environment, the user could see that the ball was hit but could not feel it. New users of haptic interfaces are often surprised and intrigued by the reality of their experiences. One blind user was fooled when he touched a virtual object: he examined the object's surface with his finger and was not surprised at all until he was reminded that there was no physical object present. Startled, he jumped and began reaching out for the nonexistent object with his other hand [8]. Another example of the reality of the experience is the demonstration of a medical procedure. A needle biopsy is a procedure in which a doctor inserts a long needle into the brain. When this procedure has been demonstrated using a haptic device, many doctors' reaction is that the needle seems a bit dull [7]; they are more concerned with the procedure itself than with the fact that it is only a simulation. The reality of the simulation also shows when a haptic simulation is suddenly removed: the effect is described as similar to sitting down without realizing the chair has been pulled out. Haptic interactions give users the illusion that they are dealing with real, physical objects, and interactions of this degree of realism motivate research in this new field. In this paper, I concentrate on the Phantom haptic device. Although there are several different Phantom versions, I discuss Phantom haptic devices in general unless I mention one specifically.

(Permission is granted to make copies of this document for personal or classroom use. Copies are not to be made or distributed for profit or commercial purposes. To copy otherwise, or to publish this material in any way, requires written permission.)
The Phantom's mechanics, its advantages over other haptic devices, its applications, and its problems and limitations will be explained in more detail. The paper begins with a brief history of haptics and continues into the description of the Phantom haptic interface.

2. PHANTOM HAPTIC INTERFACE

In the past, people used devices that would give them a sense of touch. Simple tongs, mechanical hands, and levers were used to control remote actions such as pouring hot liquids or grabbing flasks. In the 1940s, manipulation systems were designed for handling hazardous waste and nuclear materials, and they are still used today [9]. Knoll, a designer at Bell Labs in the 1960s, was one of the first to demonstrate the touching of a virtual shape using a computer haptic interface [9]. Since then, researchers have continued to design devices with the expectation that they will greatly enhance 3D visualizations.

In 1993, J. K. Salisbury and Thomas Massie of SensAble Technologies introduced and commercialized the Personal Haptic Interface Mechanism, the PHANTOM. The Phantom haptic interface began a new field of research called computer haptics, defined as the discipline concerned with the techniques and processes associated with generating and displaying synthesized haptic stimuli [9]. The Phantom allows the user to interact with a variety of virtual objects. The device exerts force on the computer user; this force feedback gives the illusion of interaction with solid physical objects. The Phantom is an electromechanical desktop device that connects to the computer's input/output port [8]. The user inserts a finger into a thimble or holds a stylus supported by a mechanical arm. The thimble or stylus tracks the motion and position of the user's fingertip while applying forces to the user. The Phantom is driven by three direct current (DC) motors with sensors and encoders attached to them. The number of motors corresponds to the number of degrees of freedom a particular Phantom system has, although most systems produced have three motors. The encoders track the user's motion and position along the x, y and z coordinates, and the motors exert forces on the user along the x, y and z axes. A cable runs from the motors to an aluminum linkage, which connects to a passive gimbal attached to the thimble or stylus. A gimbal is a device that permits a body freedom of motion in any direction or suspends it so that it remains level at all times. As explained later in the paper, because the three degrees of freedom meet at one contact point, no torque is measured, only the force applied to that point. Friction and inertia must be kept constant and low to limit distractions of the user [6].
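The sense/actuate cycle just described — encoders reporting position along three axes, motors commanding force along the same axes — can be sketched in a few lines. This is only an illustrative sketch: the encoder resolution, the function names, and the device API are hypothetical, not SensAble's actual interface; the 10 N ceiling is the device's peak force as quoted later in this paper [6].

```python
# Minimal sketch of the Phantom's sense/actuate cycle.
# COUNTS_PER_MM, the function names, and the callback-style device API
# are hypothetical illustrations, not SensAble's real interface.

MAX_FORCE_N = 10.0        # peak force quoted for the device [6]
COUNTS_PER_MM = 400.0     # hypothetical encoder resolution per axis

def counts_to_position_mm(counts):
    """Convert raw (x, y, z) encoder counts to millimetres."""
    return tuple(c / COUNTS_PER_MM for c in counts)

def clamp_force(force_n):
    """Saturate a commanded (Fx, Fy, Fz) so no axis exceeds the peak force."""
    return tuple(max(-MAX_FORCE_N, min(MAX_FORCE_N, f)) for f in force_n)

def servo_step(read_encoders, compute_force, command_motors):
    """One iteration of the haptic loop: read the probe position, compute a
    reaction force from the virtual scene, and drive the three motors."""
    position = counts_to_position_mm(read_encoders())
    command_motors(clamp_force(compute_force(position)))
    return position
```

In a real device this step would run continuously at the kilohertz rates discussed below, with `read_encoders` and `command_motors` bound to the hardware driver.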
The haptics system must also be able to sense the forces applied by the user and deliver the sensation back in real time. The Phantom was designed under a few important considerations, first among them being force and motion. In the physical world we exert forces whenever we touch anything. These forces, together with the position and motion of our hands and arms, are transmitted to the brain as kinesthetic information [9]. This information, along with cutaneous (touch) senses and our force and motor capabilities, is what allows us to touch and manipulate objects and relate them to the space around us. The Phantom haptics system must likewise be able to interpret force and motion information. It must be able to determine how objects move when forces are applied, and also the geometry of the object (the texture and friction of its surface). Events tracking the change in position or motion of the probe, and collision detection between the object and another object or the probe (explained later in the paper), are all important. The Phantom was designed with three degrees of freedom because very little torque (twisting or rotating) is involved with either the thimble or the stylus. Degrees of freedom are the directions in which the user can move. For a user to touch all sides of a virtual three-dimensional object, the haptics system needs three degrees of freedom; another three are needed if the user wants to rotate the object freely. Because the first Phantom haptic interface used only three degrees of freedom, the system can model the interaction as a point contact in the virtual environment. This simplifies programming: a point contact involves little torque and is therefore less complex. These considerations were combined into three main criteria to attain a balanced, effective system. The first criterion is that the user of the Phantom must feel free in virtual space.
There cannot be any external forces present, and there must be low inertia and little friction. The Phantom's friction is measured at less than 0.1 newton (N); the user feels no more than 100 grams of inertial mass, and the unbalanced weight is less than 0.2 N at all points in the workspace [6]. The second criterion is that virtual objects must be perceived as stiff. A virtual object or surface can only be as stiff as the control algorithm allows it to be. The maximum stiffness is about 35 N/cm, although according to Massie and Salisbury most users will be convinced of a stiff surface at 20 N/cm [6]. Sound is also a factor in the perceived stiffness of virtual objects: if a user touches a hard surface and hears a knock, the user is likely to accept the surface as stiff. The third criterion is that virtual walls must be solid and immovable to the user. This means that the force exerted by the user must be counteracted by the Phantom system. The maximum force the system can exert is only 10 N. However, it has been shown that during precise manipulation a user exerts 10 N or less, and on average a user exerts only about 1 N, while the Phantom's maximum continuous force capability is 1.5 N [6]. The system is therefore capable of responding to ordinary manipulation activities. The Phantom needs to match the human sensory, motor and cognitive systems. The system does not have to completely replicate a normal human being, but the touch sense is harder to satisfy than vision or hearing [9]. For example, to match human vision only 30 to 60 frames per second are needed for the viewer to perceive continuous motion. Human touch is far more sensitive, so the motors' commands must be updated 1,000 times per second to provide a continuous feeling.

2.1 Collision Detection

There are two other important issues in haptic interaction. The first is collision detection between the end point of the probe and the objects.
The second is collision response to the detection, meaning how much force is reflected to the user [2]. A collision algorithm must reduce computation time while giving a realistic response to the user. I will explain an efficient algorithm, called Neighborhood Watch, for collision detection and response. The first step is to build a hierarchical tree structure to store the properties of the geometrical object (indexed triangles) [2]. These objects are made up of two-dimensional primitives: polygons, lines and vertices. Each primitive has neighboring primitives; for example, polygons share neighboring lines and vertices. Each primitive also has a normal vector. A polygon's normal vector is perpendicular to its surface, a line's normal vector is the average of the normal vectors of its neighboring polygons, and a vertex's normal vector is the average of the neighboring polygons' normals weighted by the angle at that vertex. The endpoint of the probe is called the HIP (Haptic Interface Point), and the IHIP is the Ideal Haptic Interface Point [2]. The IHIP is needed because the distance from the IHIP to the current HIP gives the depth of penetration of the probe. When the HIP moves, the IHIP moves also. If the HIP is outside the object, so is the IHIP; if the HIP penetrates the object, the IHIP remains on the surface. A line segment between the previous and new coordinates of the HIP is used to track its path and determine whether it has penetrated an object. To determine collisions quickly, the polygons are hierarchically partitioned until each polygon is in its own bounding box. A bounding box is a box placed around an object so that it contains all of the object's parts in its interior. First, the algorithm checks for a collision between the line segment and the bounding box of the whole object [2]. If the line segment is inside the bounding box, a collision is detected, meaning the probe has come near the object and has collided with its bounding box. Collisions are then checked against each partitioned bounding box, starting at the top of the tree. The final collision check is between the line segment and the polygon itself, at the lowest level of the tree. If the line segment penetrates the polygon, that polygon becomes the first contacted geometric primitive. We then calculate the closest distance from the current HIP to the contacted geometric primitive, which could be a polygon, a line or a vertex. If it is a polygon, we check the distances from the current HIP to the polygon's neighboring points and lines. The primitive with the shortest distance to the current HIP becomes the new contacted primitive, and the IHIP is moved to the point on this primitive that is closest to the current HIP. This is repeated for each interaction. A vector between the HIP and the IHIP is used to determine whether the HIP is still inside the object: if the dot product of this vector with the normal vector has one sign, the HIP is outside the object; if it has the opposite sign, the HIP is still inside. The magnitude of the vector is used to calculate the force applied to the object by the probe.
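The collision-response step above can be sketched for the simple case of a flat surface patch. This is an illustrative sketch, not the paper's full algorithm: the function names are hypothetical, and the sign convention here takes the vector from the IHIP toward the HIP with an outward-facing normal, so a negative dot product indicates penetration. The 20 N/cm stiffness and 10 N force ceiling are the values quoted in Section 2 [6].

```python
# Sketch of the HIP/IHIP penetration test and spring-law collision
# response for one surface patch. Names and the plane-only geometry
# are illustrative; the real algorithm walks a mesh of primitives.

import math

STIFFNESS_N_PER_CM = 20.0   # stiffness most users accept as "stiff" [6]
MAX_FORCE_N = 10.0          # peak force the device can exert [6]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def penetration_force(hip_cm, ihip_cm, outward_normal):
    """Return the reaction force (N) to reflect to the probe tip.

    hip_cm  -- current Haptic Interface Point, in centimetres
    ihip_cm -- Ideal HIP, constrained to the object's surface
    With the vector taken from the IHIP toward the HIP, a negative dot
    product with the outward surface normal means the HIP is inside.
    """
    v = tuple(h - i for h, i in zip(hip_cm, ihip_cm))
    if dot(v, outward_normal) >= 0:
        return (0.0, 0.0, 0.0)          # HIP outside: no contact force
    depth_cm = math.sqrt(dot(v, v))     # penetration depth = |HIP - IHIP|
    magnitude = min(STIFFNESS_N_PER_CM * depth_cm, MAX_FORCE_N)
    return tuple(magnitude * n for n in outward_normal)
```

For example, with the surface as the plane z = 0, a HIP 0.25 cm below the surface yields a 5 N force pushing the user's fingertip back out along the normal, and the clamp keeps deep penetrations within the device's 10 N limit.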
The pseudo-code for the algorithm described above follows; comments are included for clarity.

if (collision == false)
    // No contact yet: check whether the probe's path entered an object.
    if (the path of the HIP penetrates a polygon)
        set that polygon as the contacted primitive
        move the IHIP to the point on its surface nearest the HIP
else
    // Already in contact: walk the neighbors to follow the surface.
    repeat
        set primitive1 = contacted primitive
        set distance1 = closest distance from the current HIP to primitive1
        for (i = 1 to the number of neighboring primitives of primitive1)
            set primitive2 = the i-th neighbor of primitive1
            set distance2 = distance from the current HIP to primitive2
            if (distance2 < distance1)
                // A neighbor is closer to the current HIP, so the probe
                // has moved across the surface; adopt that neighbor.
                set contacted primitive = primitive2
                set distance1 = distance2
    until (primitive1 == contacted primitive)
    move the IHIP to the point on the contacted primitive nearest the current HIP
    set vector1 = vector between the HIP and the IHIP
    set normal1 = normal vector of the contacted primitive
    if (dot product(vector1, normal1) > 0)
        // Contact without penetration.
    else
        // The HIP has penetrated the object.
        set penetration vector = vector1
        set penetration depth = magnitude of the penetration vector

The purpose of the Neighborhood Watch algorithm is to check for collisions between objects and between the probe and objects. The technique can handle both convex and concave objects, resulting in more stable haptic interactions with more complex objects and a reduction in computation time [2].

2.2 Advantages of the Phantom Interface

The design of the Phantom haptic interface gives it advantages over other haptic devices. Its size resembles a small desk lamp, and its workspace is about the size of a mouse pad. This lets the user work with the device on a desktop while still having enough workspace to use it freely. Exoskeletal devices do not allow this much freedom of motion while maintaining high fidelity; gloves, for example, provide more degrees of freedom but with less precision. Because the Phantom operates on point contact, it achieves much higher fidelity and can therefore be used for demanding technical applications. The Phantom setup allows the stylus or thimble to function as a surgical tool, a paintbrush or another tool, depending on the application. Other devices do not allow such a wide range of uses.

2.3 Applications of the Phantom Interface

One of the greatest advantages of the Phantom haptic interface is its wide variety of applications. One of the first broad applications is training people to perform real-world tasks. In the field of medicine, touch is an important sense, and medicine has been one of the most researched topics in haptics. Medical students need to train in procedures usually performed on live patients, gaining skill over time. The Phantom gives these students the ability to train on surgical simulators. This reduces training time and allows them to practice more complex operations before actually operating. A simulation can be recorded and later reviewed to evaluate or verify skill in the procedure; a surgery can also be recorded so that a student can feel a doctor's prerecorded procedure. Boston Dynamics Inc. is a large company that designs and commercializes surgical simulators. Their simulator is used for arthroscopy of the knee, anastomosis (the suturing of tube-like organs), or the treatment of limb trauma [3]. The user stands just above the operation, as in a normal surgery, and looks down at a 3D visualization, seeing the life-like operation and the surgical tool being used on screen. This setup requires two Phantom haptic interfaces, and the cost is near $175,000 [3]. More testing must be done to evaluate the simulator's performance before the medical field will accept it: it must reduce learning time, with greater success rates, at a lower cost.
The Phantom is well suited to minimally invasive surgeries such as laparoscopy and arthroscopy, in which doctors must insert long tools with cameras to view the operation. In these procedures there is no direct contact, so the Phantom's setup, precision and high fidelity can greatly enhance the quality of the surgery. The Center for Human Simulation at the University of Colorado Health Sciences Center is researching haptic simulation of knee and eye surgeries based on highly detailed, life-like models [9]. Researchers at Millersville University and the Penn State University College of Medicine in Pennsylvania are working on a haptic surgical suturing simulator. Their first goal is to use the simulator to measure the difference in skill between expert and novice surgeons. The researchers also want the system to be cost-efficient and effective so that it will be available to anyone. The hardware for this system is a Phantom 1.5 desktop unit, positioned upside down. Students can feel the soft tissue surrounding a wound and the forces involved in suturing (pulling and pushing the needle through the skin). The system runs on a Windows NT workstation, and students wear special glasses that provide stereographic imagery while they look down at the patient. Their current model is suitable for teaching and measuring suturing skill; in the future they intend to add twist constraints, or torque, in a six-degrees-of-freedom model. 3D modeling, or clay modeling, gives users the ability to work with a virtual surface or a ball of digital clay, so that complex shapes can be created and manipulated. The difficulty of moving between the physical and digital worlds has kept many designers working with the more familiar real clay models. Toy Story, the first major computer-animated movie, began to change this, but its designers still modeled with clay before digitizing [5].
The digitization process is difficult and error-prone, so modelers want a new system. Industrial designers and modelers can benefit from haptic 3D modeling packages, which improve on classic software for four main reasons. Force feedback helps position objects correctly in 3D space; it makes visualization clearer by letting the user feel the models; it assists in communicating the physical properties of the model [5]; and it lets users continuously manipulate objects [5]. Such programs let designers work with more creativity. SensAble Technologies has sold over 500 Phantom devices to major companies including Hasbro, Boeing and Honda [1]. At Adidas, designers sculpt shoes using SensAble's FreeForm, a new version of the Phantom system that allows simulated clay modeling. Adidas expects to design prototypes more easily, reducing the time spent by as much as 80%. SensAble Technologies has also designed the 3D Touch system, a 3D modeling system that includes the Phantom hardware interface, the GHOST (General Haptics Open Software Toolkit) software and touch-enabled 3D applications. Many designers manipulate 3D images using NURBS (Non-uniform Rational B-Splines), a computer graphics representation that requires users to move control points that only roughly correspond to the shape of the model's surface [8]. The control points are manipulated with a keyboard or a 2D mouse, and modelers find it hard to estimate the positions and amounts of the changes they make to the model. In 3D painting, the Phantom's stylus functions as a paintbrush, and the user applies realistic textures to the virtual model. The forces are consistent with the forces a user would need to reach out and paint.
This gives the user more freedom and creativity in the 3D visualization than other 3D paint programs allow. In 1998, Interactive Effects in Tustin, California announced that it had integrated SensAble's 3D Touch system with its Amazon 3D Paint 3.0 package [3]. With the new force-feedback version, artists can make their brushes flare or feather depending on how much pressure is applied. The user can rotate the brush-functioning stylus freely and can feel the varying textures of the surface being painted, as well as the viscosity of the paint.

2.4 Problems and Limitations of the Phantom

Certain problems and limitations remain with haptic interfaces. Until recently, research concentrated on the devices themselves; now software developers are working to meet the needs of these haptic devices. For most haptic interfaces the software is included in the complete setup. Haptic models require much more computing power than computer graphics programs. Haptic options have not been designed into existing software, and it is nearly impossible to integrate haptic features into existing software packages, so new software programs must be written. Device configurations are not all the same, so acceptable interface and control standards must be agreed upon [3]; otherwise, software developers cannot know whether their applications will operate the same way on the devices they were designed for. As mentioned earlier, for the Phantom to operate smoothly without vibrations, the motor commands must be constantly updated. It is very difficult to update, control and display the visualization at the same time. As a result, if a person operates the Phantom too quickly and too much of the image has to change, the image will lag behind the user's motions [4]. Many applications need multiple degrees of freedom, which is very complex, and this is compounded by the need for real-time operation. Single-point contact is nowhere near as powerful as whole-hand haptic interaction, but the latter's complexity means more research is needed. Price is certainly a limiting factor. About ten years ago the Phantom haptic interface cost nearly $40,000; by 1997 the cost had dropped to about $20,000, and today the Phantom's price stands at $10,000 to $15,000. Until the price drops significantly, it will continue to deter general acceptance.

3. CONCLUSION

There are several variables to consider in designing a haptic interface, which makes it a very complex effort. Haptic interaction is just beginning as a field in computer science and will continue to grow along with three-dimensional visualization. With many new Phantom haptic devices being sold to industrial companies, and other new devices such as touch mice, haptics will soon be part of a person's normal computer interaction. Many want to increase commercial use with lower-cost applications, and these will most likely begin with tactile games and desktop systems.
In the future, it may be possible to have virtual rooms where 10 to 40 people gather to share graphic and haptic displays. With one or two people per virtual room, a class could explore molecules, sculpt in a ceramics class, or participate in any class involving models that need to be manipulated, passed around or touched [9]. Demonstrations of multiple-finger interaction are expected, with or without multiple users interacting through haptic devices. SensAble Technologies will address stability during tasks that involve two fingers grasping objects, and interactions with tools such as screwdrivers and pliers. The future of haptics depends not only on technical advances but also on understanding how to convey information correctly and precisely. In this paper, I have explained the Phantom haptic interface and the collision detection pseudo-code, to show haptic interaction at a lower level. I have shown the many advantages and applications of the Phantom system, which far outweigh its limiting factors and problems. Even though haptics is a new field, I believe it will come to have a much greater role in commercial applications in the years to come.

4. REFERENCES

[1] Gallagher, Leigh, Feel Me, Forbes, 278 (March 2000).
[2] Ho, Chih-Hao, C. Basdogan and M. A. Srinivasan, Efficient Point-Based Rendering Techniques for Haptic Display of Virtual Objects, Presence, 8(5), 447-491 (October 1999).
[3] Hodges, Mark, It Just Feels Right, Computer Graphics World (October 1998).
[4] Mahoney, Diane P., The Power of Touch, Computer Graphics World, 20(8), 41-45 (August 1997).
[5] Massie, Thomas, A Tangible Goal for 3D Modeling, IEEE Computer Graphics and Applications, 18(3), 62-65 (May-June 1998).
[6] Massie, Thomas H. and J. K. Salisbury, The PHANTOM Haptic Interface: A Device for Probing Virtual Objects, Proceedings of the ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, IL (November 1994).
[7] Salisbury, J. K., Haptics: The Technology of Touch, HPCwire, http://www.sensable.com/community/haptwhpp.htm (November 1995).
[8] Salisbury, J. K. Jr., Making Graphics Physically Tangible, Communications of the ACM, 42(8), 75-81 (August 1999).
[9] Salisbury, J. K. and M. A. Srinivasan, Phantom-Based Haptic Interaction with Virtual Objects, IEEE Computer Graphics and Applications, 17(5), 6-10 (September-October 1997).