Using Simple Force Feedback Mechanisms as Haptic Visualization Tools.


Anders J Johansson, Joakim Linde
Teiresias Research Group (www.bigfoot.com/~teiresias)

Abstract

Force feedback (FF) is a technology of great interest in the context of human-machine interfaces, because it can serve as a haptic interface, making it possible to model and simulate objects and textures. One problem with haptic FF devices today is that most of them are of professional or research grade, which makes them expensive. FF technology has also been introduced in another context, that of home entertainment and computer games. Here there are a number of high-volume, (relatively) low-fidelity products available at a much lower price (around 1% of a professional system). We have investigated whether one of these products, the Microsoft Sidewinder Force Feedback Pro, can be used for the visualization of objects. The chosen objects were labyrinths with stiff walls. The result is that it is possible, with some important limitations.

1 Introduction

Force feedback (FF) is a technology of great interest in the context of human-machine interfaces [1]. Haptic devices today are often of high quality, but also expensive. Another class exists in the context of computer games. These devices have a low price and often good documentation and programming interfaces, but limited fidelity. We have tried one of these devices as a visualization tool for two-dimensional structures. The term visualization is used in this text to mean the process by which an individual builds an internal model of an object or structure.

2 Game Devices

Game devices incorporating force feedback can be divided into three categories: vibration devices, wheels and joysticks. They are all net force displays, in that they mediate the virtual touch on an object through a tool, the tool being the handle of the joystick or the steering wheel.
We can classify them by the number of dimensions in which they offer force feedback. A vibration device, which only conveys a vibration to the user, has a dimension of zero. One example is a traditional gamepad with the addition of a vibrating mechanism. One-dimensional FF devices can vary the feedback according to the position of the input device in one dimension. A steering wheel mounted on a base is an example: the FF is applied to the rotation of the wheel and can simulate G-forces, an uneven road etc. Two-dimensional devices are the most advanced available, and the most interesting. The most common are joysticks, which have two degrees of freedom (DOF) with FF applied to both. This makes it possible to restrict the movement, exert forces, or apply waveforms to simulate different conditions. Professional systems often have three DOF, sometimes six, with FF in at least three of them. These devices can simulate volumes, not only objects in the plane to which the joystick is constrained.

There also exists another, related class of equipment: platforms whose orientation can be adjusted relative to the ground. This makes it possible to simulate G-forces by tilting the reference frame of the user, which is registered both by the vestibular system of the inner ear and by the muscles of the body. These devices are not haptic devices by the definition of H. Z. Tan et al. [2]; they are more correctly called adjustable-frame devices. They are what is used in the popular motion-ride simulators at amusement parks, and they can have anything from zero to six degrees of feedback. The two classes, force-feedback and adjustable-frame, meet at the limit case of zero-degree feedback. The gamepad with a vibrating actuator is a FF device when the vibration is coupled to the pad as an input device, but could also be seen as an adjustable-frame device, as the feedback is not tightly coupled to the buttons.
One aspect that limits the fidelity of a FF device is the speed of the control loop. Some professional systems run the control loop on the host computer and use high-speed communication with the device. This requires a powerful host computer, as it must handle not only the application program but also the control loop of the FF device. The game devices have solved this by placing a simple co-processor in the device, which handles the control loop. This solution is presented by M. Ouhyoung et al. in [3]. It makes it possible to use low-speed communication, as only a description of the control loop must be passed to the device, and only when the loop parameters change. It also makes it possible to use a simpler host computer.

3 The Labyrinth Application

Our test application is a visualization of labyrinths, or mazes. It has a number of different objects in its database, which are chosen from the joystick. The labyrinths range from simple examples to representations of complex historical labyrinths, the most complex being the garden maze of Versailles. All interaction with the program is done through the joystick and its buttons, and all feedback from the program is given by FF.

When the program is started, a maze is chosen by pressing the corresponding button on the base of the joystick. The handle is then gripped, which is sensed by the computer, and the program moves the handle to the start of the labyrinth. From here the user is free to explore the structure, as he can feel the walls simulated by FF in the joystick. When he finds the exit, this is signalled by an oscillation. If the handle is released and gripped once more, the user is moved back to the start point.

All structures are simulated in the absolute 2D plane in which the joystick handle moves. The absolute position within the movement range of the handle is used as the desired position in the virtual structure. We have also developed a visual version of the program, in which the user can see the structure and his position in it. This includes a utility with which it is possible to draw a structure with the mouse and then feel it with the joystick. This is a very useful tool for investigating the limits of the performance.
4 Experimental Hardware

We used the Microsoft Sidewinder Force Feedback joystick (figure 1). It has an onboard 16-bit processor running at 25 MHz, which handles all the force effects. Communication with the host PC is done over a MIDI interface at a speed of 31 kbaud. This is at the limit of what closing the control loop at the PC would require, as a good haptic presentation demands an update rate of ~1 kHz [2]. Instead, force effects are downloaded into the joystick's onboard memory and started by a separate command. Because the FF control loop is closed in the joystick, the slow control channel from the PC does not lower the fidelity of the FF.

Figure 1: The Microsoft Sidewinder FF joystick.

The joystick supports a number of effects, from simple raw forces in an arbitrary direction to complex force waves and spatially located walls. These walls are what we have used in our implementation. They are placed in the joystick's plane by giving an angle (only 0, 90, 180 and 270 degrees are supported), a distance and a facing. The co-processor then takes care of all the control: it decides whether the handle is inside or outside the wall and applies the corresponding forces. Up to four walls are supported concurrently. The application uses the DirectX 5 software interface to the joystick [4].

5 Algorithm

In the labyrinth application we decide at each instant which four walls are active, that is, which wall in each direction will constrain the movement. We use this high-level representation when communicating with the co-processor: only the placement of the walls is sent to it. As each wall is modeled in the co-processor as a stiff boundary, this is a plane-and-probe approach in the sense of W. Mark et al. [5]. The co-processor handles the modeling of the constraining walls (the planes) and updates the forces on the joystick handle (the probe) at a high rate. The host computer updates the placement of the walls at a much lower rate.
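The device firmware is not public, but the behavior described here, a stiff axis-aligned wall rendered with a clipped maximum force, can be sketched as a simple spring model. The following Python sketch is purely illustrative: the stiffness k, the force limit f_max, and the coordinate convention are assumptions, not the joystick's actual parameters.

```python
def wall_force(pos, axis, limit, facing, k=2.0, f_max=1.0):
    """Force (fx, fy) a clipped-spring wall applies to the handle.

    pos    -- (x, y) handle position
    axis   -- 0 for a wall on the x coordinate, 1 for the y coordinate
    limit  -- wall offset along that axis
    facing -- +1 if free space lies at coordinates above `limit`, -1 below
    k, f_max -- assumed stiffness and maximum force (hypothetical values)
    """
    # Penetration depth along the wall normal (positive = inside the wall).
    depth = (limit - pos[axis]) * facing
    if depth <= 0.0:
        return (0.0, 0.0)              # handle is in free space
    force = [0.0, 0.0]
    # Clipped spring: the limited maximum force is what lets a user
    # push the handle through the wall, as discussed in section 5.1.
    force[axis] = facing * min(k * depth, f_max)
    return tuple(force)
```

For example, a wall at x = 0.6 facing +1 pushes a handle at x = 0.5 back with force 0.2 along x, and once the penetration exceeds f_max / k the force saturates instead of growing.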
5.1 Modeling

The modeling of the user's location consists of two points: a virtual location and a real location. The real location is the position of the input device, i.e. the position the user wants to be at. The virtual location is where the user is in the virtual world, which is constrained by the structure being visualized. The limit on the maximum force of the commercial joystick means that a normal person can place the handle in an arbitrary position regardless of the force being applied.

When the virtual and real locations are not identical, all the objects in the database are traversed to find out which one or ones, if any, are constraining the movement. An example is given in figure 2. The user is at the location marked by the dot, and the active walls, which will be sent to the joystick, are marked in grey. These are the instantaneous boundaries of the movement of the dot. Note that if the dot is moved into the corridor to the right, the upper and lower walls will be moved to represent the new constraints, even though the dot does not touch them.

Figure 2: Example of wall placement in a labyrinth.

5.2 Movement Constraints

To handle the situation where the user has collided with a wall, we had to expand the model. Because of the limited maximum force capability, we had to handle the case where the user deeply penetrates the wall. This is shown in figure 3.

Figure 3: Wall constraint (previous location, new location, desired location and constraining line).

The dot marks the real location in the previous time-step; the cross marks the new position of the real location. The corresponding movement of the virtual location is not permitted, as it would involve crossing a line. The new virtual location is instead chosen according to a rubber-band principle: it is placed as if the handle and the dot were connected by a spring, with the virtual location gliding on the frictionless surface of the line. That is, we take the handle's position not as the user's absolute location in the structure but as the desired one; the user drags himself around the structure by an elastic band. In the algorithm, the new virtual location is taken as the projection of the desired location onto the line.
For a straight line this is the point where the normal of the line passes through the desired location. This approach is similar to the god-object method in [6].

One initial problem was that we modeled all objects as mathematical lines and points. This placed the new virtual location exactly on the line, which then did not hinder movement off the line in the next time-step; in this way the user's virtual location tunneled through walls. The solution is to give the virtual location a size, represented in figure 3 by the size of the dot. The new location is first placed on the line and then bumped to the correct side by inflating a virtual balloon around it. The correct side is determined by bumping the new location to both sides of the line, comparing the distance from each candidate to the previous virtual location, and choosing the one with the shortest distance.

This approach differs from one that would model the softness in the FF, caused by the insufficient maximum force capability, as soft objects. If we did that, the virtual location would deform the line and would be able to pass through an inside corner consisting of two lines by pushing them aside.

5.3 Collision Detection

Collision detection is performed at every time-step: each line object is asked whether it limits the movement of the virtual location toward the desired location. The real location is taken as the desired location in the evaluation of the first object, which calculates whether it constrains the movement. If it does, it produces two output data: a new valid desired location, taken as the projection of the old one onto the line, and the distance between the previous virtual location and the constraining line. This data is then given to the next line object, which does the same calculations but only changes the valid desired location if it constrains the movement and its calculated distance is less than the previous smallest one.
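The projection and balloon steps can be sketched in a few lines of Python. This is an illustrative reconstruction, not the original code; the tuple representation of points and segments and the radius value are assumptions. Clamping the projection parameter to the segment also reproduces the endpoint snap used for corners.

```python
import math

def constrain(prev, desired, a, b, radius=0.02):
    """Project `desired` onto segment a-b, then bump the result off the
    line toward the side of `prev` (the virtual balloon that stops the
    location tunnelling through walls). Points are (x, y) tuples;
    `radius` is an assumed size for the virtual location."""
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    # Projection parameter, clamped so the result snaps to an endpoint
    # when the projection falls outside the line (corner handling).
    t = ((desired[0] - ax) * dx + (desired[1] - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    px, py = ax + t * dx, ay + t * dy
    # Unit normal of the line.
    n = math.hypot(dx, dy)
    nx, ny = -dy / n, dx / n
    # "Inflate the balloon": bump to both sides and keep the candidate
    # closer to the previous virtual location -- the legal side.
    cands = [(px + nx * radius, py + ny * radius),
             (px - nx * radius, py - ny * radius)]
    return min(cands, key=lambda c: (c[0] - prev[0]) ** 2 + (c[1] - prev[1]) ** 2)
```

With a horizontal wall from (0, 0) to (1, 0) and a previous location above it, a desired location below the wall is constrained to a point slightly above it, on the previous side.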
After all of the line objects have done these calculations, the process is iterated until no line object changes the position. The new valid desired location is then taken as the new virtual location. This process is repeated at 20 Hz, which is the rate at which updated real locations are read from the joystick and new walls are sent to it. This approach does not use any bounding spheres or boxes as in [7] and [8].
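The iteration over line objects can be sketched as a fixed-point loop. Again an illustrative reconstruction: the constrain_fn helper, its return convention (a constrained point and a distance, or None when the segment does not block the motion), and the iteration cap are assumptions.

```python
def resolve(prev, desired, segments, constrain_fn):
    """Iterate the per-segment constraint until no segment moves the
    candidate location, then return it as the new virtual location.

    constrain_fn(prev, candidate, segment) is a hypothetical helper that
    returns (constrained_point, distance) when the segment blocks the
    prev->candidate movement, or None when it does not.
    """
    candidate = desired
    for _ in range(16):                     # assumed iteration cap
        best, best_dist = None, float("inf")
        for seg in segments:
            hit = constrain_fn(prev, candidate, seg)
            if hit is not None:
                point, dist = hit
                if dist < best_dist:        # keep the nearest constraint only
                    best, best_dist = point, dist
        if best is None or best == candidate:
            return candidate                # fixed point: no segment objects
        candidate = best                    # re-run with the corrected point
    return candidate
```

Run at 20 Hz, this would take the latest handle position as `desired` and feed the resulting virtual location back into the wall-placement step.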

Figure 4: Handling of corners.

Figure 4 shows the special case of the endpoints of constraining lines. Here the straight movement of the virtual location is not legal, as it crosses the line. An intermediate virtual location is taken as the projection of the desired location onto the line. As the line ends before it reaches this projection point, we take the endpoint as the intermediate location. The next iteration of the movement then gives a valid movement, as the line from the intermediate location to the desired location does not cross the constraining line. In this way the virtual location snaps around the end of the line.

5.4 Slanted Lines

As the hardware only supports walls at angles of 0, 90, 180 and 270 degrees, we had to approximate slanted lines with two orthogonal ones.

Figure 5: Approximation of a slanted line. (a) Previous virtual location. (b) Intermediate and desired locations.

The dot in figure 5(a) is not constrained by the walls placed to simulate the slanted line. But when the dot is at the constraining line, the walls are placed to simulate the constraint. If the dot is moved along the line, it has to penetrate the wall a little. The new real location then gives a permitted movement of the virtual location, and the walls are moved. The effect is a texture on the surface. This texture is direction-dependent, in the same way as velvet: in figure 5(b), movement down-right is opposed by a continually repositioned vertical wall, movement up-left by a horizontal one.

5.5 Rigidity

We had to stabilize the system, as the user can easily induce an oscillation between the walls of a narrow corridor due to the limited maximum force of the walls. As the system had pre-implemented wall objects, we could not adjust their behavior according to known solutions [9]. Instead we added a viscosity to the whole world, simulating the user dragging his position around in a liquid by a rubber band. As the viscosity dampens the movement, the problem of user-induced oscillations is diminished.

6 Results

The low-fidelity, consumer-grade equipment is useful for visualization tasks. When a high-level representation of objects is used instead of forces, the slow communication channel between the host computer and the device induces no degradation of performance.

The more complex shapes in our test application are hard to visualize from the tactile sense alone. This is because we work within a fixed area, so increased complexity of the structure translates to decreased feature size. These very small structures are blurred by the small force capability of the joystick, as the handle always penetrates them to a non-negligible degree when the user feels their contact force.

7 Conclusion

The possibilities of using low-cost force feedback hardware for visualization tasks are good. The hardware has limitations in comparison with commercial research-grade equipment, mostly due to its limited maximum force capability; methods of compensating for this have been described. The biggest difference from the higher-grade systems is the reduced dimensionality of the device: 2D instead of 3D. But if the visualization task can be translated to the 2D domain, the limited devices are a valid option.

Acknowledgments

We want to thank the Microsoft Hardware Group for supplying us with hardware and software. We also want to thank Certec at Lund University, who supplied the computer resources necessary for the project.

References

[1] M. A. Srinivasan, C. Basdogan, Haptics in Virtual Environments: Taxonomy, Research Status, and Challenges, Computers & Graphics, Vol. 21, No. 4, pp. 393-404, 1997.
[2] H. Z. Tan, B. Eberman, M. A. Srinivasan, B. Cheng, Human Factors for the Design of Force-Reflecting Haptic Interfaces, Dynamic Systems and Control, Vol. 55-1, Book No. G0909A, 1994.
[3] M. Ouhyoung et al., A Low-Cost Force Feedback Joystick and Its Use in PC Video Games, IEEE Transactions on Consumer Electronics, Vol. 41, No. 3, August 1995.
[4] B. Bargen, P. Donnelly, Inside DirectX, Microsoft Press, 1998, ISBN 1-57231-696-9.
[5] W. R. Mark, S. C. Randolph, M. Finch, J. M. Van Verth, R. M. Taylor II, Adding Force Feedback to Graphics Systems: Issues and Solutions, Computer Graphics Proceedings, Annual Conference Series (ACM SIGGRAPH), 1996.
[6] C. B. Zilles, J. K. Salisbury, A Constraint-Based God-Object Method for Haptic Display, IEEE International Conference on Intelligent Robots and Systems, 1995.
[7] D. C. Ruspini, K. Kolarov, O. Khatib, Robust Haptic Display of Graphical Environments, The First PHANToM Users Group Workshop, eds. J. K. Salisbury and M. A. Srinivasan, Dedham, MA, Sept. 1996.
[8] D. K. Pai, L.-M. Reissel, Haptic Interaction with Multiresolution Image Curves, Computers & Graphics, Vol. 21, No. 4, 1997.
[9] T. Massie, Taking the Mush Out of Haptics with Infinitely Stiff Walls, The First PHANToM Users Group Workshop, eds. J. K. Salisbury and M. A. Srinivasan, Dedham, MA, Sept. 1996.