Using Simple Force Feedback Mechanisms as Haptic Visualization Tools.
Anders J Johansson, Joakim Linde
Teiresias Research Group

Abstract

Force feedback (FF) is a technology of great interest in the context of human-machine interfaces, because it can serve as a haptic interface, making it possible to model and simulate objects and textures. One problem with today's haptic FF devices is that most of them are of professional or research grade, which makes them expensive. FF technology has also been introduced in another context, that of home entertainment and computer games. Here there are a number of high-volume, (relatively) low-fidelity products available at a much lower price (around 1% of a professional system). We have investigated whether one of these products, the Microsoft Sidewinder Force Feedback Pro, can be used for the visualization of objects. The chosen objects were labyrinths with stiff walls. The result is that it is possible, with some important limitations.

1 Introduction

Force feedback (FF) is a technology of great interest in the context of human-machine interfaces [1]. Today's haptic devices are often of high quality, but also expensive. Another class exists in the context of computer games. These devices have a low price and often good documentation and programming interfaces, but limited fidelity. We have tried one of these devices as a visualization tool for two-dimensional structures. The term visualization is used in this text for the process by which an individual builds an internal model of an object or structure.

2 Game Devices

Game devices incorporating force feedback can be divided into three categories: vibration, wheels and joysticks. They are all net-force displays, in that they mediate the virtual touch on an object through a tool, the tool being the handle of the joystick or the steering wheel. We can classify them by the number of degrees of freedom in which they offer force feedback.
A vibration device, which only conveys a vibration to the user, has a dimension of zero. One example is a traditional gamepad with the addition of a vibrating mechanism. One-dimensional FF devices can vary the feedback according to the position of the input device in one dimension. A steering wheel mounted on a base is an example: the FF is applied to the rotation of the wheel, and can simulate G-forces, an uneven road etc. Two-dimensional devices are the most advanced available, and the most interesting. The most common are joysticks, which have two degrees of freedom (DOF) with FF applied to both. This makes it possible to restrict the movement, exert forces or apply waveforms to simulate different conditions. Professional systems often have three DOF, sometimes six, with FF in at least three of them. These devices can simulate volumes, not only objects in the plane to which a joystick is constrained.

There also exists another, related class of equipment: platforms whose orientation can be adjusted relative to the ground. This makes it possible to simulate G-forces by tilting the reference frame of the user, which is registered both by the vestibular system of the inner ear and by the muscles of the body. These devices are not haptic devices by the definition of H. Z. Tan et al. [2]; they are more correctly called adjustable-frame devices. They are what is used in the popular motion-ride simulators at amusement parks, and they can have anything from zero to six degrees of feedback. The two classes, force feedback and adjustable frame, meet at the limit case of zero-degree feedback: the gamepad with a vibrating actuator is a FF device when the vibration is coupled to the pad as an input device, but could also be seen as an adjustable-frame device, as the feedback is not tightly coupled to the buttons.

One aspect that limits the fidelity of a FF device is the speed of the control loop.
Some professional systems run the control loop on the host computer and use high-speed communication with the device. This requires a powerful host computer, as it must handle not only the application program but also the control loop of the FF device. The game devices have solved this by placing a simple co-processor in the device, which handles the control loop. This solution is presented by M. Ouhyoung et al. in [3]. It makes it possible to use low-speed communication, as only a description of the control loop must be passed to the device, and only when the loop parameters change. It also makes it possible to use a simpler host computer.

3 The Labyrinth Application

Our test application is a visualization of labyrinths, or mazes. It has a number of different objects in its database, which are chosen from the joystick. The labyrinths range from simple examples to representations of complex historical labyrinths, the most complex being the garden maze of Versailles. All interaction with the program is done through the joystick and its buttons, and all feedback from the program is given as FF. When the program is started, a maze is chosen by pressing the corresponding button on the base of the joystick. Then the handle is gripped, which is sensed by the computer. The program now moves the handle to the start of the labyrinth. From here the user is free to explore the structure, as he can feel the walls being simulated by FF in the joystick. When he finds the exit, this is signalled by an oscillation. If the handle is released and gripped once more, the user is moved to the start point again.

All structures are simulated in the absolute 2D plane that the joystick handle moves in: the absolute position within the movement range of the handle is used as the desired position in the virtual structure. We have also developed a visual version of the program, where the user can see the structure and his position in it. This includes a utility in which it is possible to draw a structure with the mouse and then feel it with the joystick, a very useful tool for investigating the limits of the performance.
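The division of labor between host and co-processor can be sketched as follows. This is an illustrative model only, not the actual Sidewinder firmware or its protocol: the field names (`angle_deg`, `distance`, `facing`, `stiffness`) are ours. The point is that the host downloads a small parameter block once, and the device-side loop can then run at its own high rate with no further host traffic.

```python
from dataclasses import dataclass

@dataclass
class WallEffect:
    """Parameter block downloaded to the device (names are illustrative)."""
    angle_deg: int      # only 0, 90, 180 or 270 are supported
    distance: float     # wall offset from the centre, in axis units
    facing: int         # +1: wall pushes toward the positive axis direction
    stiffness: float    # restoring force per unit of penetration

def device_force(effect: WallEffect, x: float, y: float) -> float:
    """One step of the device-side control loop (~1 kHz on the
    co-processor): compute the restoring force for a single wall
    from the downloaded parameters and the current handle position."""
    # A wall at 0/180 degrees constrains x; at 90/270 it constrains y.
    pos = x if effect.angle_deg in (0, 180) else y
    penetration = (effect.distance - pos) * effect.facing
    if penetration > 0:          # the handle is inside the wall
        return effect.stiffness * penetration * effect.facing
    return 0.0
```

The host only re-sends a `WallEffect` when its parameters change, which is what makes the slow serial channel tolerable.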
4 Experimental Hardware

We used the Microsoft Sidewinder Force Feedback Joystick (figure 1). It has an onboard 16-bit processor running at 25 MHz, which handles all the force effects. Communication with the host PC is done over the MIDI interface, at a speed of 31 kbaud. This is at the limit for closing the control loop on the PC and using it as the controller, as a good haptic presentation demands an update rate of ~1 kHz [2]. Instead, force effects are downloaded into the joystick's onboard memory and started by a separate command. Because the FF control loop is closed in the joystick, the slow control channel from the PC does not lower the fidelity of the FF.

Figure 1: The Microsoft Sidewinder FF Joystick

The joystick supports a number of effects, from simple raw forces in an arbitrary direction to complex force waves and spatially located walls. These walls are what we have used in our implementation. They are placed in the joystick's plane by giving an angle (only 0, 90, 180 and 270 degrees are supported), a distance and a facing. The co-processor then takes care of all the control: it decides whether the joystick is inside or outside the wall and applies the corresponding forces. Up to four walls are supported concurrently. The application uses the DirectX 5 software interface to the joystick [4].

5 Algorithm

In the labyrinth application we decide at each instant which four walls are active, that is, which wall in each direction will constrain the movement. We use this high-level representation when communicating with the co-processor: only the placement of the walls is sent to it. As each wall is modeled in the co-processor as a stiff boundary, this is a plane-and-probe approach in the sense of W. Mark et al. [5]. The co-processor handles the modeling of the constraining walls (the planes) and updates the forces on the joystick handle (the probe) at a high rate, while the host computer updates the placement of the walls at a much lower rate.
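The wall-selection step can be sketched like this. The wall representation `(axis, offset, lo, hi)` (an axis-aligned segment at coordinate `offset`, spanning `lo..hi` on the other axis) is a hypothetical encoding of ours, not the paper's data structure; the sketch only illustrates picking, per direction, the nearest wall whose span covers the current position.

```python
def active_walls(pos, walls):
    """For each of the four directions, pick the nearest wall that could
    constrain movement from `pos`. Each wall is (axis, offset, lo, hi):
    axis 'x' means the wall constrains motion in x."""
    x, y = pos
    nearest = {}
    for axis, offset, lo, hi in walls:
        # The wall can only block us if we are within its span.
        across = y if axis == 'x' else x   # coordinate along the wall
        along = x if axis == 'x' else y    # coordinate the wall constrains
        if not (lo <= across <= hi):
            continue
        side = 'pos' if offset > along else 'neg'
        key = (axis, side)
        if key not in nearest or abs(offset - along) < abs(nearest[key] - along):
            nearest[key] = offset
    return nearest   # at most one wall per (axis, direction) pair
```

At most four entries come out, one per direction, matching the joystick's four concurrent wall effects.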
5.1 Modeling

The model of the user's location consists of two points: a virtual location and a real location. The real location is the same as the position of the input device, i.e. the position that the user wants to be at. The virtual location is where the user is in the virtual world, which is constrained by the structure to be visualized. The limited maximum force of the commercial joystick makes it possible for a normal person to place the joystick's handle in an arbitrary position regardless of the force being applied. When the virtual location and the real one are not identical, all the objects in the database are traversed in order to find out which one or ones, if any, are constraining the movement. An example is given in figure 2: the user is in the location marked by the dot, and the active walls, which will be sent to the joystick, are marked in grey. These are the instantaneous boundaries of the movement of the dot. Note that if the dot is moved into the corridor to the right, the upper and lower walls will be moved to represent the new constraints, even if the dot does not interfere with them.

Figure 2: Example of wall placement in a labyrinth.

5.2 Movement Constraints

In order to handle the situation when the user has collided with a wall, we had to expand the model. Because of the limited maximum force capability, we had to handle the case where the user deeply penetrates the wall. This is shown in figure 3.

Figure 3: Wall constraint (previous location, new location, desired location, constraining line).

The dot marks the real location in the previous time-step; the cross marks the current real location. The corresponding movement of the virtual location is not permitted, as it would involve crossing a line. The new virtual location is instead chosen according to a rubber-band principle: it is placed as if the handle and the dot were connected by a spring, with the virtual location gliding along the frictionless surface of the line. That is, we do not take the handle's position as the user's absolute location in the structure, but as the desired one; the user drags himself around the structure by an elastic band. In the algorithm, the new virtual location is taken as the projection of the desired location on the line.
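The projection step above is plain point-on-line projection; a minimal sketch (function name ours):

```python
def project_onto_line(p, a, b):
    """Project point p onto the infinite line through a and b.
    This is the rubber-band step: the result is the point on the line
    where the line's normal passes through the desired location p."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    # Parameter t of the foot of the perpendicular along the line a->b.
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    return (ax + t * dx, ay + t * dy)
```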
In the case of a straight line, this is the point where the normal of the line passes through the desired location. This approach is similar to the god-object method in [6].

One initial problem was that we modeled all objects as mathematical lines and points. This placed the new virtual location exactly on the line, which would then not hinder movement off the line in the next time-step; in this way the user's virtual location tunneled through walls. The solution is to give the virtual location a size, represented in figure 3 by the size of the dot. The new location is first placed on the line and then bumped to the correct side by inflating a virtual balloon around it. The correct side is determined by bumping the new location to both sides of the line, comparing the distance from each candidate to the previous virtual location, and choosing the one with the shortest distance.

This approach is different from one that models the softness in the FF, caused by the insufficient maximum force capability, as soft objects. If we were to do that, the virtual location would deform the line, and would be able to pass through an inside corner consisting of two lines by pushing them aside.

5.3 Collision Detection

Collision detection is performed at every time-step: each line object is asked whether it limits the movement of the virtual location to the desired location. The real location is taken as the desired location in the evaluation of the first object, which calculates whether it constrains the movement. If it does, it produces two outputs: a new valid desired location, taken as the projection of the old one onto the line, and the distance between the previous virtual location and the constraining line. This data is then passed to the next line object, which performs the same calculations but only changes the valid desired location if it constrains the movement and the distance it calculates is less than the previous smallest one.
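The anti-tunnelling "balloon" fix can be sketched as follows (function and parameter names are ours): offset the projected point by the balloon radius along the line's normal, trying both sides, and keep the candidate nearer the previous virtual location.

```python
import math

def bump_off_line(proj, prev, a, b, radius):
    """Push the projected location `proj` off the line through a and b,
    by `radius`, to whichever side is nearer the previous virtual
    location `prev` (illustrative sketch of the balloon fix)."""
    ax, ay = a
    bx, by = b
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)
    nx, ny = -dy / length, dx / length        # unit normal of the line
    candidates = [(proj[0] + s * nx * radius, proj[1] + s * ny * radius)
                  for s in (+1.0, -1.0)]      # bump to both sides
    # Choose the side with the shortest distance to the previous location.
    return min(candidates,
               key=lambda c: math.hypot(c[0] - prev[0], c[1] - prev[1]))
```

Because the result sits a finite distance off the line, the next time-step's movement from it back across the line is detected as a crossing, which is what prevents the tunnelling.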
After all of the line objects have done these calculations, the process is iterated until no line object changes the position. The new valid desired location is then taken as the new virtual location. This process is repeated at a rate of 20 Hz, which is the frequency at which new real locations are read from the joystick and new walls are sent to it. This approach does not use any bounding spheres or boxes as in [7] and [8].
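A compact sketch of one such time-step, specialised to axis-aligned wall lines (the `(axis, offset)` encoding and the function name are ours, not the paper's): each pass projects onto the nearest wall the move still crosses, and the passes repeat until nothing changes.

```python
def resolve_step(prev, desired, walls):
    """One 20 Hz collision-resolution step (illustrative sketch).
    Each wall is (axis, offset): an infinite line constraining that
    axis. A wall constrains the move when prev and the current valid
    location lie on opposite sides of it."""
    valid = desired
    while True:
        changed = False
        best = float('inf')
        for axis, offset in walls:
            i = 0 if axis == 'x' else 1
            crossed = (prev[i] - offset) * (valid[i] - offset) < 0
            if not crossed:
                continue
            dist = abs(prev[i] - offset)
            # Only accept the wall nearest the previous virtual location.
            if dist < best:
                best = dist
                new_valid = list(valid)
                new_valid[i] = offset          # project onto the wall line
                valid = tuple(new_valid)
                changed = True
        if not changed:
            return valid                       # the new virtual location
```

With two perpendicular walls, the second pass picks up the constraint the first pass exposed, so the point comes to rest in the inside corner rather than leaking through it.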
Figure 4: Handling of corners.

Figure 4 shows the special case of the endpoints of constraining lines. Here the straight movement of the virtual location is not legal, as it crosses the line. An intermediate virtual location is taken as the projection of the desired location on the line; as the line ends before it reaches this projection point, we take the endpoint as the intermediate location. The next iteration of the movement gives a valid movement, as a line from the intermediate location to the desired one does not cross the constraining line. In this way the virtual location snaps around the end of the line.

5.4 Slanted Lines

As the hardware only supports walls at angles of 0, 90, 180 and 270 degrees, we had to approximate slanted lines with two orthogonal ones.

Figure 5: Approximation of a slanted line. (a) Previous virtual location. (b) Intermediate location and desired location.

The dot in figure 5(a) is not constrained by the walls placed to simulate the slanted line. But when the dot is at the constraining line, the walls are placed to simulate the constraint. If the dot is moved along the line, it has to penetrate the wall a little; the new real location then gives a permitted movement of the virtual location, and the walls are moved. The effect of this is a texture on the surface. This texture is direction-dependent, in the same way as velvet: in figure 5(b), movement to the lower right is opposed by a continually moved vertical wall, movement to the upper left by a horizontal one.

5.5 Rigidity

We had to stabilize the system, as the user can easily induce an oscillation between the walls in a narrow corridor due to the limited maximum force of the walls. As the system had pre-implemented wall objects, we could not adjust their behavior according to known solutions [9]. Instead we added a viscosity to the whole world, simulating the user dragging his position around in a liquid by a rubber band. As the viscosity dampens the movement, the problem of user-induced oscillations is diminished.

6 Results

Low-fidelity consumer-grade equipment is useful for visualization tasks. When a high-level representation of objects is used instead of forces, the slow communication channel between the host computer and the device induces no degradation of performance.

The more complex shapes in our test application are hard to visualize from the tactile sense alone. This is because we work within a fixed area, so increased complexity of the structure translates into decreased feature size. These very small structures are blurred by the small force capability of the joystick, as the handle always penetrates them to a non-negligible degree before the user feels their contact force.

7 Conclusion

The prospects for using low-cost force feedback hardware for visualization tasks are good. The hardware has limitations in comparison with commercial research-grade equipment, mostly due to its limited maximum force capability; methods of compensating for this have been described. The biggest difference from the higher-grade systems is the reduced dimensionality of the device: 2D instead of 3D. But if the visualization task can be translated to the 2D domain, the limited devices are a valid option.

Acknowledgments

We want to thank the Microsoft Hardware Group for supplying us with hardware and software. We also want to thank Certec at Lund University, who supplied the computer resources necessary for the project.
References

[1] M. A. Srinivasan, C. Basdogan, Haptics in Virtual Environments: Taxonomy, Research Status, and Challenges, Computers & Graphics, Vol. 21, No. 4, 1997
[2] H. Z. Tan, B. Eberman, M. A. Srinivasan, B. Cheng, Human Factors for the Design of Force-reflecting Haptic Interfaces, Dynamic Systems and Control, Vol. 55-1, 1994
[3] M. Ouhyoung et al., A Low-Cost Force Feedback Joystick and its Use in PC Video Games, IEEE Trans. on Consumer Electronics, Vol. 41, No. 3, August 1995
[4] B. Bargen, P. Donnelly, Inside DirectX, Microsoft Press, 1998
[5] W. R. Mark, S. C. Randolph, M. Finch, J. M. Van Verth, R. M. Taylor II, Adding Force Feedback to Graphics Systems: Issues and Solutions, Computer Graphics Proceedings, Annual Conference Series, ACM SIGGRAPH, 1996
[6] C. B. Zilles, J. K. Salisbury, A Constraint-based God-object Method for Haptic Display, IEEE International Conference on Intelligent Robots and Systems, 1995
[7] D. C. Ruspini, K. Kolarov, O. Khatib, Robust Haptic Display of Graphical Environments, The First PHANToM Users Group Workshop, eds. J. K. Salisbury and M. A. Srinivasan, Dedham, MA, Sept. 1996
[8] D. K. Pai, L.-M. Reissel, Haptic Interaction with Multiresolution Image Curves, Computers & Graphics, Vol. 21, No. 4, 1997
[9] T. Massie, Taking the Mush Out of Haptics with Infinitely Stiff Walls, The First PHANToM Users Group Workshop, eds. J. K. Salisbury and M. A. Srinivasan, Dedham, MA, Sept. 1996
More informationArtificial Neural Network based Mobile Robot Navigation
Artificial Neural Network based Mobile Robot Navigation István Engedy Budapest University of Technology and Economics, Department of Measurement and Information Systems, Magyar tudósok körútja 2. H-1117,
More informationPerformance Issues in Collaborative Haptic Training
27 IEEE International Conference on Robotics and Automation Roma, Italy, 1-14 April 27 FrA4.4 Performance Issues in Collaborative Haptic Training Behzad Khademian and Keyvan Hashtrudi-Zaad Abstract This
More informationModeling and Experimental Studies of a Novel 6DOF Haptic Device
Proceedings of The Canadian Society for Mechanical Engineering Forum 2010 CSME FORUM 2010 June 7-9, 2010, Victoria, British Columbia, Canada Modeling and Experimental Studies of a Novel DOF Haptic Device
More informationAbstract. 2. Related Work. 1. Introduction Icon Design
The Hapticon Editor: A Tool in Support of Haptic Communication Research Mario J. Enriquez and Karon E. MacLean Department of Computer Science University of British Columbia enriquez@cs.ubc.ca, maclean@cs.ubc.ca
More informationMultimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms. I-Chun Alexandra Hou
Multimodal Virtual Environments: MAGIC Toolkit and Visual-Haptic Interaction Paradigms by I-Chun Alexandra Hou B.S., Mechanical Engineering (1995) Massachusetts Institute of Technology Submitted to the
More informationTEACHING HAPTIC RENDERING SONNY CHAN, STANFORD UNIVERSITY
TEACHING HAPTIC RENDERING SONNY CHAN, STANFORD UNIVERSITY MARCH 4, 2012 HAPTICS SYMPOSIUM Overview A brief introduction to CS 277 @ Stanford Core topics in haptic rendering Use of the CHAI3D framework
More informationProf. Emil M. Petriu 17 January 2005 CEG 4392 Computer Systems Design Project (Winter 2005)
Project title: Optical Path Tracking Mobile Robot with Object Picking Project number: 1 A mobile robot controlled by the Altera UP -2 board and/or the HC12 microprocessor will have to pick up and drop
More informationwith MultiMedia CD Randy H. Shih Jack Zecher SDC PUBLICATIONS Schroff Development Corporation
with MultiMedia CD Randy H. Shih Jack Zecher SDC PUBLICATIONS Schroff Development Corporation WWW.SCHROFF.COM Lesson 1 Geometric Construction Basics AutoCAD LT 2002 Tutorial 1-1 1-2 AutoCAD LT 2002 Tutorial
More informationShape Memory Alloy Actuator Controller Design for Tactile Displays
34th IEEE Conference on Decision and Control New Orleans, Dec. 3-5, 995 Shape Memory Alloy Actuator Controller Design for Tactile Displays Robert D. Howe, Dimitrios A. Kontarinis, and William J. Peine
More informationElastic Force Feedback with a New Multi-finger Haptic Device: The DigiHaptic
Elastic Force Feedback with a New Multi-finger Haptic Device: The DigiHaptic Géry Casiez 1, Patricia Plénacoste 1, Christophe Chaillou 1, and Betty Semail 2 1 Laboratoire d Informatique Fondamentale de
More informationBRAIN CONTROLLED CAR FOR DISABLED USING ARTIFICIAL INTELLIGENCE
BRAIN CONTROLLED CAR FOR DISABLED USING ARTIFICIAL INTELLIGENCE Presented by V.DIVYA SRI M.V.LAKSHMI III CSE III CSE EMAIL: vds555@gmail.com EMAIL: morampudi.lakshmi@gmail.com Phone No. 9949422146 Of SHRI
More informationRobust Haptic Teleoperation of a Mobile Manipulation Platform
Robust Haptic Teleoperation of a Mobile Manipulation Platform Jaeheung Park and Oussama Khatib Stanford AI Laboratory Stanford University http://robotics.stanford.edu Abstract. This paper presents a new
More informationClassifying 3D Input Devices
IMGD 5100: Immersive HCI Classifying 3D Input Devices Robert W. Lindeman Associate Professor Department of Computer Science Worcester Polytechnic Institute gogo@wpi.edu But First Who are you? Name Interests
More informationDigital inertial algorithm for recording track geometry on commercial shinkansen trains
Computers in Railways XI 683 Digital inertial algorithm for recording track geometry on commercial shinkansen trains M. Kobayashi, Y. Naganuma, M. Nakagawa & T. Okumura Technology Research and Development
More informationRobotic Vehicle Design
Robotic Vehicle Design Sensors, measurements and interfacing Jim Keller July 19, 2005 Sensor Design Types Topology in system Specifications/Considerations for Selection Placement Estimators Summary Sensor
More informationMotion of Robots in a Non Rectangular Workspace K Prasanna Lakshmi Asst. Prof. in Dept of Mechanical Engineering JNTU Hyderabad
International Journal of Engineering Inventions e-issn: 2278-7461, p-isbn: 2319-6491 Volume 2, Issue 3 (February 2013) PP: 35-40 Motion of Robots in a Non Rectangular Workspace K Prasanna Lakshmi Asst.
More informationIntroduction to ANSYS DesignModeler
Lecture 4 Planes and Sketches 14. 5 Release Introduction to ANSYS DesignModeler 2012 ANSYS, Inc. November 20, 2012 1 Release 14.5 Preprocessing Workflow Geometry Creation OR Geometry Import Geometry Operations
More informationAutoCAD LT 2009 Tutorial
AutoCAD LT 2009 Tutorial Randy H. Shih Oregon Institute of Technology SDC PUBLICATIONS Schroff Development Corporation www.schroff.com Better Textbooks. Lower Prices. AutoCAD LT 2009 Tutorial 1-1 Lesson
More informationAutoCAD LT 2012 Tutorial. Randy H. Shih Oregon Institute of Technology SDC PUBLICATIONS. Schroff Development Corporation
AutoCAD LT 2012 Tutorial Randy H. Shih Oregon Institute of Technology SDC PUBLICATIONS www.sdcpublications.com Schroff Development Corporation AutoCAD LT 2012 Tutorial 1-1 Lesson 1 Geometric Construction
More informationVisuo-Haptic Interface for Teleoperation of Mobile Robot Exploration Tasks
Visuo-Haptic Interface for Teleoperation of Mobile Robot Exploration Tasks Nikos C. Mitsou, Spyros V. Velanas and Costas S. Tzafestas Abstract With the spread of low-cost haptic devices, haptic interfaces
More informationAn Introduction To Modular Robots
An Introduction To Modular Robots Introduction Morphology and Classification Locomotion Applications Challenges 11/24/09 Sebastian Rockel Introduction Definition (Robot) A robot is an artificial, intelligent,
More informationMultirate Simulation for High Fidelity Haptic Interaction with Deformable Objects in Virtual Environments
Proceedings of the 2000 IEEE International Conference on Robotics & Automation San Francisco, CA April 2000 Multirate Simulation for High Fidelity Haptic Interaction with Deformable Objects in Virtual
More informationSDC. AutoCAD LT 2007 Tutorial. Randy H. Shih. Schroff Development Corporation Oregon Institute of Technology
AutoCAD LT 2007 Tutorial Randy H. Shih Oregon Institute of Technology SDC PUBLICATIONS Schroff Development Corporation www.schroff.com www.schroff-europe.com AutoCAD LT 2007 Tutorial 1-1 Lesson 1 Geometric
More informationVisual Perception Based Behaviors for a Small Autonomous Mobile Robot
Visual Perception Based Behaviors for a Small Autonomous Mobile Robot Scott Jantz and Keith L Doty Machine Intelligence Laboratory Mekatronix, Inc. Department of Electrical and Computer Engineering Gainesville,
More informationPeter Berkelman. ACHI/DigitalWorld
Magnetic Levitation Haptic Peter Berkelman ACHI/DigitalWorld February 25, 2013 Outline: Haptics - Force Feedback Sample devices: Phantoms, Novint Falcon, Force Dimension Inertia, friction, hysteresis/backlash
More informationFastener Modeling for Joining Parts Modeled by Shell and Solid Elements
2007-08 Fastener Modeling for Joining Parts Modeled by Shell and Solid Elements Aleander Rutman, Chris Boshers Spirit AeroSystems Larry Pearce, John Parady MSC.Software Corporation 2007 Americas Virtual
More information3D User Interfaces. Using the Kinect and Beyond. John Murray. John Murray
Using the Kinect and Beyond // Center for Games and Playable Media // http://games.soe.ucsc.edu John Murray John Murray Expressive Title Here (Arial) Intelligence Studio Introduction to Interfaces User
More informationVia Stitching. Contents
Via Stitching Contents Adding Stitching Vias to a Net Stitching Parameters Clearance from Same-net Objects and Edges Clearance from Other-net Objects Notes Via Style Related Videos Stitching Vias Via
More informationApplying Model Mediation Method to a Mobile Robot Bilateral Teleoperation System Experiencing Time Delays in Communication
Applying Model Mediation Method to a Mobile Robot Bilateral Teleoperation System Experiencing Time Delays in Communication B. Taner * M. İ. C. Dede E. Uzunoğlu İzmir Institute of Technology İzmir Institute
More informationaspexdraw aspextabs and Draw MST
aspexdraw aspextabs and Draw MST 2D Vector Drawing for Schools Quick Start Manual Copyright aspexsoftware 2005 All rights reserved. Neither the whole or part of the information contained in this manual
More informationGroup Robots Forming a Mechanical Structure - Development of slide motion mechanism and estimation of energy consumption of the structural formation -
Proceedings 2003 IEEE International Symposium on Computational Intelligence in Robotics and Automation July 16-20, 2003, Kobe, Japan Group Robots Forming a Mechanical Structure - Development of slide motion
More informationExpression of 2DOF Fingertip Traction with 1DOF Lateral Skin Stretch
Expression of 2DOF Fingertip Traction with 1DOF Lateral Skin Stretch Vibol Yem 1, Mai Shibahara 2, Katsunari Sato 2, Hiroyuki Kajimoto 1 1 The University of Electro-Communications, Tokyo, Japan 2 Nara
More informationHaptic Display of Multiple Scalar Fields on a Surface
Haptic Display of Multiple Scalar Fields on a Surface Adam Seeger, Amy Henderson, Gabriele L. Pelli, Mark Hollins, Russell M. Taylor II Departments of Computer Science and Psychology University of North
More informationME Week 2 Project 2 Flange Manifold Part
1 Project 2 - Flange Manifold Part 1.1 Instructions This project focuses on additional sketching methods and sketching commands. Revolve and Work features are also introduced. The part being modeled is
More informationHaptic Rendering and Volumetric Visualization with SenSitus
Haptic Rendering and Volumetric Visualization with SenSitus Stefan Birmanns, Ph.D. Department of Molecular Biology The Scripps Research Institute 10550 N. Torrey Pines Road, Mail TPC6 La Jolla, California,
More informationElements of Haptic Interfaces
Elements of Haptic Interfaces Katherine J. Kuchenbecker Department of Mechanical Engineering and Applied Mechanics University of Pennsylvania kuchenbe@seas.upenn.edu Course Notes for MEAM 625, University
More informationComparison of Haptic and Non-Speech Audio Feedback
Comparison of Haptic and Non-Speech Audio Feedback Cagatay Goncu 1 and Kim Marriott 1 Monash University, Mebourne, Australia, cagatay.goncu@monash.edu, kim.marriott@monash.edu Abstract. We report a usability
More informationAutoCAD Tutorial First Level. 2D Fundamentals. Randy H. Shih SDC. Better Textbooks. Lower Prices.
AutoCAD 2018 Tutorial First Level 2D Fundamentals Randy H. Shih SDC PUBLICATIONS Better Textbooks. Lower Prices. www.sdcpublications.com Powered by TCPDF (www.tcpdf.org) Visit the following websites to
More informationTA Instruments RSA-G2 Dynamic Mechanical Analyzer
The new RSA-G2 is the most advanced platform for mechanical analysis of solids from the world s leading supplier of DMA instrumentation. This new highperformance instrument represents the fourth generation
More information3D Modelling Is Not For WIMPs Part II: Stylus/Mouse Clicks
3D Modelling Is Not For WIMPs Part II: Stylus/Mouse Clicks David Gauldie 1, Mark Wright 2, Ann Marie Shillito 3 1,3 Edinburgh College of Art 79 Grassmarket, Edinburgh EH1 2HJ d.gauldie@eca.ac.uk, a.m.shillito@eca.ac.uk
More informationSystem Inputs, Physical Modeling, and Time & Frequency Domains
System Inputs, Physical Modeling, and Time & Frequency Domains There are three topics that require more discussion at this point of our study. They are: Classification of System Inputs, Physical Modeling,
More informationDistributed Vision System: A Perceptual Information Infrastructure for Robot Navigation
Distributed Vision System: A Perceptual Information Infrastructure for Robot Navigation Hiroshi Ishiguro Department of Information Science, Kyoto University Sakyo-ku, Kyoto 606-01, Japan E-mail: ishiguro@kuis.kyoto-u.ac.jp
More information