Haptic Sensing and Perception for Telerobotic Manipulation


Haptic Sensing and Perception for Telerobotic Manipulation. Emil M. Petriu, Dr. Eng., P.Eng., FIEEE. Professor, School of Information Technology and Engineering, University of Ottawa, Ottawa, ON K1N 6N5, Canada. http://www.site.uottawa.ca/~petriu

Telepresence for Remote Robotic Applications. Teleoperation has been developed as an extension of the human capability to intervene in remote or hazardous environments. Remotely operated robots are designed to work in environments that are unknown or imprecisely known. Beyond planetary applications, this type of robot control is used for work in the deep sea, in harsh operating environments, or in remote medical operations. Telepresence allows human teleoperators to experience the feeling that they are present in the remote environment.

[Figure: head-mounted display, haptic feedback, virtual model of the object manipulated in the physical world, video camera, robot arm, tactile sensors, manipulated object.] Video and haptic virtual reality interfaces allow a human operator to remotely control a robot manipulator equipped with a video camera and tactile sensors.

Human Haptic System [Burdea & Coiffet 2003]. G. Burdea and P. Coiffet, Virtual Reality Technology, 2nd ed., Wiley, New Jersey, 2003. Two-point limen test: 2.5 mm on the fingertip, 11 mm on the palm, 67 mm on the thigh (from [Burdea & Coiffet 2003]).

Robotic Haptic System. The robotic telemanipulation system has a bilateral architecture that connects the human operator and the telerobotic manipulator as transparently as possible. Using a head-mounted display for augmented visual virtual reality and a haptic feedback system, the human operator controls the operation of a remote robot manipulator equipped with a video camera and tactile sensors placed in the robot's hand. The tactile sensors provide the cutaneous information at the remote robotic operation site. The joint sensors of the robot arm provide the kinesthetic information. A tactile human-computer interface provides the cutaneous feedback, allowing the human operator to feel with his/her own sense of touch the same sensation as that acquired by the remote robot hand from its artificial tactile sensor. A robot-like kinematic structure provides the kinesthetic feedback for the haptic human-computer interface.

Multi Sensor Data Fusion. Sensory data gathered from vision, joint encoders, and tactile sensors are integrated in a unique framework which allows one to deal in a common way with all the properties of the manipulated object. These include the object's 3D geometry/shape, its surface-material properties, and the contact forces occurring during object handling by the robot. This framework also solves the sensor redundancy problems that occur when more sensors than actually required are used to measure a given object parameter. The multi-sensor data fusion system has a hierarchical architecture with an ascending sensory processing path in parallel with a descending task-decomposition control path, connected via world models at each level. The processing time at each level is reduced by the use of a priori knowledge from the world model, which provides predictions to the sensory system. The use of a world model promotes modularity because the specific information requirements of the sensory and control hierarchies are decoupled.

Multi Sensor Data Fusion (continued). The time clutch concept is used to disengage synchrony between operator specification time and telerobot operation time during path specifications. In order to avoid fatal errors and reduce the effect of the communication delay, we are using a distributed virtual environment that maintains a shared world model of the physical environment where the telemanipulation operation is conducted. Another critical requirement is the need to maintain synchronism between the visual and the haptic feedback. Although they are two distinct sensing modalities, both haptic perception and vision measure the same type of geometric parameters of the 3D objects that are manipulated.

[Block diagram: model-based telepresence control of a robot. A composite world model links, over local and remote connections, the teleoperator's video monitor, tactile monitor and joystick with the onboard computer's task planner, object recognition, trajectory parameter estimation, geometric world model, position model, trajectory planner, frame transforms, and robot model; the robot side comprises joint/wheel servos, wheel/steer encoders, I.R. range sensors, vision, and the tactile sensor, interacting with the environment. Object identities and poses, task and trajectory constraints, path and position specifications, and robot position flow between the modules.]

Haptic Control for Object Manipulation. Human tactile perception is a complex act with two distinct modes. First, passive sensing, which is produced by the "cutaneous" sensory network, provides information about contact force, contact geometric profile and temperature. Second, active sensing integrates the cutaneous sensory information with "kinesthetic" sensory information (the limb/joint positions and velocities). Tactile sensing is the result of a deliberate exploratory perception act. Since the cutaneous information provided by the tactile probe is local, it is necessary to add a "carrier" (kinesthetic capability) to allow the probe to move around on the explored object surface. The local cutaneous information provided by the different touch frames is integrated with the kinesthetic position parameters of the carrier resulting in a "global view" (geometric model) of the explored object.

The robotic manipulator consists of a 5-axis commercial robot, an instrumented passive-compliant wrist, and a 16-by-16 tactile probe. Position sensors placed in the robot's joints and on the instrumented passive-compliant wrist provide the kinesthetic information. The compliant wrist allows the robot hand equipped with tactile sensors to accommodate the constraints of the explored object surface and thus to increase the amount of cutaneous tactile information.

Instrumented passive-compliant wrist for the tactile exploration of objects.

The tactile probe is based on a 16-by-16 matrix of Force Sensing Resistor (FSR) elements spaced 1.58 mm apart over a 6.5 cm² (1 square inch) area. The FSR elements exhibit a steeply decreasing electrical resistance with applied normal force: the resistance changes by two orders of magnitude over a pressure range of 1 N/cm² to 100 N/cm².
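The quoted behavior (resistance falling two decades while pressure rises two decades, from 1 to 100 N/cm²) corresponds on a log-log plot to a power law with exponent about -1. A minimal sketch of such a transducer model, using an assumed base resistance R0 rather than a datasheet value:

```python
# Hedged power-law model of one FSR taxel: two decades of resistance drop
# over the 1..100 N/cm^2 pressure range implies an exponent of about -1.
# R0 is an assumed illustrative value, not a datasheet figure.
R0 = 100e3      # ohms at the reference pressure P0 (assumed)
P0 = 1.0        # N/cm^2

def fsr_resistance(pressure_n_cm2, exponent=-1.0):
    """Resistance of one FSR taxel under normal pressure (power-law model)."""
    return R0 * (pressure_n_cm2 / P0) ** exponent

def fsr_pressure(resistance_ohm, exponent=-1.0):
    """Invert the model: recover pressure from a measured resistance."""
    return P0 * (resistance_ohm / R0) ** (1.0 / exponent)
```

With these assumed constants, the taxel reads 100 kΩ at 1 N/cm² and 1 kΩ at 100 N/cm², matching the two-orders-of-magnitude span stated above.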

[Figure: force-sensitive transducer, external force, 3D object, x-y-z axes, 2D-sampling elastic overlay.] Tab-shaped elastic overlay. Protruding round tabs allow the material to expand without stress in the x and y directions and to compress in the z direction, reducing the elastic overlay's blurring effect.

The elastic overlay has a protective damping effect against impulsive contact forces, and its elasticity resets the transducer system when the probe ceases to touch the object. However, the overlay may cause considerable blurring distortion in the sensing process if it is not properly controlled. We avoided this by replacing the one-piece pad with a custom-designed elastic overlay consisting of a relatively thin membrane with protruding round tabs. This construction allows the material to expand without any stress in the x and y directions, making possible its compression in the z direction proportionally to the normal stress component. The tabs are arranged in a 16-by-16 array, with a tab on top of each node of the FSR matrix. This tab configuration provides a de facto spatial sampling, which reduces the elastic overlay's blurring effect on the high 2D sampling resolution of the FSR transducer. Experimental results illustrating the positive effect of the tab-shaped overlay: a 16-by-16 median-filtered tactile image of a washer.
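The median filtering mentioned for the washer image can be sketched in a few lines; this is a generic 3-by-3 median filter on a tactile image, clipped at the borders, not the original processing code:

```python
from statistics import median

def median_filter(img):
    """3x3 median filter (neighborhood clipped at the borders), as commonly
    used to clean isolated noise taxels out of a raw tactile image."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            neigh = [img[i + di][j + dj]
                     for di in (-1, 0, 1) for dj in (-1, 0, 1)
                     if 0 <= i + di < h and 0 <= j + dj < w]
            out[i][j] = median(neigh)
    return out

# A single spurious "on" taxel in an otherwise empty patch is removed:
patch = [[0, 0, 0, 0],
         [0, 1, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
clean = median_filter(patch)
```

The same call applied to the full 16-by-16 FSR frame suppresses isolated outliers while preserving the contiguous contact region of the washer.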

Human-Computer Interfaces. Human-oriented sensory interfaces allow the teleoperator to experience virtual reality feelings of a visual, force, and tactile nature. These interfaces should have easily perceivable and task-related sensor information displays (monitors), in such a way as to enhance the teleoperator's control capabilities.

The Human Haptic Perception. Haptic perception is a complex exploratory act integrating two distinct modes: (i) the cutaneous tactile sense provides information about contact force, contact geometric profile, and the temperature of the touched object; (ii) the kinesthetic sensory system provides information about the positions and velocities of the kinematic structure (e.g. the hand) carrying the tactile sensor. There are various cutaneous mechanoreceptors located in the outer layers of the skin. These receptors are specialized nervous cells, or neurons. The free nerve endings are the most numerous and play an active role in the perception of pain, cold, and warmth. The mechanoreceptors have preferential frequency-response characteristics: the highest sensitivity of the Pacinian Corpuscle (PC) units is around 250-300 Hz, but they respond from 30 Hz up to very high frequencies. The Rapidly Adapting (RA) units' effective frequency range is between 10 and 200 Hz, with more sensitivity below 100 Hz. The Slowly Adapting (SA) units respond at low frequencies, under 40-50 Hz. The human kinesthetic function has a much lower frequency band.
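The frequency bands above can be collected into a small lookup. The band edges follow the text (with 45 Hz standing in for the "under 40-50 Hz" SA limit, and 1 kHz as a placeholder upper PC bound); real receptor responses taper off rather than cut off at hard thresholds:

```python
# Hedged sketch: which mechanoreceptor units respond to a given vibration
# frequency, using the band edges quoted in the text.  The 45 Hz SA limit
# and the 1 kHz PC ceiling are assumed placeholders, not physiology.
BANDS = {
    "SA": (0.0, 45.0),     # slowly adapting: low frequencies
    "RA": (10.0, 200.0),   # rapidly adapting: most sensitive below 100 Hz
    "PC": (30.0, 1000.0),  # Pacinian: peak sensitivity around 250-300 Hz
}

def responding_units(freq_hz):
    """Names of the receptor units whose band contains freq_hz."""
    return sorted(name for name, (lo, hi) in BANDS.items() if lo <= freq_hz <= hi)
```

Such a table is one way a vibrotactile display driver could pick stimulation frequencies that target, say, the PC units (around 250 Hz) rather than the SA units.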

Immersion 3D Interaction <http://www.immersion.com/> CyberGlove. Uses 18-22 linear sensors (electrical strain gauges); angles are obtained by measuring voltages on a Wheatstone bridge; 112 gestures/sec, filtered. Sensor resolution is 0.5 degrees, but errors accumulate toward the fingertip (open kinematic chain); sensor repeatability is 1 degree; needs calibration when put on the hand.
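As a rough sketch of how such a bend sensor could be read out (illustrative only; the gauge factor, excitation voltage, and calibration constants are assumed, not CyberGlove specifications): a strain gauge in a quarter Wheatstone bridge yields a small voltage roughly proportional to strain, and a per-wearer linear calibration maps strain to joint angle, which is why calibration is needed each time the glove is put on.

```python
# Hedged quarter-bridge readout sketch.  GF (gauge factor), V_EX, and the
# calibration constants below are illustrative assumptions.
GF = 2.0        # typical metal-foil gauge factor (assumed)
V_EX = 5.0      # bridge excitation voltage in volts (assumed)

def bridge_strain(v_out):
    """Invert the small-signal quarter-bridge relation v_out ~ V_EX*GF*eps/4."""
    return 4.0 * v_out / (V_EX * GF)

def joint_angle_deg(v_out, gain_deg_per_strain, offset_deg=0.0):
    """Per-wearer linear calibration from measured bridge voltage to degrees."""
    return gain_deg_per_strain * bridge_strain(v_out) + offset_deg
```

The per-wearer pair (gain, offset) is what a calibration routine would fit after the glove is donned; along an open kinematic chain, the residual errors of each joint's fit accumulate toward the fingertip, as noted above.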

Immersion 3D Interaction <http://www.immersion.com/>: CyberGrasp, CyberGrasp Pack, CyberForce, CyberTouch.

Commercial Virtual Hand Toolkit for CyberGlove/Grasp providing the kinesthetic human-computer interface

Performance comparison of various sensing gloves (from [Burdea & Coiffet 2003]).

Haptic Feedback to the Human Operator. [Figure: robot hand with tactile sensors (TS) for tactile image acquisition; human hand with tactile monitors (TM) for tactile sensation reconstruction.] A tactile monitor placed on the operator's palm should allow the human teleoperator to virtually feel by touch the object profile measured by the tactile sensors placed in the jaws of the robot gripper.

Tactile Display for Computer-Human Interface. Cutaneous tactile monitor developed at the University of Ottawa in the early 1990s. It consists of an 8-by-8 array of electromagnetic vibrotactile stimulators. The active area is 6.5 cm² (the same as the tactile sensor's).

Pseudo-Random Multi-Valued Sequences (PRMVS). A more compact absolute position encoding can be obtained by using Pseudo-Random Multi-Valued Sequences (PRMVS), whose elements are taken from an alphabet with more than two symbols. As an example, a two-stage shift register (n = 2), with the feedback defined by the primitive polynomial h(x) = x^2 + x + A over GF(4) = {0, 1, A, A^2}, with A^2 + A + 1 = 0 and A^3 = 1, generates the 15-term PRMVS {0, 1, 1, A^2, 1, 0, A, A, 1, A, 0, A^2, A^2, A, A^2}. Any 2-tuple seen through a 2-position window sliding over this sequence is unique. [Figure: the PRMVS laid out on a 15-position encoded track (p = 0 … 14), with origin and pointer, P = p·q.]
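The 15-term sequence above can be reproduced with a short GF(4) shift-register sketch. Elements are encoded as the integers 0, 1, 2 (= A), 3 (= A^2); in this polynomial-basis encoding, field addition is bitwise XOR:

```python
# Minimal GF(4) PRMVS generator for h(x) = x^2 + x + A, i.e. the
# recurrence s[k+2] = s[k+1] + A*s[k] (characteristic 2, so + is XOR).
def gf4_mul(x, y):
    """Multiply two GF(4) elements via discrete logs base A."""
    if x == 0 or y == 0:
        return 0
    log = {1: 0, 2: 1, 3: 2}     # logs of 1, A, A^2
    exp = [1, 2, 3]              # A^0, A^1, A^2
    return exp[(log[x] + log[y]) % 3]

def prmvs_gf4(length=15):
    """Generate the 15-term PRMVS starting from register contents (0, 1)."""
    A = 2
    seq = [0, 1]
    while len(seq) < length:
        seq.append(seq[-1] ^ gf4_mul(A, seq[-2]))
    return seq

prmvs_seq = prmvs_gf4()   # encodes {0, 1, 1, A^2, 1, 0, A, A, 1, A, 0, A^2, A^2, A, A^2}
```

Collecting the 15 cyclic 2-tuples of the result confirms the window property: every pair of adjacent symbols occurs exactly once per period.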

Each stimulator corresponds to a 2-by-2 window in the tactile sensor array. The vibrotactile stimulators are used as binary devices that are activated when at least two of the corresponding taxels (tactile elements) in the tactile sensor array window are "on". The figure shows tactile feedback of a curved edge.
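The activation rule described above can be sketched as a threshold downsampling; a 4-by-4 toy image stands in for the 16-by-16 sensor array:

```python
# Each stimulator covers a 2x2 window of the tactile array and switches on
# when at least two taxels in its window are "on" (the rule from the text).
def sensor_to_monitor(taxels, threshold=2):
    """Downsample a 2N x 2N binary tactile image to an N x N stimulator image."""
    n = len(taxels) // 2
    return [[int(sum(taxels[2 * i + di][2 * j + dj]
                     for di in (0, 1) for dj in (0, 1)) >= threshold)
             for j in range(n)]
            for i in range(n)]

taxels = [[1, 1, 0, 0],
          [0, 0, 0, 0],
          [1, 0, 0, 1],
          [1, 0, 1, 1]]
stimulators = sensor_to_monitor(taxels)   # [[1, 0], [1, 1]]
```

Applied to the real 16-by-16 sensor frame, this yields the 8-by-8 binary pattern that drives the vibrotactile monitor on the operator's palm.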

Robotic Tactile Recognition of Pseudo-Random Encoded Objects. Pseudo-Random Binary Encoding: a practical solution allowing absolute position recovery with any desired n-bit resolution while employing only one binary track, regardless of the value of n. The direct feedback has the form R(0) = R(n) ⊕ c(n-1)R(n-1) ⊕ … ⊕ c(1)R(1), and the reverse feedback has the form R(n+1) = R(1) ⊕ b(2)R(2) ⊕ … ⊕ b(n)R(n).

Table 1. Feedback equations for PRBS generation (shift register length n; direct feedback; reverse feedback):
n = 4: R(0) = R(4) ⊕ R(1); R(5) = R(1) ⊕ R(2)
n = 5: R(0) = R(5) ⊕ R(2); R(6) = R(1) ⊕ R(3)
n = 6: R(0) = R(6) ⊕ R(1); R(7) = R(1) ⊕ R(2)
n = 7: R(0) = R(7) ⊕ R(3); R(8) = R(1) ⊕ R(4)
n = 8: R(0) = R(8) ⊕ R(4) ⊕ R(3) ⊕ R(2); R(9) = R(1) ⊕ R(3) ⊕ R(4) ⊕ R(5)
n = 9: R(0) = R(9) ⊕ R(4); R(10) = R(1) ⊕ R(5)
n = 10: R(0) = R(10) ⊕ R(3); R(11) = R(1) ⊕ R(4)

PRBS = 0 0 0 0 1 0 1 0 1 1 1 0 1 1 0 0 0 1 1 1 1 1 0 0 1 1 0 1 0 0 1 (positions p = 0 … 30). A (2^n − 1)-term Pseudo-Random Binary Sequence (PRBS) generated by an n-bit modulo-2 feedback shift register is used as a one-bit-per-quantization-step absolute code. Absolute position identification is based on the PRBS window property: any n-tuple seen through an n-bit window sliding over the PRBS is unique, and hence it fully identifies each position of the window.

The figure shows, as an example, a 31-term PRBS, 0 0 0 0 1 0 1 0 1 1 1 0 1 1 0 0 0 1 1 1 1 1 0 0 1 1 0 1 0 0 1, generated by a 5-bit shift register. The 5-bit n-tuples seen through a window sliding over this PRBS are unique and represent a 1-bit-wide absolute position code. [Figure: serial-parallel code conversion, through the milestone registers Q(0) … Q(3), of the absolute position p = 18 on a 31-position PRBS-encoded track with four milestones.]
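The 31-term example can be reproduced in a few lines: the feedback R(0) = R(5) ⊕ R(2) from Table 1 corresponds to the recurrence s[t] = s[t-5] xor s[t-2]:

```python
# Sketch of the slide's 31-term PRBS: a 5-bit shift register with feedback
# R(0) = R(5) xor R(2), i.e. the recurrence s[t] = s[t-5] ^ s[t-2].
def prbs(n, taps, seed):
    """Generate the (2^n - 1)-term PRBS of an n-bit modulo-2 LFSR."""
    a, b = taps
    s = list(seed)
    while len(s) < (1 << n) - 1:
        s.append(s[-a] ^ s[-b])   # XOR of the two tapped delays
    return s

prbs_seq = prbs(5, (5, 2), [0, 0, 0, 0, 1])

# Window property: all 31 cyclic 5-bit windows are distinct, so reading a
# 5-bit window anywhere on the track gives the absolute position.
windows = {tuple(prbs_seq[(p + k) % 31] for k in range(5)) for p in range(31)}
```

The same generator with the other (n, taps) rows of Table 1 produces the longer codes used for finer position resolution.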

Primitive polynomials generating PRMVS of order n over GF(q):
n = 2: q = 3: x^2 + x + 2; q = 4: x^2 + x + A; q = 8: x^2 + Ax + A; q = 9: x^2 + x + A
n = 3: q = 3: x^3 + 2x + 1; q = 4: x^3 + x^2 + x + A; q = 8: x^3 + x + A; q = 9: x^3 + x + A
n = 4: q = 3: x^4 + x + 2; q = 4: x^4 + x^2 + Ax + A^2; q = 8: x^4 + x + A^3; q = 9: x^4 + x + A^5
n = 5: q = 3: x^5 + 2x + 1; q = 4: x^5 + x + A; q = 8: x^5 + x^2 + x + A^3; q = 9: x^5 + x^2 + A
n = 6: q = 3: x^6 + x + 2; q = 4: x^6 + x^2 + x + A; q = 8: x^6 + x + A; q = 9: x^6 + x^2 + Ax + A
n = 7: q = 3: x^7 + x^6 + x^4 + 1; q = 4: x^7 + x^2 + Ax + A^2; q = 8: x^7 + x^2 + Ax + A^3; q = 9: x^7 + x + A
n = 8: q = 3: x^8 + x^5 + 2; q = 4: x^8 + x^3 + x + A
n = 9: q = 3: x^9 + x^7 + x^5 + 1; q = 4: x^9 + x^2 + x + A
n = 10: q = 3: x^10 + x^9 + x^7 + 2; q = 4: x^10 + x^3 + A(x^2 + x + 1)
The following relations apply. For GF(4) = GF(2^2): A^2 + A + 1 = 0, A^2 = A + 1, and A^3 = 1. For GF(8) = GF(2^3): A^3 + A + 1 = 0, A^3 = A + 1, A^4 = A^2 + A, A^5 = A^2 + A + 1, A^6 = A^2 + 1, and A^7 = 1. For GF(9) = GF(3^2): A^2 + 2A + 2 = 0, A^2 = A + 1, A^3 = 2A + 1, A^4 = 2, A^5 = 2A, A^6 = 2A + 2, A^7 = A + 2, and A^8 = 1. According to the PRMVS window property, any q-valued content observed through an n-position window sliding over the PRMVS is unique and fully identifies the current position of the window.

[Figure: a 15-by-17 array of GF(4) symbols.] A 15-by-17 PRA obtained by folding a 255-element PRS defined over GF(4), with q = 4, n = 4, k1 = 2, k2 = 2, n1 = q^k1 − 1 = 15, and n2 = (q^n − 1)/n1 = 17.
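The folding that produces the 15-by-17 PRA can be sketched with the diagonal rule p → (p mod 15, p mod 17); since gcd(15, 17) = 1, the Chinese remainder theorem guarantees that each of the 15 × 17 = 255 cells is visited exactly once. The stand-in sequence 0 … 254 below only demonstrates the indexing; a real PRA folds the 255-term GF(4) PRS:

```python
# Hedged sketch of diagonal (CRT) folding: element p of a 255-term sequence
# goes to array cell (p mod 15, p mod 17).  Because gcd(15, 17) = 1, the
# mapping is a bijection onto the 15x17 array; the PRA window property then
# follows from the window property of the folded pseudo-random sequence.
N1, N2 = 15, 17

def fold_to_pra(seq255):
    array = [[None] * N2 for _ in range(N1)]
    for p, symbol in enumerate(seq255):
        array[p % N1][p % N2] = symbol
    return array

pra = fold_to_pra(list(range(255)))   # stand-in sequence, one symbol per cell
```

Reading the PRA through a 2-by-2 window then identifies the window's absolute (row, column) position on the encoded surface, just as the 1D window does on a track.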

[Figure: vertex-labeled reference frames O(k) and O(r), with vertices V(k,1) … V(k,4) and V(r,1) … V(r,6), mapped onto an M-by-N grid of code cells.] PRA code elements are Braille-like embossed on object surfaces. 3D object models are unfolded and mapped onto the encoding pseudo-random array.

PRA code elements are Braille-like embossed on the object surface. The shapes of the embossed symbols are specially designed for easy tactile recognition. For efficient pattern recognition, the particular shapes of the binary symbols were selected to meet the following conditions: (i) there is enough information at the symbol level to provide an immediate indication of the grid orientation; (ii) the symbol recognition procedure is invariant to position and orientation; (iii) the symbols have a certain peculiarity, so that other objects in the scene will not be mistaken for encoding symbols. The binary symbols used to mark "0" and "1" are recognized on the basis of their number of end-points and vertices: the symbol on the left, representing "0", has 2 end-points and 1 vertex-point, and the symbol on the right, representing "1", has 3 end-points and 2 vertex-points.
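The end-point/vertex counting rule reduces to a lookup. A minimal sketch (the counts come from the text; the function name and the None-for-unknown convention are illustrative):

```python
# The slide's recognition rule as a table: 2 end-points and 1 vertex-point
# encode "0"; 3 end-points and 2 vertex-points encode "1".  Any other count
# pair is rejected as not being a code symbol (condition (iii) above).
def classify_symbol(end_points, vertex_points):
    table = {(2, 1): "0", (3, 2): "1"}
    return table.get((end_points, vertex_points))   # None -> not a code symbol
```

The rejection path matters: features extracted from ordinary scene objects rarely produce exactly these count pairs, which is what keeps them from being mistaken for encoding symbols.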

0, 1, A, A^2: the shapes of the four code symbols for a PRA over GF(4), embossed on the object's surface.

The 15-by-17 PRA defined over GF(4), embossed using the four code symbols.

[Figure: vertices C1 … C8 and P1 … P8.] The vertex-labeled models of two simple 3D objects.

[Figure: unfolded faces labeled by their vertices C1 … C8 and P1 … P8.] Mapping the embossed PRA onto the surfaces of the two 3D objects.

Tactile images of the four GF(4) encoding symbols

Composite tactile image of four symbols on an encoded object surface

[Figure: recovered symbols located among vertices C1 … C8.] The four tactilely recovered symbols are recognized (using a neural network) and, using the PRA window property, their location is unequivocally identified on the face of one of the 3D objects.

Current Research Objectives. Object recognition by integrating haptic and visual sensory data into a composite haptic & vision model of 3D object shape, surface and material properties (e.g. texture, elasticity, heat-transfer characteristics), and contact forces and slippage. Development of a haptic & vision model-based coding for distributed interactive virtual reality applications. Using a unique composite haptic & vision model of the manipulated objects could be more efficient than using separate vision and haptic data streams. Such an integral approach makes sense, as the role of haptics is to enhance vision: both haptic perception and vision measure the same type of geometric parameters of the 3D objects that are manipulated. Study of the human-computer interaction aspects of haptic & visual perception for interactive distributed virtual environments. The aim is the development of a more efficient haptic feedback enhancing the visual feedback to human operators during virtual object manipulation.