Towards Learning to Identify Zippers


HCI 585X Sahai - 0

Contents

Introduction
    Motivation
    Need/Target Audience
Related Research
Proposed Approach
    Equipment
        Robot
        Fingernail
        Articles with zippers
    Robot Behaviors
    Setup
Methodology
Evaluation
Research Team

Introduction

Motivation

"Artificial intelligence systems can only be as smart as the environment that they are allowed to explore." [Robert Sutton]

The future of developmental robotics requires robots to act autonomously. Programming robots to perform even the smallest tasks has proven difficult at best. One simple task that could be useful in many applications is teaching a robot how to open a zipper. Potential applications include opening dangerous suitcases, working with vacuum-sealed zippers in space, packing our lunch in zip-lock bags, helping us put on our winter coats, or tucking us into our sleeping bags at night. Similar teaching methods could also help a robot identify objects other than zippers and, in general, learn more about its environment.

One of the methods of zipper detection presented in this proposal uses tactile sensing. Similar to the way a human can feel different surfaces, the robot will feel for a characteristic vibration. Second, a microphone on the finger serves as an ear, listening to the noise the finger makes as it rubs over the zipper. Finally, a camera mounted on the robot's head will watch where the finger is and what is happening when the vibrations occur and when the sound arrives at the microphone. Combining these three senses will allow the robot to detect where a zipper is.

The benefit of using more than one or two senses is that the robot can detect zippers under a wider variety of conditions. For example, if the lights are off, the camera is of no use to the robot, but it can still listen and feel around to find the zipper. In some settings the sound will be diminished or absent and the microphone will not pick up anything; the same applies when the environment does not allow tactile exploration with the finger.

Need/Target Audience

There are several different approaches to solving the problem of identifying zippers. The simplest is to hard-code the way a zipper feels, looks, and sounds.
However, this is also the most brittle approach, because it is highly susceptible to the specific environment in which the experiment is programmed. Consider the variations in the environment of a zipper: some zippers are partially hidden underneath another piece of fabric; some are much larger or smaller than others; zippers are made of different materials with different roughness. This is why it is best to develop a way for the robot to identify the functional parts of the zipper itself. The robot will be given a variety of clothes and possibly bags, along with a feedback system that tells it when it has correctly identified the location of a zipper. It will then develop a multi-modal model that allows it to interact accurately with new kinds of zippers.

The target audience for this project is any organization that needs a flexible means of teaching a robot to learn and develop multiple modalities by interacting with its environment. This ability is necessary for a robot to perform any complex task at the same level as humans. According to Alan Turing, a computer would deserve to be called intelligent if it could deceive a human into believing that it was human. In my opinion, this project would be a step on the path to true artificial intelligence, since it explores the ways in which robots can learn from their environment.
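The multi-sensory robustness argument from the Motivation can be sketched as a simple fusion rule that skips whichever modalities are currently unavailable. Everything here is an illustrative assumption: the modality names, the score-averaging rule, and the threshold are mine, not a committed part of the proposal.

```python
def fuse_detections(scores):
    """Fuse per-modality zipper-likelihood scores in [0, 1].

    Modalities that are currently unavailable (e.g., vision in the dark)
    report None and are simply skipped, so detection degrades gracefully.
    Averaging is an illustrative choice, not the proposal's fusion rule.
    """
    valid = [s for s in scores.values() if s is not None]
    if not valid:
        raise ValueError("no modality available")
    return sum(valid) / len(valid)

def is_zipper(scores, threshold=0.5):
    """Declare a zipper when the fused score clears the threshold."""
    return fuse_detections(scores) >= threshold

# Example: lights off, so vision is unavailable, but audio and touch agree.
print(is_zipper({"vision": None, "audio": 0.8, "touch": 0.7}))  # prints True
```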

Related Research

Findings in psychology have shown that the tactile sensory modality is necessary to capture many object properties (e.g., roughness, texture, etc.). More specifically, psychologists and neuroscientists have demonstrated that certain receptors in the skin are capable of detecting minute vibrations as the finger slides across a surface, thus enabling discrimination between fine textures. According to Lederman and Klatzky, tactile object exploration is facilitated by exploratory procedures. For example, to detect the roughness of a surface, a person might slide a finger across it; to detect its temperature, a person might touch it; and so forth. Studies have also shown that tactile exploratory behaviors are commonly used by infants when exploring a novel object. For example, Stack et al. have reported that, in the absence of visual cues, 7-month-old infants use more efficient tactile exploratory strategies and can perform tactile surface recognition to some extent.

The importance of the sense of touch for biological organisms has led to an increased interest in tactile sensors and their applications in robotics. For example, the goal of the ROBOSKIN project, which was recently funded by the European Commission, is to develop novel touch sensors for an artificial skin that can cover large patches of a robot's body. The robotic skin is designed with flexible and modular components that can be easily reconfigured to the body morphology of a new robot. An early prototype of the skin has already been installed on the iCub robot. Another goal of the project is to use the skin sensor during social learning tasks, in which a human provides corrective feedback by touching the robot's hand to indicate a desired movement direction.

Other research has focused on developing tactile sensing technologies for robotic fingers. For example, Howe et al. have developed a robotic finger with an artificial rubber skin, equipped with a piezoelectric polymer transducer that measures the changes in pressure induced as the sensor slides over a surface. It was shown that minute surface features (as small as 6.5 μm) could be detected by sliding the sensor across them. Computer vision methods for surface perception have also been explored. Tanaka et al. developed an artificial finger that uses strain gauges and PVDF foil to generate tactile feedback when sliding across a surface. In subsequent experiments, they demonstrated how their sensor can detect roughness and temperature changes in the textures of six different fabrics. A similar sensor was developed by Hosoda et al.; by applying two different exploratory behaviors, pushing and rubbing, their robot was able to distinguish between five different materials. A robotic finger with randomly distributed strain gauges and PVDF films was also proposed by Jamali and Sammut. In their experiments, a Naive Bayes classifier coupled with the Fourier coefficients of the sensor's output was used to recognize eight different surface textures.
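A Jamali-and-Sammut-style pipeline, Fourier coefficients of the vibration signal fed to a Naive Bayes classifier, can be sketched roughly as follows. The feature count, the synthetic "textures", and the minimal Gaussian classifier are all illustrative assumptions for exposition, not their actual implementation.

```python
import numpy as np

def fft_features(signal, n_coeffs=8):
    """Magnitudes of the first n_coeffs Fourier coefficients (DC excluded)."""
    return np.abs(np.fft.rfft(signal))[1:n_coeffs + 1]

class GaussianNB:
    """Minimal Gaussian Naive Bayes: independent per-feature Gaussians per class."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.mu_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        self.var_ = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes_])
        return self

    def predict(self, X):
        # Log-likelihood of each sample under each class's Gaussian model.
        ll = -0.5 * (np.log(2 * np.pi * self.var_[:, None])
                     + (X[None] - self.mu_[:, None]) ** 2 / self.var_[:, None]).sum(axis=2)
        return self.classes_[ll.argmax(axis=0)]

# Two synthetic "textures": vibration signatures with different dominant frequencies.
rng = np.random.default_rng(0)
t = np.arange(256)
def texture(freq):
    return np.sin(2 * np.pi * freq * t / 256) + 0.1 * rng.standard_normal(256)

X = np.array([fft_features(texture(f)) for f in [3] * 20 + [7] * 20])
y = np.array([0] * 20 + [1] * 20)
clf = GaussianNB().fit(X, y)
print((clf.predict(X) == y).mean())  # prints 1.0: the spectral peaks separate cleanly
```

With real accelerometer data the features would come from windows of the vibration stream rather than whole synthetic signals, but the classify-on-spectrum idea is the same.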

Three-axis force sensors have also been used for tactile perception. For example, Beccai et al. used a 3-axis MEMS sensor to perform slip detection. A similar sensor was used by de Boissieu et al. to capture the high-frequency vibrations that occur when rubbing a surface. In that study, the force sensor was mounted on a plotter printer and was able to distinguish between 10 different paper surfaces with reasonable accuracy (approximately 61%). In another line of research, inexpensive accelerometers have been proposed by Romano et al. for the purpose of recording and reproducing tactile sensations. Howe et al. have also developed a sensor that can detect tactile vibrations using a 3-axis accelerometer, providing feedback that was useful for detecting whether an object has moved after being grasped (i.e., slip detection). They estimated that the sensor's output was most dependent on the sliding speed, somewhat dependent on the surface roughness, and least dependent on the applied normal force. Handelman et al. developed an unmanned ground vehicle with arms, legs, and wheels; using this vehicle and two 6-DOF spaceball controllers, an operator can unzip and inspect the contents of a knapsack. This study was done in order to assist humans, decrease workload, and increase robot agility.

The sensor that will be used for this project is very similar to a 3-axis accelerometer previously introduced by Sukhoy et al. to capture vibrotactile feedback. In contrast to previous work, the humanoid robot described here will perform behaviors to identify zippers attached to bags and clothing.

Proposed Approach

Equipment

Robot

All experiments will be performed with an upper-torso humanoid robot. Two 7-DOF Barrett Whole Arm Manipulators (WAMs) are used for the robot's arms. Each WAM has a three-finger Barrett Hand (BH8-262) as its end effector. The WAMs are mounted in a configuration similar to that of human arms. The arms are controlled in real time from a Linux PC at 500 Hz over a CAN bus interface. The robot is also equipped with two cameras (Logitech QuickCams) that capture 640x480 color images at 30 fps.

Fingernail

The robot's fingernail consists of a printed circuit board (PCB) with an ADXL345 3-axis digital accelerometer, a CMA-6542PF electret microphone, an AD7416 temperature sensor, and a PIC18F2550 microcontroller. The accelerometer has a bandwidth of 1600 Hz at a sample rate of 3200 Hz. To prevent aliasing, the accelerometer has an on-board digital low-pass filter. The fingernail was designed as part of a senior design group project and manufactured at Advanced Circuits. The sensor, along with its dimensions, is shown in Figure 2. The fingernail is connected to the robot's computer via USB and fits into the upper segment of one of the robot's fingers. For this project, data will be recorded from the accelerometer, the microphone, and the camera. The microphone input will be recorded at 44.1 kHz. The collected data will be transferred to the PC over USB by the microcontroller on the PCB.

Articles with zippers

The robot will interact with several pieces of clothing and possibly bags during the experiment. The clothes and bags will have a variety of material properties. The only restriction on the clothes that can be used is that they can be laid flat on an ironing board, because an ironing board will be kept stationary in front of the robot and the clothes will be mounted on it. The robot will then interact with them and record data.

Robot Behaviors

The robot will perform three behaviors for each trial:
1. Position the hand at a random start position on the surface.
2. Scratch perpendicular to the zipper.
3. Move the hand away from the surface.
The articles will be placed in front of the robot at the start of each trial.

Setup

The following images are taken from the camera on the robot and show an example of the project setup. The green lines indicate an example trajectory the robot might take during a trial. The orange rectangle around the trajectories indicates the bounding box that restricts the trajectory; exploration stays within that box.
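The three behaviors above can be sketched as a simple trial loop. The bounding-box coordinates and the action representation here are illustrative placeholders, not the real WAM control interface.

```python
import random

# Exploration bounding box in workspace coordinates; values are illustrative,
# standing in for the orange box seen from the robot's camera.
BOX = {"x": (0.0, 0.4), "y": (0.0, 0.3)}

def random_start(box, rng=random):
    """Behavior 1: pick a random start position inside the exploration box."""
    return (rng.uniform(*box["x"]), rng.uniform(*box["y"]))

def scratch_trajectory(start, length=0.1, box=BOX):
    """Behavior 2: a straight scratch from the start, clipped to the box.
    The fixed +x direction is a stand-in for 'perpendicular to the zipper'."""
    x0, y0 = start
    x1 = min(x0 + length, box["x"][1])
    return [(x0, y0), (x1, y0)]

def run_trial(rng=random):
    """One trial: position the hand, scratch, retract. Returns logged actions."""
    start = random_start(BOX, rng)
    traj = scratch_trajectory(start)
    return [("position", traj[0]), ("scratch", traj), ("retract", None)]
```

On the real robot, each logged action would instead be a command to the arm controller, and data recording would run during the "scratch" step.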

Methodology

Data Collection

Experimental data will be collected during the scratching behavior. Each interaction will be captured from the robot's camera as a sequence of 640x480 color images, from the accelerometer in the robot's fingernail, and from the microphone. An example trial proceeds as follows:
1. Clothing is placed on the ironing board in front of the robot.
2. The robot randomly chooses a motion to follow within the bounding box.
3. It scratches along the chosen trajectory.
4. While the robot is scratching the surface, data from the microphone, camera, and accelerometer are recorded.
5. If there is more clothing to scratch or more trials to run, steps 1-4 are repeated.

Zipper Detection using Accelerometer Data

The position of the zipper, found using the timestamps on the accelerometer data, will be related back to the corresponding visual frame. From these labeled frames, the robot will learn a visual model of a zipper.
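One way to sketch the accelerometer-to-frame step: flag high-energy vibration windows in the accelerometer stream, then map their timestamps to camera frame indices. The RMS-threshold detector and the shared-clock assumption between sensor and camera are my simplifications, not the project's final method.

```python
import numpy as np

def detect_vibration_events(accel, fs=3200.0, win=0.05, k=3.0):
    """Return start times (s) of windows whose RMS energy exceeds k times
    the median window RMS. A crude stand-in for a real zipper detector."""
    n = int(win * fs)
    windows = accel[: len(accel) // n * n].reshape(-1, n)
    rms = np.sqrt((windows ** 2).mean(axis=1))
    return np.flatnonzero(rms > k * np.median(rms)) * win

def time_to_frame(t_event, t_video_start=0.0, fps=30.0):
    """Map an event timestamp to the nearest camera frame index, assuming the
    accelerometer and camera timestamps share a common clock reference."""
    return int(round((t_event - t_video_start) * fps))

# Synthetic 1 s recording: quiet noise with a burst (the "zipper") at 0.5-0.6 s.
rng = np.random.default_rng(1)
fs = 3200
accel = 0.01 * rng.standard_normal(fs)
accel[int(0.5 * fs):int(0.6 * fs)] += np.sin(2 * np.pi * 200 * np.arange(int(0.1 * fs)) / fs)
events = detect_vibration_events(accel)
frames = [time_to_frame(t) for t in events]
print(events, frames)  # events near 0.5 s map to frames near index 15
```

The resulting frame indices are what would label camera images as "contains a zipper here" for learning the visual model.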

Evaluation

For this project, the robot will be evaluated on how well it can recognize a zipper. Success is defined as the robot's ability to determine, for each trial, whether or not a zipper was present. This can then be extended to other surface irregularities: the robot could be given clothing with texture breaks to check whether it can distinguish those from a zipper.

Research Team

Ritika Sahai

Ritika Sahai is currently a Master's student in Computer Engineering. I work with Dr. Alex Stoytchev in his Developmental Robotics Lab. I have presented a paper on learning to identify writing instruments and writable surfaces at a Mobile Manipulation Workshop. As a part of the Developmental Robotics Lab, I have also worked on vibrotactile recognition of surface textures. For my senior design project, I worked with a partner to build the fingernail with its different sensors (accelerometer, microphone, temperature sensor, and microcontroller) and connect it to the robot's computer via USB.

References

S. Lederman and R. Klatzky, "Haptic classification of common objects: knowledge-driven exploration," Cognitive Psychology, vol. 22, pp. 421-459, 1990.

D. Stack and M. Tsonis, "Infants' haptic perception of texture in the presence and absence of visual cues," British Journal of Developmental Psychology, vol. 17, pp. 97-110, 1999.

W. M. Bergmann-Tiest and A. M. L. Kappers, "Haptic and visual perception of roughness," Acta Psychologica, vol. 124, no. 2, pp. 177-189, 2007.

D. Lynott and L. Connell, "Modality exclusivity norms for 423 object properties," Behavior Research Methods, vol. 41, no. 2, 2009.

G. Cannata, M. Maggiali, G. Metta, and G. Sandini, "An embedded artificial skin for humanoid robots," in Proc. of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, 2008.

B. Argall, E. Sauser, and A. Billard, "Tactile guidance for policy refinement and reuse," in Proceedings of the IEEE International Conference on Development and Learning (ICDL), Ann Arbor, MI, Aug 18-21, 2010.

M. Tanaka, J. Leveque, H. Tagami, K. Kikuchi, and S. Chonan, "The haptic finger: a new device for monitoring skin condition," Skin Research and Technology, vol. 9, no. 1, pp. 131-136, 2003.

R. Howe and M. Cutkosky, "Dynamic tactile sensing: perception of fine surface features with stress rate sensing," IEEE Transactions on Robotics and Automation, vol. 9, no. 2, pp. 140-151, 1993.

K. Hosoda, Y. Tada, and M. Asada, "Anthropomorphic robotic soft fingertip with randomly distributed receptors," Robotics and Autonomous Systems, vol. 54, no. 2, pp. 104-109, 2006.

N. Jamali and C. Sammut, "Material classification by tactile sensing using surface textures," in Proc. of the 2010 IEEE International Conference on Robotics and Automation (ICRA), May 2010, pp. 2336-2341.

L. Beccai, S. Roccella, A. Arena, F. Valvo, P. Valdastri, A. Menciassi, M. C. Carrozza, and P. Dario, "Design and fabrication of a hybrid silicon three-axial force sensor for biomechanical applications," Sensors and Actuators A: Physical, vol. 120, no. 2, pp. 370-382, 2005.

F. de Boissieu, C. Godin, C. Serviere, and D. Baudois, "Tactile texture recognition with a 3-axial force MEMS integrated artificial finger," in Proceedings of the 2009 Robotics: Science and Systems Conference (RSS), Seattle, WA.

J. Romano, T. Yoshioka, and K. Kuchenbecker, "Automatic filter design for synthesis of haptic textures from recorded acceleration data," in Proceedings of the IEEE International Conference on Robotics and Automation, Anchorage, AK, 2010, pp. 1815-1821.

K. Kuchenbecker, "Haptography: capturing the feel of real objects to enable authentic haptic rendering," in Proceedings of the Haptics in Ambient Systems (HAS) Workshop, Quebec City, Canada, Feb 2008.

R. Howe and M. Cutkosky, "Sensing skin acceleration for slip and texture perception," in Proceedings of the IEEE International Conference on Robotics and Automation, 1989, pp. 145-150.

V. Sukhoy, R. Sahai, J. Sinapov, and A. Stoytchev, "Vibrotactile recognition of surface textures by a humanoid robot," in Proceedings of the Humanoids 2009 Workshop "Tactile Sensing in Humanoids: Tactile Sensors and Beyond," Paris, France, Dec 7, 2009, pp. 57-60.

D. Handelman, G. Franken, and H. Komsuoglu, "Agile and dexterous robot for inspection and EOD operations." [http://www.americanandroid.com/dhandelman_spie_2010.pdf]