WIRELESS VOICE CONTROLLED ROBOTIC ARM

1 R. ASWINBALAJI, 2 A. ARUNRAJA
1 BE ECE, Sri Ramakrishna Engineering College, Coimbatore, India
2 ME EST, Sri Ramakrishna Engineering College, Coimbatore, India
aswinbalaji94@gmail.com, arunraja21@gmail.com

ABSTRACT - In today's world, most work in almost every sector is carried out by robots or robotic arms with different numbers of degrees of freedom (DOF), as the application requires. This paper deals with the design and implementation of a voice controlled robotic arm. The system design is divided into three parts: the voice recognition module, the robotic arm, and the platform. The aim is to create a wireless voice controlled arm that can be operated over a range of 10 to 50 meters using a ZigBee transmitter and receiver. Voice recognition is "the technology by which sounds, words or phrases spoken by humans are converted into electrical signals, and these signals are transformed into coding patterns to which meaning has been assigned". The motions performed by the robotic arm are PICK and PLACE/DROP and RAISING and LOWERING of objects; the motions performed by the platform are FORWARD, BACKWARD, RIGHT and LEFT.

Keywords: voice recognition, DOF, ZigBee module, gripper, stepper motor

I. INTRODUCTION

A robot may be defined as an electro-mechanical device that is capable of sensing its surroundings and taking decisions (commands). In general, a robot must be able to move (by mechanical actuation), to sense (by transducers), and to take decisions (by remote control or artificial intelligence). A robotic arm is a robot manipulator that can perform functions similar to a human arm. Robotic arms play a vital role in industrial applications: most of them perform tasks such as welding, trimming, picking, placing and painting. Their biggest advantage is that they can work in hazardous areas and in areas that cannot be accessed by humans. Common control variants include keypad control, voice control and gesture control. However, most industrial robots are still programmed using the typical teaching process, which remains a tedious and time-consuming task requiring technical expertise; there is therefore a need for new and easier ways of programming robots.

The prime aim of this project is that the platform starts moving as soon as the voice command is received from the operator; if the voice is not audible, the alternative is to operate the robot through a remote control. The goal of this paper is to develop methodologies that help users control and program a robot with a high level of abstraction from the robot-specific language, i.e. to simplify robot programming.

II. RELATED WORK

In the robotics field, several research efforts have been directed towards recognizing human gestures. A few popular systems are described below.

2.1 VISION-BASED GESTURE RECOGNITION

This recognition system was developed in the field of service robotics, where the researchers designed a robot to perform a cleaning task. They designed a gesture-based interface to control a mobile robot equipped with a manipulator. The interface uses a camera to track a person and recognize gestures involving arm motion. A fast, adaptive tracking algorithm enables the robot to track and follow a person reliably through office environments with changing lighting conditions. Two gesture recognition methods, a template-based approach and a neural-network-based approach, were compared and combined with the Viterbi algorithm (sketched below) for the recognition of gestures defined through arm motion. The result is an interactive clean-up task in which the user guides the robot to the specific locations that need to be cleaned and instructs it to pick up trash.
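Neither the cited system's code nor its model is reproduced in this paper, so the following is only a minimal, self-contained sketch of the Viterbi decoding step mentioned above (the same decoding underlies the Hidden Markov Models noted in Section 3.3). The two hidden states, the observation alphabet, and all probabilities are illustrative assumptions, not values from the cited work.

```c
#include <stdio.h>

#define N_STATES 2   /* toy hidden states, e.g. two phases of an arm gesture */
#define N_OBS    3   /* discrete observation symbols from a tracker */
#define T_LEN    4   /* length of the observation sequence */

/* Toy model parameters (illustrative only). */
static const double start_p[N_STATES]           = {0.6, 0.4};
static const double trans_p[N_STATES][N_STATES] = {{0.7, 0.3}, {0.4, 0.6}};
static const double emit_p[N_STATES][N_OBS]     = {{0.5, 0.4, 0.1}, {0.1, 0.3, 0.6}};

/* Standard Viterbi decoding: fills `path` with the most likely state sequence
 * for the given observations and returns its probability. */
static double viterbi(const int *obs, int t_len, int *path)
{
    double v[T_LEN][N_STATES];
    int backptr[T_LEN][N_STATES];

    for (int s = 0; s < N_STATES; s++) {
        v[0][s] = start_p[s] * emit_p[s][obs[0]];
        backptr[0][s] = 0;
    }
    for (int t = 1; t < t_len; t++) {
        for (int s = 0; s < N_STATES; s++) {
            double best = -1.0;
            int best_prev = 0;
            for (int p = 0; p < N_STATES; p++) {
                double prob = v[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]];
                if (prob > best) { best = prob; best_prev = p; }
            }
            v[t][s] = best;
            backptr[t][s] = best_prev;
        }
    }
    /* Pick the best final state, then backtrack through the pointers. */
    double best = -1.0;
    for (int s = 0; s < N_STATES; s++)
        if (v[t_len - 1][s] > best) { best = v[t_len - 1][s]; path[t_len - 1] = s; }
    for (int t = t_len - 1; t > 0; t--)
        path[t - 1] = backptr[t][path[t]];
    return best;
}

int main(void)
{
    int obs[T_LEN] = {0, 1, 2, 2};   /* example observation sequence */
    int path[T_LEN];
    double p = viterbi(obs, T_LEN, path);

    printf("most likely state sequence (p = %g): ", p);
    for (int t = 0; t < T_LEN; t++) printf("%d ", path[t]);
    printf("\n");
    return 0;
}
```

In a real gesture or speech recognizer the observations would come from the tracker or feature extractor and the model would have many more states; only the dynamic-programming structure carries over from this sketch.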

2.2 MOTION CAPTURE SENSOR RECOGNITION

This recognition technique made it possible to implement an accelerometer-based system that communicates wirelessly with an industrial robotic arm. In that project the robotic arm is driven by an ARM7-based LPC1768 core. A MEMS three-axis accelerometer captures the gestures of the human arm and produces three analog output voltages, one per axis, and two flex sensors are used to control the gripper movement.

2.3 FINGER GESTURE RECOGNITION SYSTEM BASED ON ACTIVE TRACKING MECHANISMS

The prime aim of the system proposed by the author, based on the above-mentioned recognition methodology, is to make it feasible to interact with a portable device or a computer through the recognition of finger gestures. Apart from gestures, speech can also be another mode of interaction, so this system can form part of a so-called Perceptual User Interface (PUI). The system could be used for virtual reality or augmented reality applications.

2.4 ACCELEROMETER-BASED GESTURE RECOGNITION

This gesture recognition methodology has become increasingly popular in a very short span of time. The low to moderate cost and relatively small size of accelerometers are the two factors that make them an effective tool for detecting and recognizing human body gestures. Several studies have been conducted on the recognition of gestures from acceleration data using Artificial Neural Networks (ANNs).

III. TECHNICAL REQUIREMENTS

The technical requirements chosen as a basis for the efficient functioning of the system are as follows.

3.1 MICROCONTROLLER

A PIC microcontroller is used as the hardware platform. It is the controlling unit to which all other components (voice recognition module, motors, RF modules, etc.) are interfaced. Two such microcontrollers are used in this project, one at the transmitting end and one at the receiving end.

3.2 ZIGBEE MODULE

ZigBee is a specification for a suite of high-level communication protocols used to create personal area networks built from small, low-power digital radios. ZigBee is based on the IEEE 802.15.4 standard. Although its low power consumption limits line-of-sight transmission distances to 10-100 meters, depending on power output and environmental characteristics, ZigBee devices can transmit data over longer distances by passing it through a mesh network of intermediate devices. ZigBee is typically used in low data rate applications that require long battery life and secure networking (ZigBee networks are secured by 128-bit symmetric encryption keys). ZigBee has a defined rate of 250 kbit/s, best suited for intermittent data transmission from a sensor or input device. In this project a CC2500 transceiver module is used; the 4-bit motion commands are framed and sent over this serial link (see the sketch after the applications list).

APPLICATIONS
- Home entertainment and control
- Wireless sensor networks
- Industrial control
- Embedded sensing
- Medical data collection
- Smoke and intruder warning
- Building automation
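The paper states that commands travel over the ZigBee/CC2500 link as 4-bit digital data (Section 4.3) but does not give a frame format, so the following C sketch is only an assumed illustration: a 4-bit command is wrapped in a hypothetical 3-byte frame (sync byte, payload, XOR checksum) and the receiver validates and unpacks it. The helper uart_write_byte() is a stand-in for the platform-specific UART routine; here it merely logs bytes so the example runs on a desktop.

```c
#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical frame layout: [0xA5 sync] [payload] [checksum].
 * The low nibble of the payload carries the 4-bit motion command;
 * the checksum is a simple XOR of sync and payload. */
#define FRAME_SYNC 0xA5u

/* Stand-in for the platform-specific UART routine feeding the CC2500
 * transceiver. It records the byte so the sketch is runnable on a desktop;
 * on the real hardware it would write the PIC's transmit register. */
static uint8_t tx_log[16];
static int     tx_len = 0;

static void uart_write_byte(uint8_t b) { tx_log[tx_len++] = b; }

/* Transmitter side: pack a 4-bit command into a 3-byte frame and send it. */
static void send_command(uint8_t cmd4)
{
    uint8_t payload  = cmd4 & 0x0Fu;
    uint8_t checksum = FRAME_SYNC ^ payload;
    uart_write_byte(FRAME_SYNC);
    uart_write_byte(payload);
    uart_write_byte(checksum);
}

/* Receiver side: feed incoming bytes one at a time; returns true and stores
 * the decoded 4-bit command once a complete, valid frame has arrived. */
static bool receive_command(uint8_t byte_in, uint8_t *cmd4)
{
    static uint8_t buf[3];
    static int     idx = 0;

    if (idx == 0 && byte_in != FRAME_SYNC)   /* wait for the sync byte */
        return false;
    buf[idx++] = byte_in;
    if (idx < 3)
        return false;
    idx = 0;                                 /* frame complete: validate it */
    if ((uint8_t)(buf[0] ^ buf[1]) != buf[2])
        return false;                        /* checksum mismatch: drop frame */
    *cmd4 = buf[1] & 0x0Fu;
    return true;
}

int main(void)
{
    send_command(0x3);                       /* e.g. 0x3 = "RIGHT" (assumed code) */

    uint8_t cmd;
    for (int i = 0; i < tx_len; i++)         /* loop bytes back as if received */
        if (receive_command(tx_log[i], &cmd))
            printf("decoded command nibble: 0x%X\n", cmd);
    return 0;
}
```

Even a simple checksum like this lets the receiver silently drop corrupted frames rather than moving the arm or platform on bad data.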

3.3 VOICE RECOGNITION

Speech recognition is the process of converting an acoustic signal, captured by a microphone or a telephone, into a set of words. There are two important parts to speech recognition: recognizing the series of sounds, and identifying the word from those sounds. Candidate hardware modules include the Voice Extreme Module, the HM2007 speech recognition chip, the OKI VRP6679 voice recognition processor, and Speech Commander from Verbex Voice Systems.

The requirements on the spoken-language interface are:
- The spoken language interface should be in English.
- The robot should understand the task from the dialogue.
- The system should be speaker independent.
- The robot should give user feedback; for example, if it does not understand a command, it replies "I don't understand".

The recognized sentences and their purposes are listed below:

SENTENCE    PURPOSE
Forward     Move the platform forward
Backward    Move the platform backward
Right       Turn the platform right
Left        Turn the platform left
Upward      Move the arm upward
Downward    Move the arm downward
Pick        Close the gripper to pick up the object
Drop        Open the gripper to drop/place the object

The most popular and dominant technique over the last two decades is the Hidden Markov Model (HMM). Other techniques are also used for speech recognition systems, including Artificial Neural Networks (ANN), the Back Propagation Algorithm (BPA), the Fast Fourier Transform (FFT), and Learning Vector Quantization (LVQ). A sketch of how the recognized sentences can be mapped onto 4-bit command codes follows.
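The paper does not specify the numeric codes assigned to each sentence, so the following C sketch assumes one possible assignment: each word returned by the recognition module is looked up in a small table and translated into the 4-bit code that the framing sketch in Section 3.2 could transmit. The code values and the lookup_command() helper are hypothetical.

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Assumed 4-bit command codes; the paper does not give the actual values. */
enum {
    CMD_FORWARD  = 0x1, CMD_BACKWARD = 0x2,
    CMD_RIGHT    = 0x3, CMD_LEFT     = 0x4,
    CMD_UPWARD   = 0x5, CMD_DOWNWARD = 0x6,
    CMD_PICK     = 0x7, CMD_DROP     = 0x8,
    CMD_INVALID  = 0x0
};

/* Command table mirroring the SENTENCE/PURPOSE list above. */
static const struct { const char *word; uint8_t code; } command_table[] = {
    { "forward",  CMD_FORWARD  }, { "backward", CMD_BACKWARD },
    { "right",    CMD_RIGHT    }, { "left",     CMD_LEFT     },
    { "upward",   CMD_UPWARD   }, { "downward", CMD_DOWNWARD },
    { "pick",     CMD_PICK     }, { "drop",     CMD_DROP     },
};

/* Translate a recognized word into its 4-bit code (CMD_INVALID if unknown). */
static uint8_t lookup_command(const char *word)
{
    for (size_t i = 0; i < sizeof command_table / sizeof command_table[0]; i++)
        if (strcmp(word, command_table[i].word) == 0)
            return command_table[i].code;
    return CMD_INVALID;   /* would trigger the "I don't understand" feedback */
}

int main(void)
{
    printf("pick  -> 0x%X\n", (unsigned)lookup_command("pick"));
    printf("hello -> 0x%X\n", (unsigned)lookup_command("hello"));
    return 0;
}
```

Keeping the vocabulary in one table makes it easy to add the more complex sentences mentioned in the conclusion without touching the transmission code.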

IV. OVERALL DESIGN OF THE SYSTEM

In this paper, a robotic arm with three degrees of freedom is designed, which is able to pick up a desired object and place it at a desired location. Based on functionality, the system is categorized into the following parts: the robotic arm, the platform, and the communication system [2].

4.1 ROBOTIC ARM

This is the vital part of the system, as it performs the pick-and-drop task of the project. The robotic arm is equipped with a gripper (for picking and placing objects) and an arm (for raising and lowering objects); both the arm and the gripper are driven by servo motors. Their movements are synchronized with the voice commands of the user operating the robotic arm. The corresponding voice commands, shown in Figure 4, are described below:
- Downward: lower the arm
- Upward: raise the arm
- Pick: close the gripper mouth so that it can pick up the object
- Drop: open the gripper mouth so that it can place/drop the object

4.2 PLATFORM (ROBOTIC MOVEMENT)

The platform is the part of the project onto which the robotic arm is mounted. It is fitted with stepper motors, and its movement is synchronized with the voice commands of the user operating the robotic arm. This is the part that carries the entire system from one place to another.
- Forward: make the platform move forward
- Backward: make the platform move backward
- Right: make the platform turn right
- Left: make the platform turn left

A receiver-side sketch of how these arm and platform commands can be dispatched to the motors is given after this list.
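The receiving-end firmware is not listed in the paper, so this C sketch is only an assumed illustration of the dispatch step: a decoded 4-bit code (using the same hypothetical values as the Section 3.3 sketch) is routed to stub routines standing in for the servo-driven arm and gripper and the stepper-driven platform.

```c
#include <stdint.h>
#include <stdio.h>

/* Assumed 4-bit codes, matching the lookup table sketched in Section 3.3. */
enum {
    CMD_FORWARD = 0x1, CMD_BACKWARD = 0x2, CMD_RIGHT = 0x3, CMD_LEFT = 0x4,
    CMD_UPWARD  = 0x5, CMD_DOWNWARD = 0x6, CMD_PICK  = 0x7, CMD_DROP = 0x8
};

/* Stub actuator routines: on the real hardware these would generate servo PWM
 * (arm and gripper) or stepper drive sequences (platform wheels). Here they
 * just print what they would do, so the sketch runs on a desktop. */
static void platform_drive(const char *dir) { printf("platform: %s\n", dir); }
static void arm_move(const char *dir)       { printf("arm: %s\n", dir); }
static void gripper_set(const char *state)  { printf("gripper: %s\n", state); }

/* Dispatch one decoded command nibble to the matching actuator action. */
static void dispatch_command(uint8_t cmd4)
{
    switch (cmd4) {
    case CMD_FORWARD:  platform_drive("forward");  break;
    case CMD_BACKWARD: platform_drive("backward"); break;
    case CMD_RIGHT:    platform_drive("right");    break;
    case CMD_LEFT:     platform_drive("left");     break;
    case CMD_UPWARD:   arm_move("up");             break;
    case CMD_DOWNWARD: arm_move("down");           break;
    case CMD_PICK:     gripper_set("closed");      break;
    case CMD_DROP:     gripper_set("open");        break;
    default:           printf("unknown command 0x%X ignored\n", cmd4); break;
    }
}

int main(void)
{
    /* Example: raise the arm, close the gripper, then drive forward. */
    uint8_t sequence[] = { CMD_UPWARD, CMD_PICK, CMD_FORWARD };
    for (unsigned i = 0; i < sizeof sequence; i++)
        dispatch_command(sequence[i]);
    return 0;
}
```

On the actual PIC the stubs would be replaced by PWM and stepper-sequencing routines; the switch structure itself is the point of the sketch.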

4.3 COMMUNICATION SYSTEM (VOICE RECOGNITION & ZIGBEE TRANSCEIVER)

This part is the heart of the entire project: without an effective and reliable communication system no system or project can work, and the same is true here. The ZigBee module, described in Section 3.2, is the only communication equipment required in this project. It transmits the user's voice commands or remote-control inputs, encoded as 4-bit digital data, wirelessly to the receiver, which decodes the received 4-bit data and moves the arm, gripper and platform accordingly. The block diagrams depict the entire communication system of the project: the transmitter end (orange) and the receiver end (brown), joined at the linker (the circle labelled "A").

V. CONCLUSION

Human-robot interaction (HRI) is an important, attractive and challenging area. The popularity of service robots has increased researchers' interest in robot user interfaces that are more user friendly in a social context. Speech recognition (SR) technology gives researchers the opportunity to add natural language (NL) communication with the robot in a natural and easy way. The working domain of a service robot is society, helping people in everyday life, and so it should be controllable by humans. Our future work will focus on introducing more complex activities and sentences to the system, and on introducing non-speech sound recognition, such as footsteps (close) and footsteps (distant). Humans normally accompany spoken language with gestures such as pointing at an object or location: when speaking with another person about a nearby object, they typically point at it with their fingers. Such an interface is called a multi-modal communication interface.

VI. ACKNOWLEDGMENT

We are greatly thankful to our project coordinator and guide A. Arunraja, M.E., Assistant Professor, Department of Embedded System Technology, for the valuable guidance and motivation that helped us to complete this project on time. We thank all the teaching and non-teaching staff members of the Electronics and Communication department for their passionate support, for helping us to identify our mistakes, for their appreciation as we worked towards our goal, and for the information and resources that helped us to complete the project successfully. We would also like to record our deepest gratitude to our parents for their constant encouragement and support, which motivated us to complete the project on time.

REFERENCES

[1] Abram Katz. Operating room computers obey voice commands. New Haven Register, 27 December 2001. http://www.europe.stryker.com/i-suite/de/newhaven-yale.pdf (visited 2005-08-15).
[2] Braitenberg Vehicles: Networks on Wheels. http://www.mindspring.com/_gerken/vehicles (visited 2005-11-24).
[3] Rodney A. Brooks, Cynthia Breazeal, Matthew Marjanovic, Brian Scassellati, and Matthew M. Williamson. The Cog project: Building a humanoid robot. Lecture Notes in Computer Science, 1562:52-87, 1999. citeseer.ist.psu.edu/brooks99cog.html (visited 2005-10-05).
[4] Guido Bugmann. Effective spoken interfaces to service robots: open problems. In AISB'05: Social Intelligence and Interaction in Animals, Robots and Agents, SSAISB 2005 Convention, pages 18-22, Hatfield, UK, April 2005.

[5] Michael Cowling and Renate Sitte. Analysis of speech recognition techniques for use in a non-speech sound recognition system. http://www.elec.uow.edu.au/staff/wysocki/dspcspapers/004.pdf (visited 2005-07-11).
[6] Survey of the State of the Art in Human Language Technology. Cambridge University Press, ISBN 0-521-59277-1, 1996. Sponsored by the National Science Foundation and the European Union, with additional support from the Center for Spoken Language Understanding, Oregon Graduate Institute, USA, and the University of Pisa, Italy. http://www.cslu.ogi.edu/hltsurvey/ (visited 2005-07-11).
[7] Gregory Dudek and Michael Jenkin. Computational Principles of Mobile Robotics. The Press Syndicate of the University of Cambridge, Cambridge, UK, first edition, 2000.
[8] Shafkat Kibria. Speech Recognition for Robotic Control. December 2005.