ARTIFICIAL ROBOT NAVIGATION BASED ON GESTURE AND SPEECH RECOGNITION


*Miss. Kadam Vaishnavi Chandrakumar, **Prof. Hatte Jyoti Subhash
*Research Student, M.S.B. Engineering College, Latur, India
**Professor, Dept. of Electronics, M.S.B. Engineering College, Latur, India

ABSTRACT

Human-computer interaction is an interesting topic in artificial intelligence. Artificial navigation is an application of human-computer interaction in which the actions of a target device are controlled by speech or gesture information. It allows us to control the target device at a distance without any remote-control hardware. With the development of science and technology, human-computer interaction has gradually shifted from being computer-centred to being people-centred. Many techniques have been developed to control robotic systems through various human-computer interaction mechanisms. As gesture and voice are people's everyday means of communication, they are natural, intuitive and clear, and have therefore become mainstream modes of human-computer interaction. This project presents a wireless interface for controlling a robot using voice and gesture commands through a computer. It also includes an Android application for remote operation: commands are sent to the receiver to move the robot forward, backward, left or right. Remote operation is achieved from any smartphone or tablet running the Android OS, via a GUI (Graphical User Interface) based touch-screen operation.

Keywords: Human Computer Interaction (HCI), GUI (Graphical User Interface), Artificial Intelligence, Artificial Navigation, Voice Recognition, Gesture Recognition.

INTRODUCTION

The robot is an integral part of automating the flexible manufacturing system. The way humans interact with computers is constantly evolving, the general purpose being to increase the efficiency and effectiveness with which interactive tasks are completed.
This project presents a wireless interface to control a robot using gesture and voice commands. The robot can also be controlled through an Android application installed on an Android-based mobile device, which is used to direct the robot's movement. The purpose of this review is to introduce the field of gesture recognition as a mechanism for interaction with computers. Gestures are expressive, meaningful body motions involving physical movements of the fingers, hands, arms, head, face or body, with the intent of (1) conveying meaningful information or (2) interacting with the environment. Hand gesture recognition finds applications in varied domains including virtual environments, smart surveillance, sign language translation and medical systems. Hand gestures are an attractive method of communication

with people who are deaf or mute. Hand gestures can serve as remote controls for television sets, stereos and room lights, and household robots could be controlled with hand gestures. Voice recognition is the process of taking spoken words as input to a computer program. This process is important to virtual reality because it provides a fairly natural and intuitive way of controlling the simulation while allowing the user's hands to remain free. A robot with voice recognition aims to create a wireless voice-controlled robot which can be operated over a range of 10 to 50 meters using a transmitter and receiver. It acquires its information through a speech recognizer. After the speech is processed, the necessary motion instructions are given to the robotic platform via an RF link. The speech recognition software running on a PC is capable of identifying voice commands issued by an authenticated user. The robot can also be moved using an Android application installed on an Android-based mobile or tablet. In today's age, the robotic industry has been developing many new trends to increase the efficiency, accessibility and accuracy of its systems. Although robots can replace humans, they still need to be controlled by humans. Robots can be wired or wireless, both requiring a controller device. This project falls under three domains: real-time image processing, robotics and wireless communication. The prominent benefit of such a system is that it presents a natural way to send information to the robot.

LITERATURE SURVEY

In recent years, hand gesture recognition has been gaining great importance in human-computer interaction (HCI) and human-robot interaction (HRI). Different approaches have appeared, making use of different sensors and devices. One method for realizing this task was implemented using an accelerometer.
This approach requires predefined data, including the maximum and minimum values corresponding to each gesture, so that a real-time hand gesture can be compared against them. In another method, pattern matching involves comparing the current global maxima and minima with previously stored values. Another method for gesture recognition was implemented using Microsoft's Kinect sensor, which is capable of capturing both RGB and depth data. This approach examines specific hand motions in addition to full-body motion for more refined recognition. However, this method is quite expensive due to the high cost of the Kinect sensor. Hand detection for human-computer interaction has been implemented using OpenCV as a tool, where a finger count was generated from the convexity defects obtained by drawing a contour of the hand and computing its convex hull through image processing. A humanoid robot controlled by body gesture and speech was developed using the Kinect sensor, calculating the angles between joints of the body gesture produced by the human. A robot navigated by flex sensors was developed for military purposes, although the glove limits the free movement of the hand. A gesture-controlled robot using a Wi-Fi shield, wirelessly controlled through thresholding, contours and convex hulls, has also been developed. A robotic arm whose movements were wirelessly controlled by gesture recognition based on colour recognition was presented, and a mathematical approach for interpreting the given gesture was put forward. This involved

calculating the centroid of the palm and then drawing a circular region of specific radius around it so that the number of extended fingers can be counted. Hand-wearable devices such as sensor gloves have been used, although they are usually expensive and intrusive for the user. Other, less intrusive wireless devices such as Wi-Fi controllers or sensing rings have appeared to overcome these drawbacks. Cameras and computer vision have proved to be useful tools for this task. In addition, other contact-free sensors have emerged lately to detect hand motion and interact with different devices. However, despite all the previous work, a fully satisfactory solution to the gesture recognition problem has not yet been found. Even with the appearance of these new sensors, finding and segmenting the user's hand in an image remains a meaningful problem. It is still unsolved, especially in situations with occlusions, varying lighting conditions, or other skin-coloured objects in the scene apart from the hand. In recent years, hand gesture recognition applications have focused on the recognition problem itself, simplifying the problem of finding the user's hand. Common simplifications are the assumption of particular situations, such as the hand being the front-most object, or the use of full-body tracking algorithms. Under these assumptions, different gesture classification methods such as Hidden Markov Models, k-Nearest Neighbours, Template Matching or Finite State Machines have reached high classification rates.

PROPOSED SYSTEM

A. Robot Controlled by Gesture Information

1. Gesture Signals

In order to communicate between human and robot, we make use of hand gestures. These gestures are programmed so as to generate commands for the robot to move forward, backward, right and left. More gestures can be incorporated for navigating in different directions. A gesture is an analog activity that can be acquired using various sensors.
Here, a camera is used as the sensor for capturing gestures.

2. Gesture Signal Processing

The following flowchart explains the flow of gesture capture, processing and recognition. It starts with capturing an image of the gesture. The gesture is then processed by comparing it with the given database, and a command signal is generated accordingly. This signal is given to the robotic system, and the robot moves in the desired direction.
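The capture-compare-command flow just described can be sketched as follows. This is a toy Python illustration under stated assumptions: each gesture in the database is represented as a small binarized image (a list of 0/1 rows), and the captured frame is matched to the template with the fewest differing pixels. The actual system performs this processing on camera frames in MATLAB, and the templates below are made up for the example.

```python
# Toy sketch of the gesture-processing stage: compare a binarized
# gesture image against a small database of templates and emit the
# movement command of the closest match. Templates are hypothetical.

GESTURE_DB = {
    "forward":  [[0, 1, 0],
                 [1, 1, 1],
                 [0, 1, 0]],
    "left":     [[1, 0, 0],
                 [1, 1, 0],
                 [1, 0, 0]],
    "right":    [[0, 0, 1],
                 [0, 1, 1],
                 [0, 0, 1]],
}

def pixel_distance(a, b):
    """Number of pixels at which two equal-sized binary images differ."""
    return sum(
        pa != pb
        for row_a, row_b in zip(a, b)
        for pa, pb in zip(row_a, row_b)
    )

def recognize(frame, db=GESTURE_DB, max_distance=2):
    """Return the command whose template is closest to the frame,
    or None if even the best match differs in too many pixels."""
    best_cmd, best_dist = None, max_distance + 1
    for cmd, template in db.items():
        d = pixel_distance(frame, template)
        if d < best_dist:
            best_cmd, best_dist = cmd, d
    return best_cmd
```

Rejecting frames whose best match is still far away (the `max_distance` threshold) is what keeps an ambiguous gesture from generating a spurious movement command.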

Fig. 1 summarizes the gesture-recognition flow as a block diagram: capture an image of the gesture, process the image, generate the command signal, transmit the signal wirelessly to the robot, and the robot moves in the desired direction.

Fig. 1: Flowchart for gesture recognition

B. Robot Controlled by Speech Signal

In a voice recognition system, speech technology allows computers equipped with a sound input source, such as a microphone, to interpret human speech. The commands implemented are: Forward, Reverse, Left, Right and Stop (for body movement); Up and Down (for arm movement); Open and Close (for gripper movement).
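The spoken commands listed above must ultimately become motion instructions sent over the RF link to the microcontroller. The following Python sketch shows one way the recognized word could be mapped to a one-byte code; the byte values are assumptions, since the paper does not specify the wire encoding, and the actual recognizer runs in MATLAB.

```python
# Minimal sketch of mapping recognized speech commands to one-byte
# codes for the RF link. The byte values are hypothetical.

RF_CODES = {
    # body movement
    "forward": b"F", "reverse": b"B", "left": b"L",
    "right": b"R", "stop": b"S",
    # arm movement
    "up": b"U", "down": b"D",
    # gripper movement
    "open": b"O", "close": b"C",
}

def command_to_rf(word):
    """Map a recognized spoken word to its RF code, or None if the
    word is not a known command (so recognizer noise is ignored)."""
    return RF_CODES.get(word.strip().lower())
```

Returning `None` for unknown words gives the control loop a safe default: an unrecognized utterance simply leaves the robot in its current state.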

Fig. 2 shows the speech-recognition flow: start, read the sound, compare the sound with the database, check whether it matches any command, send the instruction to the port, end.

Fig. 2: Flowchart for speech recognition

C. Robot Controlled by Android App

Commands are sent to the receiver through the Android application to move the robot forward, backward, left or right. Four motors are interfaced to the microcontroller: two motors are used for the arm and gripper movement of the robot, while the other two are used for body movement. The Android application device acts as a remote-control transmitter with the advantage of adequate range, while at the receiver end a Wi-Fi device feeds the microcontroller, which drives the DC motors via a motor driver IC. Remote operation is achieved from any smartphone or tablet running the Android OS, via a GUI (Graphical User Interface) based touch-screen operation.
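The mapping from a movement command to the two body motors can be sketched as follows. This is an illustrative Python model of L293D-style drive logic, not firmware from the paper; the two-pin-per-motor convention and the differential-drive turning scheme (wheels spinning in opposite directions) are assumptions.

```python
# Sketch of command -> motor-driver inputs for the two body motors.
# Each motor has two direction inputs (IN1/IN2 style, as on an L293D):
# (1, 0) = forward, (0, 1) = reverse, (0, 0) = stop. Hypothetical mapping.

MOTOR_STATES = {
    "forward": ((1, 0), (1, 0)),   # both wheels forward
    "reverse": ((0, 1), (0, 1)),   # both wheels backward
    "left":    ((0, 1), (1, 0)),   # left wheel back, right forward
    "right":   ((1, 0), (0, 1)),   # left forward, right back
    "stop":    ((0, 0), (0, 0)),   # both stopped
}

def drive(command):
    """Return (left_motor_pins, right_motor_pins) for a command.

    Unknown commands stop the robot, which is the safe default.
    """
    return MOTOR_STATES.get(command, MOTOR_STATES["stop"])
```

On the real hardware the returned pin states would be written to the microcontroller's GPIO lines feeding the motor driver IC.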

Fig. 3: Block diagram of robot control using the Android app. The Android application device communicates over a Wi-Fi module with the microcontroller, which drives the arm/gripper motor and body motor through a motor driver; a battery and regulator supply power.

Fig. 4: Block diagram of the robot controlled using gesture, speech and the Android app. The PC (running gesture and speech recognition) connects through an RF modem, and the Android app through a Wi-Fi modem, to the ARM7 microcontroller on the robot, which is also interfaced to a PIR sensor, an LCD and DC motor drivers.

HARDWARE IMPLEMENTATION

1. ARM7 LPC2138
The NXP (founded by Philips) LPC2138 is an ARM7TDMI-S based high-performance 32-bit RISC microcontroller with Thumb extensions. It provides 512 KB of on-chip Flash ROM with In-System Programming (ISP) and In-Application Programming (IAP), two 8-channel 10-bit ADCs, 32 KB of RAM, a vectored interrupt controller, two UARTs (one with a full modem interface), two I2C serial interfaces, two SPI serial interfaces, three 32-bit timers, a watchdog timer, a real-time clock with optional battery backup, a brown-out detect circuit and general-purpose I/O pins. The CPU clock runs up to 60 MHz, with an on-chip crystal oscillator and on-chip PLL.

2. WI-FI MODULE
A Wireless Local Area Network (WLAN) is built on the IEEE 802.11 standards. Wi-Fi is the name of a popular wireless networking technology that uses radio waves to provide high-speed wireless Internet and network connections. Wi-Fi devices can transmit and receive radio waves: a computer's wireless adaptor translates data into a radio signal and transmits it using an antenna, while a wireless router receives and decodes the signal and sends the information to the Internet over its Ethernet connection.

3. RF MODULE
An RF (radio frequency) module is a (usually) small electronic device used to transmit and/or receive radio signals between two devices. In an embedded system it is often desirable to communicate with another device wirelessly.

4. LCD DISPLAY
A liquid-crystal display (LCD) is a flat-panel display or other electronically modulated optical device that uses the light-modulating properties of liquid crystals.

5. MOTOR DRIVER IC
A motor driver IC is an integrated circuit chip usually used to control motors in autonomous robots, acting as an interface between the microprocessor and the motors of the robot.

6. DC MOTOR
A DC motor is any of a class of rotary electrical machines that converts direct-current electrical energy into mechanical energy.
The most common types rely on the forces produced by magnetic fields.

7. POWER SUPPLY
A power supply is an electronic device that supplies electric energy to an electrical load. Its primary function is to convert one form of electrical energy to another; as a result, power supplies are sometimes referred to as electric power converters.

8. PC
The PC runs the MATLAB programs for gesture and speech recognition. It is interfaced to the microcontroller using an RF modem.

9. ANDROID APP
The app runs on any Android-based mobile phone or tablet.
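Since the PC talks to the microcontroller over the RF modem, each command has to travel as bytes on a serial link where corruption is possible. The paper does not describe the wire format; the Python sketch below shows one plausible framing, with a start byte and a checksum so that damaged frames can be rejected on the ARM7 side. All frame constants are assumptions.

```python
# Hypothetical RF frame: a 0x7E start byte, one command byte, and a
# checksum byte (two's-complement of the byte sum) so the receiver
# can detect corruption on the radio link. Not the paper's protocol.

START = 0x7E

def encode_frame(cmd):
    """Pack a single command byte into a framed 3-byte message."""
    checksum = (-(START + cmd)) & 0xFF
    return bytes([START, cmd, checksum])

def decode_frame(frame):
    """Return the command byte, or None if the frame is invalid."""
    if len(frame) != 3 or frame[0] != START:
        return None
    if (frame[0] + frame[1] + frame[2]) & 0xFF != 0:
        return None
    return frame[1]
```

A valid frame's bytes sum to zero modulo 256, so the receiver's check is a single addition rather than a recomputation of the checksum.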

First, an image of the gesture is captured by the camera on a laptop or PC. Useful information is then extracted from the image by performing image processing on the gesture, namely image segmentation and image morphology. When the gesture is interpreted by the algorithm, the corresponding command signal is generated and transmitted wirelessly to the robotic system, which moves in the desired direction. In our system the robot is controlled from a remote location, in addition to remote monitoring. The robot can be controlled by the following methods:

A. Android-Based GUI Control
An Android app is developed through which the user can move the robot forward, reverse, left and right. The microcontroller is interfaced to the Android app via a Wi-Fi modem, receives all the inputs and controls the robot accordingly.

B. Android-Based Voice Recognition Control
A MATLAB program is used to move the robot forward, reverse, left and right using voice recognition. The app recognizes the commands and sends them to the microcontroller, which is interfaced to the PC via an RF modem, receives all the inputs and controls the robot accordingly.

C. MATLAB-Based Hand Gesture Recognition
A MATLAB program for hand gesture recognition is used to move the robot forward, reverse, left and right. For this, the SURF algorithm is used. The MATLAB code recognizes the hand gesture and sends the corresponding movement commands to the microcontroller over an RF modem; the microcontroller receives all the inputs and controls the robot accordingly. The robot also carries a PIR (passive infrared) sensor, interfaced to detect the presence of living humans. If any abnormal condition occurs while sensing, it is signalled through the LCD and buzzer unit in the control section.
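The SURF step in method C works by extracting interest-point descriptors from the live frame and matching them against descriptors of stored reference gestures. As an illustration of the matching stage only, here is a pure-Python sketch of nearest-neighbour descriptor matching with a ratio test (commonly used with SURF and SIFT features); the tiny 3-element descriptors are made up for the example, whereas real SURF descriptors are 64-dimensional vectors computed by the library.

```python
# Sketch of the descriptor-matching stage used after SURF feature
# extraction: for each query descriptor, find its two nearest reference
# descriptors and accept the match only if the nearest is clearly
# closer than the second nearest (the ratio test). Toy 3-vectors.
import math

def euclidean(a, b):
    """Euclidean distance between two descriptors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ratio_test_matches(query, reference, ratio=0.7):
    """Return index pairs (qi, ri) of confident query->reference matches."""
    matches = []
    for qi, qd in enumerate(query):
        dists = sorted(
            (euclidean(qd, rd), ri) for ri, rd in enumerate(reference)
        )
        # Ambiguous matches (two reference descriptors about equally
        # close) are discarded rather than guessed.
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((qi, dists[0][1]))
    return matches
```

A gesture would then be classified as the stored reference image that accumulates the most surviving matches against the live frame.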
Once a human is detected, identity can be confirmed by video surveillance.

I. HARDWARE REQUIRED:
1. ARM7
2. LCD
3. L293D (DC motor driver), 12 V DC motor
4. Android app (developed in Basic for Android)

II. SOFTWARE REQUIRED:
1. KEIL compiler
2. Flash Magic
3. Basic for Android (app development)
4. MATLAB

III. ADVANTAGES
1. Efficient design for control and monitoring
2. Highly flexible
3. Quick response time

4. Fully automated system, thus reducing human effort
5. Robust system
6. Less corruption

IV. APPLICATION AREAS
1. This robot is mostly used in defense for spying on the enemy while avoiding obstacles.
2. The main objective of developing this application is to provide the user with data security.
3. Only the authorized user and administrator can access the application.
4. A person at a remote place can comfortably control the motion of the robotic arm using voice and gesture recognition.
5. In industrial areas where workers cannot handle harmful equipment.
6. In industrial areas, to monitor workers through the camera.
7. In houses, to help paralysed persons identify and handle objects.
8. In mining industries.

CONCLUSION
The gesture-controlled robot system gives an alternative way of controlling robots. Gesture control, being a more natural way of controlling devices, makes the control of robots more efficient and easy. The robot is controlled by voice commands as well; it can understand any human voice and is not dependent on a single speaker, though it is sensitive to surrounding noise. We also proposed a simple algorithm for hand gesture recognition, and plan to implement a robot which can be controlled wirelessly by an Android device.

FUTURE OF THIS PROJECT
1. Implement uplink communication from the robots to the GUI application through the base station.
2. Control up to 10 robots from the GUI application through the base station.
3. Secure the wireless channel using encryption and decryption.
4. Provide larger onboard bandwidth, since a video streaming service is desired.
5. Involve more gestures for a greater variety of directions.
6. Increase the speed of response through faster computational software.
7. Improve the arm for handling greater obstacle-tackling loads.