Intelligent Tactical Robotics


Samana Jafri 1, Abbas Zair Naqvi 2, Manish Singh 3, Akhilesh Thorat 4
1 Dept. of Electronics and Telecommunication, M.H. Saboo Siddik College of Engineering, Mumbai University
2,3,4 BE - Electronics and Telecommunication, M.H. Saboo Siddik College of Engineering, Mumbai University

Abstract - Robots can help us accomplish tasks that are mundane, repetitive or mechanized. Robotics is a popular field of science and technology because of its immense potential for getting jobs done efficiently and precisely, and researchers and engineers continue to work hard to explore that potential to the fullest. It has long been recognized that several tasks can be performed more efficiently by multiple robots in coordination, which is why collaborative robotics is where many innovations are taking place. This work proposes a concept for initiating communication between two robots using the master-slave approach, thereby implementing collaborative robotics. The concept combines various algorithms and technologies that execute different tasks in coordination, and an existing standard algorithm has been modified in a specified way to carry out the desired task.

Index Terms - Robot prototype, Master-Slave, Arduino, Object recognition, X-Bee, Robotic Arm.

I. INTRODUCTION

A. Robotics
Robotics is a branch developed to ease human life by performing tasks faster and more effectively than humans can. Collaborative robots are robots designed to work in collaboration with a human or with another robot. This extension of the primary field of robotics uses coordination between machines and humans to complete a specified task successfully.
Robotic teams and collaborative robotics have several advantages over single-robot operations. Some problems inherently require multiple robots, and thus the cooperation of several working agents. Other tasks must be completed within a specified time frame, where a cooperative approach finishes the task more efficiently. Collaborative robots are also more efficient and more resistant to failure than single robots [1]. There is extensive literature on control and coordination for multiple mobile robots and on applications such as exploration, surveillance, rescue, mapping of environments and transportation of objects. Designing and implementing autonomous robots programmed to perform various tasks collaboratively is therefore a task worth completing. Our system uses the Master-Slave concept, in which the Slave receives commands from the Master and completes the assigned task.

@IJRTER-2016, All Rights Reserved

B. Object Recognition
Object recognition is the feature the Master robot uses to distinguish between an obstacle and a target. The target object is marked with a distinct pattern, or may have a distinct shape and colour, and the Master applies object/pattern recognition to decide whether an object is an obstacle or the target. This decision determines whether the Slave should clear the object (if it is an obstacle) or grab it (if it is the target). Many techniques exist for recognizing patterns, including edge detection, corner detection and angle detection; of these, the system uses edge detection. An edge is the boundary between an object and the background, and it also indicates the boundary between overlapping objects. If the edges in an image can be identified accurately, all of the objects can be located and basic properties such as area, perimeter and shape can be measured. Since computer vision involves the detection, identification and classification of objects in a captured image, edge detection is an essential tool [2]. Of the multiple edge detection techniques available, Canny edge detection is best suited to our system and has therefore been used. A typical edge-detected image is shown in Fig. 1.

Fig. 1 Example of an edge-detected object

C. Wireless Communication
Communication between the Master and the computer, and between the Master and the Slave, is carried over a wireless 2.4 GHz RF module. One module that satisfies our requirements is Digi International's XBee. XBees are a family of low-power, low-cost, low-bandwidth wireless communication modules that use the ZigBee communication protocol.
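The edge-detection stage described in section B can be illustrated with a minimal sketch. The paper uses the Canny detector in MATLAB; the pure-Python code below implements only the first Canny stage (Sobel gradient magnitude plus a threshold) on a tiny synthetic grayscale image, just to show how edge pixels separate an object from the background.

```python
# Minimal gradient-based edge detection: the Sobel-magnitude stage that
# Canny builds on. Images are plain lists of lists of 0-255 intensities;
# a real implementation would use MATLAB's edge() or an OpenCV call.

def sobel_edges(img, threshold=100):
    """Return a binary edge map: 1 where gradient magnitude exceeds threshold."""
    h, w = len(img), len(img[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Sobel kernels approximate horizontal/vertical intensity gradients.
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            if (gx*gx + gy*gy) ** 0.5 > threshold:
                edges[y][x] = 1
    return edges

# A 6x6 test image: dark background with a bright 2x2 block in the middle.
img = [[0] * 6 for _ in range(6)]
for y in (2, 3):
    for x in (2, 3):
        img[y][x] = 255
edge_map = sobel_edges(img)  # edge pixels ring the bright block
```

The full Canny detector adds Gaussian smoothing, non-maximum suppression and hysteresis thresholding on top of this gradient step, which is why it gives cleaner single-pixel edges than raw Sobel.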
XBees are used instead of Bluetooth or infrared communication because the latter are less reliable and have a shorter range. Our system requires a highly reliable and efficient wireless link, and that need is satisfied by the XBee. Fig. 2 shows an image of XBee modules.
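The paper does not specify the frame format carried over the XBee link, so the sketch below assumes a simple hypothetical one-byte command protocol (start byte, command byte, checksum) of the kind commonly used on serial radio links between a base station and a microcontroller; the command names and byte values are illustrative, not taken from the paper.

```python
# Hypothetical framing for the base-station -> Master XBee serial link.
# Frame layout (assumed): start byte 0x7E, one command byte, one checksum byte.

START = 0x7E
COMMANDS = {"TARGET": 0x01, "OBSTACLE": 0x02, "HALT": 0x03}

def encode_frame(command):
    """Build a 3-byte frame for a named command."""
    cmd = COMMANDS[command]
    checksum = (START + cmd) & 0xFF  # trivial additive checksum
    return bytes([START, cmd, checksum])

def decode_frame(frame):
    """Return the command name, or None if the frame is malformed."""
    if len(frame) != 3 or frame[0] != START:
        return None
    if (frame[0] + frame[1]) & 0xFF != frame[2]:
        return None  # corrupted in transit
    for name, code in COMMANDS.items():
        if code == frame[1]:
            return name
    return None
```

On the Arduino side the Master would read these three bytes from the XBee's serial port and perform the same checksum test before acting on the command.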

Fig. 2 XBee

D. Programming Using Arduino
To interface the hardware with software signals, wireless modules and feedback networks, ATmega microcontrollers programmed in the Arduino environment are used. Arduino boards are chosen because they are functional and open source, with the added advantage of allowing standalone circuits to be made. The Arduino Uno, for example, provides 6 analog inputs and 14 digital I/O pins (6 of which support PWM output), including hardware serial transmit and receive pins. The integration of the XBees, and the communication between them, is made possible by the Arduino boards.

E. Distance Measurement and Sensing
To maintain a constant distance between the Master and the Slave, and to detect the target or an obstacle, ultrasonic sensors are used. Ultrasonic sensors use sound waves rather than light, making them well suited to stable detection of uneven surfaces, liquids, clear objects and objects in dirty environments, and to applications requiring precise measurement between stationary and moving objects. The HC-SR04 ultrasonic ranging module provides non-contact measurement from 2 cm to 400 cm; the module includes an ultrasonic transmitter, a receiver and a control circuit. Ultrasonic sensors are attached to the front of both the Master and the Slave: the Master uses them to sense its surroundings and halt before an object, while the Slave uses them to maintain a constant distance behind the Master.

F. Live Image Feed
To continuously monitor the environment and send a live feed to the base station, First Person View (FPV) is used. FPV is a method for viewing live video; the FPV link in our system operates at 1.2 GHz, with a CMOS camera acting as the transmitter on the Master and a receiver at the ground station, as shown in Fig. 3. The FPV camera is mounted at the front of the Master and, via an Arduino and XBee, continuously transmits data to the base station, where the captured image is compared with the stored database.
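The HC-SR04 ranging in section E reduces to a simple conversion: the sensor reports the round-trip echo time of an ultrasonic burst, and dividing by twice the speed of sound gives the one-way distance. The sketch below shows that conversion plus an assumed dead-band rule for how the Slave might hold a fixed distance behind the Master; the 20 cm set-point and 2 cm band are illustrative values, not taken from the paper.

```python
# HC-SR04 echo-time to distance conversion, plus a simple follow rule.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # ~343 m/s at room temperature

def echo_to_cm(echo_us):
    """Convert an echo pulse width in microseconds to distance in cm."""
    # The pulse covers the round trip, so halve it for the one-way distance.
    return (echo_us * SPEED_OF_SOUND_CM_PER_US) / 2

def follow_action(distance_cm, setpoint_cm=20.0, band_cm=2.0):
    """Assumed dead-band logic for the Slave holding station behind the Master."""
    if distance_cm > setpoint_cm + band_cm:
        return "FORWARD"   # fallen behind: close the gap
    if distance_cm < setpoint_cm - band_cm:
        return "REVERSE"   # too close: back off
    return "HOLD"          # within tolerance
```

An echo of about 1166 microseconds therefore corresponds to roughly 20 cm, the assumed set-point; on the Arduino the pulse width would come from `pulseIn()` on the sensor's echo pin.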

Fig. 3 First person view

G. Robotic Arm
To grab the target object or to clear an obstacle from the Master's path, the Slave is equipped with a robotic arm. The arm uses servo motors at both the shoulder and the grabber. The Master uses its FPV camera to scan the surrounding environment and compare the scanned objects with the stored database. When the object encountered is an obstacle, the Master orders the Slave to clear the path with the robotic arm; when it is the target, the Master orders the Slave to grab the object and place it on the Slave's chassis. The arm has 360-degree base movement, which gives it access to a greater area for grabbing or clearing objects, and the base motor's torque rating of 5 kg-cm allows the arm to support multiple axes. Fig. 4 shows the robotic arm.

Fig. 4 Robotic arm

H. Robot Chassis Design
The chassis designs of both the Master and the Slave are of prime importance: the robots should be agile enough to take quick turns and at the same time strong enough to carry the weight of the target, so the material, shape, size and weight must all be taken into serious consideration. The Master and Slave use a circular design, which allows the robots to maneuver freely in any direction. The material used is acrylic, which is light yet strong enough to carry the target load; a thickness of 5 mm keeps it light while still able to bear the load. The heavier the chassis, the greater the chance of problems with the motor driver, so careful weight considerations were made. To reduce weight and size while keeping enough space for all components, a layered approach was used in the Master's design. Fig. 6(a) and Fig. 6(b) show the Master and the Slave.

Fig. 6(a) Chassis design of Master using layered approach
Fig. 6(b) Slave chassis with arm attached

II. OBJECTIVES FORMULATED
The following were the objectives, or system development steps:
1. Study the XBee modules and various image processing techniques.
2. Establish a connection between the wireless (XBee) modules.
3. Write the MATLAB code for object detection with multiple parameters.
4. Program the Arduino controllers to establish communication between the Master and the Slave.
5. Design the robot chassis with proper size, weight and shape considerations.
6. Design the robotic arm, taking into account the weight and the material to be used.
7. Write the MATLAB code for robotic arm movement and integration.
8. Integrate all the hardware components.
9. Perform initial testing of the system.
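Positioning the arm's shoulder and grabber servos (section G) comes down to mapping a joint angle onto a pulse width. Standard hobby servos driven at 50 Hz map roughly 1000-2000 microsecond pulses onto 0-180 degrees; the sketch below shows that mapping, with the pulse range and the example pose angles being assumptions rather than values from the paper.

```python
# Hobby-servo positioning for the arm joints: joint angle -> pulse width.
# The 1000-2000 us range is the typical hobby-servo convention; the
# continuously rotating base would be driven differently.

MIN_US, MAX_US = 1000, 2000  # assumed pulse range for 0-180 degrees

def angle_to_pulse_us(angle_deg):
    """Map a joint angle in [0, 180] degrees to a pulse width in microseconds."""
    angle_deg = max(0.0, min(180.0, angle_deg))  # clamp to servo travel
    return MIN_US + (MAX_US - MIN_US) * angle_deg / 180.0

# Example (hypothetical) grab pose: shoulder at 45 deg, gripper nearly closed.
shoulder_us = angle_to_pulse_us(45)
gripper_us = angle_to_pulse_us(170)
```

On the Arduino the same mapping is what `Servo.write()` performs internally, and `Servo.writeMicroseconds()` exposes the pulse width directly.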

10. Rectify the errors encountered and remove software bugs.
11. Carry out the final implementation of the desired collaborative robotic system.

III. METHODOLOGY
To carry out the steps above, the following methodology was used:
1. The proposed system comprises one Master robot that guides a Slave robot to the target, leading to completion of the task.
2. The Master's task is to identify the target and command the Slave either to clear the obstacle or to pick up the target with the robotic arm.
3. The Master has an FPV camera attached to it; it continuously monitors the environment and scans for potential targets.
4. Once an object is scanned, its image is sent to the base station for comparison against the target image stored in the database.
5. The image acquired by the video transceiver is sent to the base station, where it is processed in MATLAB.
6. Based on the results of the image processing, MATLAB generates a value that is sent to the Master via XBee.
7. MATLAB is interfaced with an XBee, which sends the command to the Master's own XBee.
8. If the object is the target, the Master commands the Slave to pick it up with the robotic arm; if it is an obstacle, the Master commands the Slave to move it aside with the arm. A flowchart of this process is shown in Fig. 7.

Fig. 7 Flowchart of Master-Slave task

9. All communication between the base station, the Master and the Slave is done via XBees.
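Steps 5-8 of the methodology amount to a two-stage decision: the base station turns the image comparison into a verdict, and the Master turns that verdict into an order for the Slave. A minimal sketch, assuming a similarity-score threshold and command names that the paper does not specify:

```python
# Decision chain: base station classifies the scanned object, Master relays
# an arm command to the Slave. The 0.8 threshold and command names are
# illustrative assumptions; the paper's MATLAB comparison is not published.

MATCH_THRESHOLD = 0.8  # assumed cutoff on the image-match score

def classify(similarity):
    """Base station: label the scanned object from its match score (0..1)."""
    return "TARGET" if similarity >= MATCH_THRESHOLD else "OBSTACLE"

def master_command(label):
    """Master: translate the base station's verdict into an order for the Slave."""
    return {"TARGET": "GRAB_AND_LOAD", "OBSTACLE": "SWEEP_ASIDE"}[label]

# e.g. a strong match leads to a grab order, a weak match to path clearing
order_for_strong_match = master_command(classify(0.93))
order_for_weak_match = master_command(classify(0.40))
```

In the actual system the verdict would travel as a byte over the MATLAB-to-XBee serial link, and the Master would forward the resulting order to the Slave over the second XBee hop.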

10. The Slave is in direct contact with the Master. The base station does not instruct the Slave; only the Master can do that.
11. Once the task is completed, the Master and the Slave return to their original starting points.
12. The individual working mechanisms of the Master and the Slave are shown in Fig. 8 and Fig. 9 respectively.

Fig. 8 Working diagram of Master
Fig. 9 Working mechanism of Slave

IV. RESULTS
The collaborative robotic system explained above integrates various hardware and software technologies, and careful planning and execution were required to achieve the desired results. The Master and Slave work in synchronization, using the ultrasonic sensors as range and object detectors and communicating with each other and with the base station via XBees. An object is recognized as a target or an obstacle through image processing, which compares images using parameters such as shape and colour. The Slave carries out the task ordered by the Master, using its robotic arm to do so. After successful completion of the task, the Master and Slave return to the base point and await further instructions from the base station. Arduino boards assist the whole process by interfacing the XBees and running the motors, and all software integration takes place in MATLAB.

V. CONCLUSION
In this work, honest efforts have been made to develop a system of collaborative robots with varied functionalities that carries out tasks with high precision and efficiency. The Master's ability to make decisions and command the Slave makes the system an intelligent one. This work attempts to achieve better results than pre-existing technologies while leaving scope for useful future applications, for example in defense and security, mapping of unexplored territory, or rescue operations. As a prototype, the system is bound to have limitations and inaccurate results; it is, however, always open to improvement and can evolve into a much more advanced and enhanced version. Digital mapping, a multi-axis arm structure, and an increased number of co-bots are just a few of the ways in which the system could be improved in the near future. Collaborative robotics is an interesting alternative to classical approaches to robotics because it lets us exploit properties of problem solving by social insects: flexibility, robustness, decentralization and self-organization.

REFERENCES
1. Alejandro R. Mosteo and Luis Montano, "A survey of multi-robot task allocation," Aragon Institute of Engineering Research (I3A), University of Zaragoza, Zaragoza, Spain, pp. 524-533, 2001.
2. Ehsan Nadernejad, "Edge Detection Techniques: Evaluations and Comparisons," Applied Mathematical Sciences, vol. 2, no. 31, pp. 1507-1520, 2008.
3. A. Toet, "Image fusion by a ratio of low-pass pyramid," Pattern Recognition Letters, vol. 9, no. 4, pp. 245-253, 1989.
4. G.-D. Li, S. Masuda, D. Yamaguchi and M. Nagai, "The optimal GNN-PID control system using particle swarm optimization algorithm," International Journal of Innovative Computing, Information and Control, vol. 5, no. 10(b), pp. 3457-3470, 2009.
5. C. M. Sheela Rani, V. Vijaya Kumar and B. V. Ramana Reddy, "Improved Block Based Feature Level Image Fusion Technique Using Multiwavelet with Neural Network," International Journal of Soft Computing and Engineering (IJSCE), ISSN: 2231-2307, vol. 2, no. 4, September 2012.
6. Maziyar Khosravi and Mazaheri Amin, "Block feature based image fusion using multi wavelet transforms," International Journal of Engineering Science and Technology (IJEST), ISSN: 0975-5462, vol. 3, no. 8, p. 6644, August 2011.