VOICE CONTROL BASED PROSTHETIC HUMAN ARM

Ujwal R 1, Rakshith Narun 2, Harshell Surana 3, Naga Surya S 4, Ch Preetham Dheeraj 5

1,2,3,4,5 Student, Department of Electronics and Communication Engineering, PES University, Bangalore, India

---------------------------------------------------------------------***---------------------------------------------------------------------

Abstract - In the present world, the number of amputee cases rises every year, and this need has to be addressed. Many different types of medically certified prosthetic arms are currently on the market, but they are either too expensive or do not satisfy the needs of patients to the fullest. In this paper, we provide a technological advancement for the arm by enabling voice control, while also cutting down the cost of the electronic and mechanical equipment required to build a working prototype of the prosthetic arm. Our prototype resembles the functional structure of the biological human arm: most of the complex movements of the arm and hand are made possible by achieving a near-perfect replication of the movements of the biological arm. The joints of the fingers on the prosthetic arm have been modelled on biological human fingers to replicate the actions typically performed by any human finger. The prototype presented here does not rely on biological signals from the nerve endings of the residual arm, so it can easily replace prosthetic arms that depend on the phantom limb, since the brain loses the phantom-limb sensation after a period of about six to eight months. In this paper, we specifically tackle this problem and additionally provide voice control commands for the robotic prosthetic arm, which we modelled using economical devices and equipment to cut down the heavy cost of affording a prosthetic arm.

Key Words: Prosthetic arm, Voice control, Phantom limb, Nerve endings, Residual arm.

1. INTRODUCTION

The prototype that is designed consists of 6 degrees of freedom from shoulder to wrist and 6 degrees of freedom on the palm along with the fingers: 2 degrees of freedom on the shoulder joint, 2 on the elbow and 2 on the wrist, which gives the prosthetic arm an overall movement similar to that of a human arm.
Moving on to the fingers, the thumb has 2 degrees of freedom and the other fingers have 1 degree of freedom each. The fingers are made of 3 joints; the mechanism is explained further in Section 2.5.

Brandi House et al. [3] presented in their work the first working model of a robotic arm performing a simple task, such as moving a candy from an initial position to a destination, using voice commands. Modelling neural networks and developing algorithms to recognize human speech and use it to control robots was first shown there. S. Mohamed Sulaiman et al. [1] described the design of a three-fingered robotic upper limb that takes human voice commands as inputs; they also configured the elbow joint. R. Aswin Balaji et al. [2] proposed a method to simplify robot programming, with the goal of making the complex technical languages used for robot programming more intuitive, easier, and faster to grasp. Abhinav Salim et al. [4] presented the design of a working robotic arm that takes in voice/speech signals using a speech processing unit and a microcontroller; they pre-coded the necessary motor movements to perform different tasks, and a speech recognition module was trained to recognize inputs such as "move forward" and execute them in real time. Young June Shin et al. [5] demonstrated high performance in power and precision of a humanoid arm/hand using three actuation principles: an electromagnetic joint lock mechanism, twisting actuation, and distributed actuation.

We have taken inspiration from the aforementioned works and designed a prosthetic arm that can do most of the typical tasks required by an amputee. It can further be programmed to do things that a human arm cannot do.

2. TECHNICAL WORKING

2.1 Block Diagram

The user controls the prosthetic arm through human voice commands. The voice commands are recorded with a microphone connected to the embedded system (Raspberry Pi). The Raspberry Pi (controller) converts the voice commands into text using Google's API and compares the text with the pre-coded commands, for example "pick up" or "wave". The Raspberry Pi then sends position data to the servo motors based on the voice command it received. If the voice is not audible or is unclear, there is another option, i.e., remote control. The servo motors give their current position as feedback to the embedded system, and a distance sensor detects nearby objects in order to pick them up or drop them. This is shown as a block diagram in Fig -1.
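To make the data flow of Fig -1 concrete, the following is a minimal sketch of the command loop, assuming Python on the Raspberry Pi with the speech_recognition package for the Google speech-to-text call; the paper does not specify these libraries, and move_to_pose() is a hypothetical placeholder for the servo-control layer described in Section 2.2.

```python
# Minimal command-loop sketch (illustrative only): microphone -> Google
# speech-to-text -> command lookup -> servo pose dispatch.
# Assumes the "speech_recognition" package; move_to_pose() is a hypothetical
# stand-in for the servo layer described in Section 2.2.
import speech_recognition as sr

# Pre-coded commands mapped to joint poses in degrees (placeholder values).
COMMANDS = {
    "pick up": {"shoulder_pitch": 40, "elbow_pitch": 70, "wrist_roll": 0},
    "wave":    {"shoulder_pitch": 90, "elbow_roll": 30},
    "release": {"fingers": 0},
}

def move_to_pose(pose):
    """Placeholder for the servo-control layer (see the Section 2.2 sketch)."""
    print("moving to", pose)

def listen_once(recognizer, microphone):
    """Record one utterance and return lower-case text, or None on failure."""
    with microphone as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        return recognizer.recognize_google(audio).lower()
    except (sr.UnknownValueError, sr.RequestError):
        return None  # unclear audio or no network: fall back to remote control

def main():
    recognizer, microphone = sr.Recognizer(), sr.Microphone()
    while True:
        text = listen_once(recognizer, microphone)
        if text and text in COMMANDS:
            move_to_pose(COMMANDS[text])

if __name__ == "__main__":
    main()
```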

Fig -1: Flow Diagram

2.2 Structure and Working

The prosthetic arm prototype, modelled using 6 heavy and 6 mini servo motors, is displayed in Fig -2. Two servo motors control the two degrees of freedom of the shoulder: pitch (the up/down movement) and roll (the forward/backward movement). Two servo motors control the motion of the elbow joint and two control the wrist movement. The shoulder joint experiences the most torque among the six servos because it carries the longest support connecting the motors, which increases the torque on the shoulder joint; this model therefore uses a 35 kg-cm torque servo motor at the shoulder. The elbow joint experiences less torque than the shoulder, so the model uses 20 kg-cm servo motors at the elbow. Only a small amount of torque is experienced at the wrist joint, which needs just 10 kg-cm servo motors to provide stable movement. All six servo motors are controlled individually to complete the motion of the arm. At the palm of the arm there is an ultrasonic sensor to detect the presence of an object, which further helps the arm move in a smooth and uninterrupted way. The working model of the finger joint movement and the embedded system of the prosthetic arm are discussed further below.

Fig -2: Prosthetic Arm Mechanical Structure

A set of predefined motions linked to the voice commands is available on the arm. New commands and motions can be defined for an array of use cases based on the user's needs. A distance sensor is placed on the palm of the arm to detect nearby objects and help the arm pick up an object if necessary.
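The paper does not name a servo driver library; as a minimal sketch, the six arm joints of Fig -2 could be driven from the Raspberry Pi with the pigpio daemon, mapping a requested joint angle onto the usual 500-2500 µs hobby-servo pulse range. The GPIO pin numbers and joint limits below are placeholders, not values from the paper.

```python
# Joint-servo drive sketch (illustrative only). Assumes the pigpio daemon on
# the Raspberry Pi and hobby servos that accept 500-2500 us pulses at 50 Hz.
# GPIO pins and angle limits are placeholders.
import pigpio

# One entry per arm servo: (GPIO pin, minimum angle, maximum angle).
JOINTS = {
    "shoulder_pitch": (17, 0, 180),   # 35 kg-cm class servo
    "shoulder_roll":  (18, 0, 180),
    "elbow_pitch":    (22, 0, 150),   # 20 kg-cm class servos
    "elbow_roll":     (23, 0, 180),
    "wrist_pitch":    (24, 0, 120),   # 10 kg-cm class servos
    "wrist_roll":     (25, 0, 180),
}

pi = pigpio.pi()  # requires the pigpio daemon ("sudo pigpiod") to be running

def angle_to_pulse(angle_deg):
    """Map 0-180 degrees onto the usual 500-2500 us servo pulse range."""
    return 500 + (angle_deg / 180.0) * 2000

def set_joint(name, angle_deg):
    """Clamp the request to the joint's limits and command the servo."""
    pin, lo, hi = JOINTS[name]
    angle_deg = max(lo, min(hi, angle_deg))
    pi.set_servo_pulsewidth(pin, int(angle_to_pulse(angle_deg)))

# Example: raise the forearm slightly.
set_joint("elbow_pitch", 45)
```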

2.3 Embedded System

Fig -3: Embedded System Connections

The voice input is taken into the prosthetic arm, i.e., the commands are picked up by the microphone connected to the embedded system. The embedded system consists of a Raspberry Pi, for data processing and control of the arm, and a SIM 1200 module that provides the Raspberry Pi with constant network access. The data processing is done in this block using the data acquired from the microphone and the feedback data obtained from the servo motors. The Raspberry Pi (Fig -4) computes this data and passes on a new set of data points to move the servo motors to the desired location.

2.4 Control Logic

Fig -4: Raspberry Pi

The Raspberry Pi waits for a signal from the user to record the voice commands and starts recording the audio when the signal is given. The Google API is used to convert the speech to text, and the text is compared with the pre-saved commands. The position data is sent to the servos based on the command given by the user. The distance sensor gives feedback to the arm and is used to adjust its motion based on the surroundings. A detailed control flow block diagram is shown in Fig -5.

Test case: if a "Wave" command is given to the system, the arm moves in a way that depicts a wave motion. This is achieved by moving the arm to an upper position by controlling the shoulder and elbow joints and then moving the elbow roll axis clockwise up to 30 degrees and anti-clockwise up to 30 degrees.

Fig -5: Control Flow
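As an illustration of the control logic above, the sketch below codes the "Wave" test case (only the ±30 degree elbow-roll swing comes from the paper) together with a single reading from the palm-mounted ultrasonic sensor. Here set_joint() is a printing stand-in for the servo helper sketched under Section 2.2, and the raised pose, timing, and HC-SR04-style trigger/echo pins are assumptions.

```python
# Sketch of the "Wave" test case and the palm distance check from Section 2.4
# (illustrative only). set_joint() is a printing stand-in for the joint-drive
# helper sketched under Section 2.2; swap it for the real servo call.
# Only the +/-30 degree elbow-roll swing is stated in the paper; the raised
# pose, timing, and TRIG/ECHO pins are assumptions.
import time
import pigpio

TRIG, ECHO = 5, 6   # placeholder GPIO pins (ECHO needs a 5 V -> 3.3 V divider)
pi = pigpio.pi()    # requires the pigpio daemon ("sudo pigpiod")

def set_joint(name, angle_deg):
    """Printing stand-in for the servo helper from the Section 2.2 sketch."""
    print(f"{name} -> {angle_deg} deg")

def wave(cycles=3):
    """Raise the arm, then swing the elbow roll axis +/-30 degrees."""
    set_joint("shoulder_pitch", 90)           # lift the arm (assumed pose)
    set_joint("elbow_pitch", 60)
    for _ in range(cycles):
        set_joint("elbow_roll", 90 + 30)      # clockwise, up to 30 degrees
        time.sleep(0.4)
        set_joint("elbow_roll", 90 - 30)      # anti-clockwise, up to 30 degrees
        time.sleep(0.4)
    set_joint("elbow_roll", 90)               # back to neutral

def object_distance_cm(timeout_s=0.03):
    """One trigger/echo reading from the palm-mounted ultrasonic sensor."""
    pi.set_mode(TRIG, pigpio.OUTPUT)
    pi.set_mode(ECHO, pigpio.INPUT)
    pi.write(TRIG, 1); time.sleep(10e-6); pi.write(TRIG, 0)   # 10 us pulse
    deadline = time.time() + timeout_s
    while pi.read(ECHO) == 0:                 # wait for the echo to start
        if time.time() > deadline:
            return None
    start = time.time()
    while pi.read(ECHO) == 1:                 # wait for the echo to end
        if time.time() > deadline:
            return None
    return (time.time() - start) * 34300 / 2  # speed of sound ~343 m/s

if __name__ == "__main__":
    wave()
    print("object at", object_distance_cm(), "cm")
```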

2.5 Finger Joint Mechanism

Fig -6: Finger Joint Mechanism

In our prototype of the prosthetic arm, we were able to replicate near-perfect movement of the human fingers with a unique mechanism. Fig -6 shows an overview of the mechanism used to achieve the finger movements. There are five fingers on the model, and each finger has its own servo controlling its movement. The thumb has 2 servos controlling its motion, as it has 2 degrees of freedom. The servo in each finger turns clockwise to produce the inward motion of the finger, i.e., towards the palm of the arm. A retraction mechanism retracts the finger back to its original position, i.e., away from the palm, after the servo turns anticlockwise. The second servo on the thumb controls the position of the thumb, changing the angle at which the thumb interacts with the object.

Fig -7: Prosthetic Arm fingers movement inwards

Fig -8: Prosthetic Arm fingers movement outwards

3. CONCLUSIONS

Human robot interaction (HRI) has a wide range of applications, and prosthetics is one of them. In this paper, we have taken up a medically inclined issue and addressed it. This study involves intricate mechanical design of the arm as well as electronic control. A low-cost yet functional prosthetic arm was designed and tested to take human vocal commands as inputs. Alternatively, when the voice isn't audible, it can take remote control inputs as well. The arm can potentially do all the quintessential tasks done by a human arm on a daily basis.

4. FUTURE SCOPE

As an enhancement of our design, we can use our own neural network model for recognizing human voice commands instead of the Google speech-to-text API, which requires an internet connection. Currently we have coded a few basic commands such as "Grab", "Release", "Wave", "move forward", and "move backward". The number of commands can be increased based on the requirement and the memory of the embedded system; for instance, tying shoe laces could eventually be coded. This technique can also be implemented on various other wearable aids (such as leg or walking aids) or in different fields as well.
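One rough way to prototype the offline direction mentioned above, before training a custom neural network, is to keep the same speech_recognition package but swap the Google call for the CMU Sphinx engine, which runs entirely on the Raspberry Pi. This is only an illustration under that assumption (it requires the pocketsphinx package, and its accuracy on the small command set would need to be evaluated); it is not part of the paper's implementation.

```python
# Rough offline-recognition sketch (illustrative, not the paper's method):
# swap the Google API call for CMU Sphinx so recognition runs on-device.
# Assumes the "speech_recognition" and "pocketsphinx" packages are installed.
import speech_recognition as sr

COMMANDS = {"grab", "release", "wave", "move forward", "move backward"}

def recognize_offline():
    """Listen once and return a known command, or None if nothing matched."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)
        audio = recognizer.listen(source)
    try:
        text = recognizer.recognize_sphinx(audio).lower()  # no network needed
    except sr.UnknownValueError:
        return None
    return text if text in COMMANDS else None

if __name__ == "__main__":
    print("heard:", recognize_offline())
```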
ACKNOWLEDGEMENT

We would like to express our heartfelt gratitude to the Department of Electronics and Communication Engineering, PES University, for the guidance and support provided.
REFERENCES

[1] S. Mohamed Sulaiman, M. Anto Bennet, P. L. Annamalai, E. Ramki and M. Mohamed Tazudeen, "Voice Control Prosthetic Arm", Middle-East Journal of Scientific Research.

[2] R. Aswin Balaji and A. Arunraja, "Wireless Voice Controlled Robotics Arm", International Journal of Emerging Technology in Computer Science & Electronics.

[3] Brandi House, Jonathan Malkin and Jeff Bilmes, "The VoiceBot: A Voice Controlled Robot Arm".

[4] Abhinav Salim, Ananthraj C R, Prajin Salprakash and Babu Thomas, "Voice Controlled Robotic Arm", IRJET.

[5] Young June Shin, Soohyun Kim and Kyung-Soo Kim, "Design of Prosthetic Robot Hand with High Performances Based on Novel Actuation Principles", 6th IFAC Symposium on Mechatronic Systems.