Intelligent Assistant System

IJSRD - International Journal for Scientific Research & Development, Vol. 4, Issue 09, 2016, ISSN (online): 2321-0613

Intelligent Assistant System
Patel Harsh (1), Shah Daivik (2), Pawar Pranay (3), Vora Rudri (4), Mr. Minkal Patel (5)
(1,2,3,4) U.G. Student, (5) Professor
(1,2,3,4,5) Department of Electronics & Communication Engineering, Silver Oak College of Engineering and Technology, Ahmedabad

Abstract - The Intelligent Assistant System helps people who cannot walk because of physiological or physical illness, injury or disability. Recent developments promise a wide scope for smart wheelchairs. The proposed system can operate in several modes: 1) user voice commands, 2) hand gestures, and 3) optical eye tracking. The goal of this project is to support the independence of people who are impaired or disabled and of elderly people who cannot move well, allowing them to live with less reliance on others; the proposed model is a feasible option. To assist physically disabled people, a voice-controlled wheelchair was developed in which the chair is driven by the user's spoken commands; speech recognition technology is a key enabler of this kind of human interaction with machines. Another method guides and controls the wheelchair through Human-Computer Interaction (HCI) using Micro-Electro-Mechanical Systems (MEMS) technology, so that the user can steer the wheelchair with hand gestures. In this project we use an optical eye-tracking system to control a powered wheelchair: the user's eye movement is mapped to a screen position, and when the user looks at the appropriate angle the computer input system sends a command to the software based on the angle of rotation of the pupil, i.e. move forward, move left or move right; in all other cases the wheelchair stops. Once the image has been processed, the controller responds according to the result. Eye recognition: in this system the wheelchair is controlled by eye movement together with a central switch. A camera mounted on the wheelchair in front of the user captures the image of the eye and tracks the position of the eye pupil using image processing techniques. According to the position of the user's eye pupil, the motors move in the required direction, for example left, right or forward. An ultrasonic sensor mounted on the front of the wheelchair detects static or moving obstacles and stops the wheelchair automatically for safety. A central switch is also mounted on the wheelchair for emergencies, so the chair can be stopped if someone calls the user or needs the user's attention. The result is an independent and cost-effective wheelchair system, controlled by a Raspberry Pi board.
Key words: Eye Detection, Wheelchair, Raspberry Pi, Physically Handicapped

I. INTRODUCTION
The eye-movement-controlled wheelchair is intended to empower completely paralysed patients and the elderly and make their lives more accessible. Mobility has become essential for a better lifestyle, and loss of mobility due to injury is usually accompanied by a loss of self-confidence. Designing a system that gives such disabled people independent mobility is the aim of this project.
Statistics show that 43 million people are disabled, about 17% of a population of 250 million, i.e. almost 1 out of 5 persons. Of spinal-cord-injured individuals, 52% are considered paraplegic and 47% quadriplegic. Quadriplegics are limited in their motion and need some device to drive their wheelchair without the assistance of others. People who cannot walk and use a manual wheelchair spend a great amount of energy turning the wheels by physical strength; a powered chair would save that energy and leave their hands and arms free for other activities. Several eye-based techniques are already used for controlling a wheelchair, for example EOG-, ECG- and EEG-based methods and eyeball-sensing methods. These locate the eye pupil from voltage variations, but different users produce different output voltages, which can give a wrong pupil location. Voice-activated power wheelchairs work properly only when the user speaks the commands (left, right, forward, back, stop) clearly, and voices from the surroundings may affect the system. Head-movement- and chin-control-based systems cause problems when the user's movement is poor [1][2], and sip-and-puff wheelchairs are not suitable for people with weak breathing. An infrared-reflection-based pupil detection system gives an accurate estimate of the pupil centre. The eye-controlled wheelchair presented here uses a camera to capture the image, and a central switch is provided so that the chair can be stopped when someone needs the user's attention. The camera captures images in real time; the face, eye and eye pupil are detected with minimal delay, the image is analysed, and the resulting commands are sent through the GPIO pins to the motor driver IC to perform the different operations such as left, right, forward and stop. The Open Computer Vision (OpenCV) image processing library is used for face and eye detection [3]. The system works in multiple stages to track the eye pupil centre [4]; detecting single or multiple faces and both eyes is the ultimate goal of this stage. Several algorithms are used to find the exact pupil location and direction, and a Haar-cascade feature detection algorithm is used [5].
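As a concrete illustration of this detection stage, the following is a minimal sketch of Haar-cascade face and eye detection with OpenCV in Python. The paper does not list its code, so the cascade files (the stock XML files bundled with OpenCV), the camera index and the detection parameters are illustrative assumptions, not the authors' exact settings.

    # Minimal sketch: Haar-cascade face and eye detection with OpenCV (illustrative only).
    # Cascade files, camera index and parameters are assumptions, not taken from the paper.
    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    cap = cv2.VideoCapture(0)            # USB web camera
    ret, frame = cap.read()
    if ret:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
        for (x, y, w, h) in faces:
            roi = gray[y:y + h, x:x + w]              # search for eyes inside the face only
            eyes = eye_cascade.detectMultiScale(roi)
            for (ex, ey, ew, eh) in eyes:
                cv2.rectangle(frame, (x + ex, y + ey),
                              (x + ex + ew, y + ey + eh), (0, 255, 0), 2)
    cap.release()

On the wheelchair the same detection would run on every captured frame rather than on a single image.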

The image processing includes face detection, eye detection, colour-to-grey conversion, blurring, edge detection, pattern matching, filtering and noise reduction. Figure 1 shows the architecture of the system. The purpose of this eye-controlled wheelchair is to eliminate the assistance required by the disabled person.
Fig. 1: System Architecture Diagram.
The Raspberry Pi board is the heart of the system and controls the whole operation. The image processing result is sent to the Raspberry Pi, which analyses it and, based on the location of the eye pupil, sends a control signal to the motor driving circuit. This decides whether the motors run clockwise, run anticlockwise or stop. Two individual motors are fitted, one on each wheel. An ultrasonic sensor is mounted on the wheelchair for obstacle detection; if the sensor detects an obstacle near the wheelchair, it reports to the Raspberry Pi, which signals the motor driving circuit to stop the motors.

II. HARDWARE DESIGN MODEL
In this system every component must be given a suitable power supply, and a standard supply should be used for the Raspberry Pi, camera, sensor, motors and switch. Figure 2 shows the functionality of the system.
Fig. 2: Design Model.
In the proposed system the LCD, camera, power circuit and Wi-Fi adapter are connected directly to the Raspberry Pi board, and the board is connected to a web server for remote access [13], for emergency control or to check the status of the wheelchair.
A. Hardware Description:
1) Raspberry Pi board: The Raspberry Pi board is the brain of the system. It runs its own operating system, Raspbian, which is Linux based and compatible with the board. The Raspberry Pi B+ model, based on the ARM architecture, receives real-time data and processes the digital data efficiently even with multiple images. The Raspberry Pi sends commands to the motor driver by enabling its GPIO pins.
2) Web camera: A web camera is used for capturing the image. An HD (high definition) camera could also be used, but it increases the memory requirement, the system may not be able to read the image, and the processing time increases. The UV4L driver is needed for interfacing a camera with the Raspberry Pi board.
3) Ultrasonic sensor: The ultrasonic sensor detects obstacles in the path of the wheelchair. The sensor is connected directly to the Raspberry Pi board; it receives the echo and measures the distance between the wheelchair and the obstacle. If an obstacle is detected very close to the wheelchair, the motors stop driving the wheels. The ultrasonic sensor is a very affordable proximity/distance sensor that is used here mainly for obstacle avoidance.
4) Motor: Two 12 V DC motors are used in the project to demonstrate running the wheelchair in the forward, reverse, left and right directions. An L293D motor driver, which is TTL compatible, interfaces the motors with the Raspberry Pi. The two H-bridges of the L293D can be connected in parallel to increase its current capacity to 2 A.
5) LCD (Liquid Crystal Display): The LCD is very helpful for providing a user interface as well as for debugging. The most common LCD controller is the Hitachi 44780, which provides a simple interface between the controller and the LCD. An LCD is used as a monitor in most electronic projects.
6) Relay: A relay is an electromechanical device consisting of an electromagnet and a number of contact sets. The relay is used in this system to change the direction of the motors very quickly without using a finger.
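To make the motor interface concrete, the following is a minimal sketch of driving one L293D channel from the Raspberry Pi GPIO with the RPi.GPIO library; the pin numbers are assumptions chosen for illustration, not the wiring used by the authors.

    # Minimal sketch: driving one L293D H-bridge channel from Raspberry Pi GPIO.
    # Pin numbers are illustrative assumptions, not the wiring used in the paper.
    import RPi.GPIO as GPIO
    import time

    IN1, IN2, EN1 = 23, 24, 25           # L293D input 1, input 2, enable 1 (BCM numbering)

    GPIO.setmode(GPIO.BCM)
    GPIO.setup([IN1, IN2, EN1], GPIO.OUT)

    def motor_forward():
        GPIO.output(IN1, GPIO.HIGH)
        GPIO.output(IN2, GPIO.LOW)
        GPIO.output(EN1, GPIO.HIGH)      # enable the channel

    def motor_stop():
        GPIO.output(EN1, GPIO.LOW)       # disable the channel, motor coasts to a stop

    motor_forward()
    time.sleep(2)                        # run for two seconds
    motor_stop()
    GPIO.cleanup()

A second channel would be set up in the same way for the other wheel, so that running only the left or only the right motor turns the chair.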
B. Software Description:
1) PuTTY: PuTTY is a free and open-source terminal emulator and network file transfer application. It is used here to connect a desktop computer to the Raspberry Pi board.
2) OpenCV image library: OpenCV is released under a BSD license and is free for both academic and commercial use. It has C++, C, Python and Java interfaces and supports Windows, Linux, Mac OS, iOS and Android. OpenCV was designed for computational efficiency with a strong focus on real-time applications.
3) Python language: Python is an interpreted, object-oriented, high-level programming language with dynamic semantics. Its high-level built-in data structures, combined with dynamic typing and dynamic binding, make it very attractive for rapid application development. Python's easy-to-learn syntax emphasises readability and therefore reduces the cost of program maintenance, and the fast edit-test-debug cycle makes development very effective. MATLAB could also be used for the coding, but it is expensive, its algorithms are proprietary, and MathWorks places restrictions on code portability.
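As a quick check that the camera and the OpenCV installation work together on the Raspberry Pi, a short capture loop such as the following can be used; the device index 0 is an assumption for a single attached USB web camera.

    # Minimal sketch: verify that OpenCV can grab frames from the USB web camera.
    # Device index 0 is an assumption for a single attached camera.
    import cv2

    cap = cv2.VideoCapture(0)
    if not cap.isOpened():
        raise RuntimeError("camera not found: check the UV4L driver / USB connection")

    for _ in range(30):                      # grab a short burst of frames
        ok, frame = cap.read()
        if not ok:
            break
        print("frame size:", frame.shape)    # rows, columns, channels

    cap.release()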

C. Design Methods:
The algorithm locates the eye pupil by image processing. Several stages are used to determine the eye movement: face and eye detection, colour-to-grey conversion, Canny edge detection, the Hough circle transform and eye tracking. First the system receives the images captured by the USB web camera. The first step is to detect the user's face accurately; if there are multiple faces the system reports an error. The system then confines the user's face to a specific area of the image and performs several image processing operations to track the eye movement. Figure 3 shows the proposed method of the implemented system. Canny edge detection is used for edge detection and the Hough circle transform for circle detection. The OpenCV-based image processing library is installed in the Raspberry Pi memory. Figure 4 shows the flow chart of the system.
Fig. 3: Flow Method.
Fig. 4: Conditional Flow Diagram.
The camera starts capturing images. The Haar cascade algorithm is used for face and eye detection; after detecting the face, the system detects the eye within the face and draws a rectangular box over it. The ultimate goal of the system is to detect the eye pupil and define its centre point through several image processing steps. The system crops the eye region of interest and draws all possible circles on that particular area to detect the eyeball. A corner detection method is then applied to find the eye corners, and the average of these two points gives the centre point. The distance between this centre point and the centre of the detected eye circle is measured in a coordinate system: the minimum distance indicates that the eye pupil is to the left, the maximum value indicates that it is to the right, and if there is no movement the eye is in the middle position [6]. When the eye moves left, the left-side motor runs; when the eye moves right, the right-side motor runs; if the eye is in the centre, both motors run and the wheelchair moves forward. If an obstacle is detected the system stops and then moves left or right according to the eye movement. If someone calls the user, the user can stop the wheelchair with the emergency switch. Eye-blinking logic decides the start and stop operations [7].
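The pupil-localisation stage described above can be sketched as follows with OpenCV's Canny and Hough circle functions; the blur kernel, Canny thresholds and Hough parameters are illustrative assumptions rather than the values used by the authors.

    # Minimal sketch: locate the eyeball/pupil inside a cropped eye region
    # using Gaussian blur, Canny edges and the Hough circle transform.
    # All numeric parameters are illustrative assumptions.
    import cv2
    import numpy as np

    def find_pupil_centre(eye_roi_gray):
        blurred = cv2.GaussianBlur(eye_roi_gray, (5, 5), 0)     # suppress noise
        edges = cv2.Canny(blurred, 50, 150)                     # edge map, useful when tuning
                                                                # (HoughCircles runs Canny
                                                                #  internally via param1)
        circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                                   param1=150, param2=20, minRadius=5, maxRadius=30)
        if circles is None:
            return None                                         # no candidate circle found
        x, y, r = np.round(circles[0, 0]).astype(int)           # strongest circle
        return (x, y)                                           # pupil centre in ROI coordinates

The returned centre would then be compared with the midpoint of the detected eye corners to decide left, centre or right, as described above.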
III. IMPLEMENTATION AND SYSTEM DESCRIPTION
The low-power, credit-card-sized Raspberry Pi B+ board is used. It has a built-in 40-pin GPIO header, four USB ports, UART, PWM, an HDMI port and an Ethernet port for internet connection via a wired or wireless link. The Raspberry Pi has 512 MB of RAM, supports up to 32 GB of external memory and is based on the ARM architecture. The camera is connected directly to the Raspberry Pi board and continuously captures images; the distance between the eye and the camera is fixed, roughly 10 to 15 cm. The face and eye are detected by the Haar cascade algorithm and the exact pupil location is found. Several algorithms are then used to compute the centre point as the average of the two corners of the eye, which gives the correct information about eye movement. The motor driving circuit is connected to the Raspberry Pi, a battery supplies power to the motors, and a relay controls the motor driver IC. The system continuously generates directive signals that enable the GPIO pins and perform the required operation: left, right, forward or stop. A central switch is also connected to the system for emergencies, and the ultrasonic sensor provides obstacle detection.
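As an illustration of the obstacle-detection part of this loop, the following measures distance with a trigger/echo ultrasonic sensor and flags a stop when an obstacle comes too close. The paper does not name the exact sensor, so an HC-SR04-style module, the pin numbers and the stop threshold are all assumptions.

    # Minimal sketch: flag a stop when an ultrasonic sensor reports an obstacle
    # closer than a threshold. Sensor type (HC-SR04 style), pins and threshold are assumptions.
    import RPi.GPIO as GPIO
    import time

    TRIG, ECHO = 17, 27                  # BCM pin numbers (illustrative)
    STOP_DISTANCE_CM = 40                # stop if an obstacle is closer than this

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)

    def distance_cm():
        GPIO.output(TRIG, True)          # 10 microsecond trigger pulse
        time.sleep(0.00001)
        GPIO.output(TRIG, False)
        start = stop = time.time()
        while GPIO.input(ECHO) == 0:     # wait for the echo line to go high
            start = time.time()
        while GPIO.input(ECHO) == 1:     # wait for the echo line to go low
            stop = time.time()
        return (stop - start) * 34300 / 2    # speed of sound ~343 m/s, out and back

    if distance_cm() < STOP_DISTANCE_CM:
        print("obstacle ahead: disable the L293D enable pins to stop the motors")
    GPIO.cleanup()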

IV. SYSTEM INSTALLATION
To prepare the system, the Raspbian image is written to a micro SD card with the Win32 Disk Imager software. The bootable memory card is then inserted into the Raspberry Pi board, which boots the Raspbian operating system directly without resetting [19].

V. EYE TRACKING
To track the eye movement a coordinate system is used, which decides the location of the eye centre point [8][9]. Figure 5 shows the eye pupil location on the coordinate graph, where A1 and A2 are the corner points of the eye pupil in the X direction and B1 and B2 are the corner points in the Y direction. The X and Y calibration points represent the direction of eye movement [10]. The eyeball position (A0, B0) is:
A0 = (A1 + A2) / 2    (1.1)
B0 = (B1 + B2) / 2    (1.2)
Fig. 5: Pupil Coordinate Graph.

VI. RESULTS
The wheelchair system receives the image processing result based on the eye pupil centre location and sends the corresponding command to the motor driving circuit, so the wheelchair moves in the direction required by the eye movement. The ultrasonic sensor is used for obstacle detection: it measures the distance between the wheelchair and the obstacle, and when an obstacle is very close the motors stop the wheelchair. The central switch is used to stop the wheelchair in an emergency. Figure 6 shows the face and eye detection and Figure 7 shows the output of the system.
Fig. 6: Frontal & eye detection using OpenCV.
Fig. 7: Output of the system.

VII. CONCLUSION AND FUTURE WORK
A. Conclusion:
In this project we built a wheelchair system that enables a handicapped patient to move the wheelchair independently on their own. In a real-time application the camera, emergency switch and ultrasonic sensor can be used as the application demands. The wheelchair operates with some delay, and dark surroundings affect its performance, since it is hard to track the eye pupil in low light.
B. Future Work:
To make the system more interactive with the patient, additional sensors need to be added, and the delay may be reduced further, towards a second. The operation of the system depends on the eye movement of completely paralysed patients, so the wheelchair should move in all required directions with good response. The wheelchair could also be operated through voice commands or hand gestures.

ACKNOWLEDGMENT
The authors greatly acknowledge the support provided by Professor Mr. Minkal Patel and their families.

REFERENCES
[1] Lee W. O., Lee H. C., Cho C. W., Gwon S. Y., Park K. R., Lee H., Cha J., "Auto-focusing Method for Remote Gaze Tracking Camera," Optical Engineering, 51: 063204-1-063204-15, 2012.
[2] Preethika Britto, Indumathi J., Sudesh Sivarasu, Lazar Mathew, "Automation of Wheelchair Using Ultrasonic and Body Kinematics," CSIO Chandigarh, India, 19-20 March 2010.
[3] Matt Bailey et al., "Development of Vision Based Navigation for a Robotic Wheelchair," in Proceedings of the 2007 IEEE 10th International Conference on Rehabilitation Robotics.
[4] Luong D. T., Lee H. C., Park K. R., Lee H. K., Park M. S., Cha J., "A System for Controlling IPTV at a Distance Using Gaze Tracking," Proc. IEEE Summer Conference, 33: 37-40, 2010.

[5] Shafi M., Chung P. W. H., "A Hybrid Method for Eyes Detection in Facial Images," International Journal of Electrical, Computer, and Systems Engineering, pp. 231-236, 2009.
[6] Sandesh Pai, Sagar Ayare, Romil Kapadia, "Eye Controlled Wheelchair," October 2012; Yash Pathak, Samir Akhare, Vishal Lambe, "Iris Movement Tracking by Morphological Operations for Direction Control," September 2012.
[7] D. Purwanto, R. Mardiyanto, K. Arai, "Electric Wheelchair Control with Gaze Direction and Eye Blinking," Proceedings of the Fourteenth International Symposium on Artificial Life and Robotics, GS21-5, B-Con Plaza, Beppu, 2008.
[7] K. S. Sabarish, A. M. Suman, "Eyeball and Blink Controlled Robot with Fuzzy Logic Based Obstacle Avoidance System for Disabled," ICEEE 2012, Bangkok, 16-17 June 2012.
[8] A. Kamaraj, "Implementation of Wheelchair Controller using Eyeball Movement for Paralytic People," April 2013.
[9] A. Kamaraj, "Implementation of Wheelchair Controller using Eyeball Movement for Paralytic People," April 2013.