
2018 IJSRST | Volume 4 | Issue 5 | Print ISSN: 2395-6011 | Online ISSN: 2395-602X | Themed Section: Science and Technology

Assistive Communication Tool for Patients with Motor Neurone Disease

Dr. G. Indumathi, V. Janani, S. Sabana Jasmin
Department of Electronics and Communication Engineering, Mepco Schlenk Engineering College, Sivakasi, Tamil Nadu, India

ABSTRACT

The main aim of this system is to provide a solution for the speaking disability of motor neurone disease (MND) patients, and also for patients affected by one-sided paralysis due to stroke (cerebrovascular accident). The system enables them to express their thoughts to the people around them. Many solutions already exist to assist patients suffering from MND and paralysis, but they are very expensive; the proposed prototype is cost-effective. It consists of a sensor to detect cheek muscle movement, a wireless transceiver unit, a microcontroller, and a software keyboard.

Keywords: MND (Motor Neurone Disease), Paralysis, Sensor, Wireless Transceiver

I. INTRODUCTION

The most important aspect of life for any individual is to communicate with the people around them. But differently abled people, particularly those suffering from MND or stroke (one-sided paralysis), are not only unable to speak but also unable to perform gestures to express their thoughts, because MND is a neurodegenerative disorder involving the death of both lower and upper motor neurons, and stroke likewise impairs motor control. This prototype takes its input from the cheek movement of the patient using a touch sensor. The signal is transmitted wirelessly to a virtual scanning keyboard on a personal computer.

Various methods have been proposed and used for getting input from the patient. Chung-Min Wu et al. [1] proposed a methodology that takes EOG (electrooculography) signals from the patient's eye movements as the input to the communication system.
Watcharin Tangsuksant et al. [2] developed an electrooculography-based system to type words on a virtual keyboard using a voltage-threshold algorithm; here the EOG signals are measured on two channels with six electrodes. Rui Zhang et al. [3] proposed a system that uses EEG (electroencephalography) signals from the patient to control the environment, with a laser range finder, an encoder, and a Kinect sensor for the movement of the wheelchair. Zheng Li et al. [4] proposed a system that uses three micro-radar sensors mounted on a helmet to detect the user's tongue and cheek movements; the micro-radar array acts as a set of proximity sensors and captures muscle movements when the patient performs a tongue gesture.

In this paper, a system is proposed that enables such patients to communicate with the people around them. Section II explains the setup needed for experimentation, Section III explains the components used, and Sections IV and V describe the system developed and the conclusions.

IJSRST184533 | Received: 10 March 2018 | Accepted: 26 March 2018 | March-April 2018 [4(5): 861-864]

II. SYSTEM CONFIGURATION

The system configuration is shown in Figure 1. A digital touch sensor is placed on the spectacles of the patient and connected to an Arduino, an open-source hardware board. An RF transmitter connected to the Arduino transmits the sensor value over the air. An RF receiver receives the data, and the receiver-side Arduino forwards the received

data to the personal computer where the virtual scanning keyboard is running. On the virtual scanning keyboard, characters and some images are scanned continuously with a dwell of 10 s each. When the signal is received, the virtual keyboard selects the currently highlighted character. After a word is formed, text-to-speech software integrated with the scanning keyboard gives the speech output. The transmitter is placed on the wheelchair of the patient, and the receiver-side hardware and the PC (personal computer) are placed within one arm's length of the patient so that they feel comfortable.

Figure 1. System Configuration

This proposed system helps the patient communicate with the external world. Patients use their cheek to make contact with the sensor when the desired character arrives on the virtual keyboard on the connected PC. The sensor then outputs a high (1), the signal is transmitted to the PC, and the selected character is displayed in the text field. Finally, when the word is completed, the text-to-speech converter integrated with the virtual scanning keyboard gives the speech output.

III. SYSTEM MODULES

The proposed system includes the following modules:
1. Arduino UNO on the transmitter side
2. Arduino UNO on the receiver side
3. Touch sensor TTP223B
4. RF transmitter and receiver module
5. Personal computer (PC) with the virtual keyboard

a) ARDUINO UNO BOARD

The Arduino UNO is open-source hardware. The PCB design of the Arduino UNO used here employs Surface Mount Device (SMD) technology. It has 14 digital input/output pins and 6 analog inputs. The ATmega328, an 8-bit microcontroller, is used on this board. The main microcontroller has one Universal Asynchronous Receiver Transmitter (UART) module. The UART pins (TX, RX) are connected to a USB-UART converter circuit and also to pins 0 and 1 of the digital header on the board.
The USB-UART bridge is an ATmega16U2 IC, which converts USB signals to UART signals and vice versa.

b) TOUCH SENSOR

The TTP223B is a capacitive digital touch sensor with 3 pins. Its dimensions are 4 x 3 x 2 cm. It detects human contact through the change in the dielectric constant (ε), which influences the total capacitance. The operating voltage is 2.0 V to 5.5 V. The output voltage is 4.75 V and the output current is 27.5 mA when measured with a multimeter. The response time of the sensor is 220 ms. The touch sensor is shown in Figure 2.
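The transmitter-side behaviour (sample the TTP223B's digital output, transmit once per touch) can be sketched as a small host-side simulation. This is an illustrative reconstruction, not the authors' firmware; in particular, the rising-edge detection is an assumption made so that a sustained cheek contact produces a single selection rather than a repeated one:

```python
# Host-side simulation of the transmitter logic (illustrative, not the
# authors' firmware). The TTP223B output is sampled as a digital level,
# and a transmit event is emitted only on a rising edge, so one sustained
# cheek contact selects exactly one character.

def rising_edges(samples):
    """Return the sample indices at which the touch line goes LOW -> HIGH."""
    events = []
    previous = 0
    for i, level in enumerate(samples):
        if level == 1 and previous == 0:
            events.append(i)  # one RF transmission per touch
        previous = level
    return events

# A held touch (1, 1, 1) yields one event; two separate touches yield two.
print(rising_edges([0, 1, 1, 1, 0, 0, 1, 0]))  # -> [1, 6]
```

On the real hardware the same logic would run inside the Arduino loop, with `digitalRead()` on the sensor pin supplying the samples and the RF module sending the event.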

Figure 2. Touch Sensor

c) RF MODULE

The RF module is a cost-effective wireless communication module for low-cost applications. It comprises a transmitter and a receiver that operate in the radio frequency range of 315 MHz to 434 MHz. The transmitter input voltage is 3-12 V and the receiver input voltage is 5 V. It uses the Amplitude Shift Keying (ASK) modulation technique.

d) VIRTUAL SCANNING KEYBOARD

The virtual scanning keyboard was designed in Java. It contains a text field to display the selected characters; a set of characters arranged by their frequency of occurrence in word formation, so that words can be formed in less time; images of some basic daily necessities of the patients; and some special symbols such as @ and &. Each character and image is scanned for 10 s. A text-to-speech converter is integrated with this design, and after a word is completed, the corresponding text is converted to speech. Figure 6 shows the virtual scanning keyboard.

IV. PROPOSED SYSTEM DESIGN

The sensor is mounted so that it can be adjusted vertically. A strip of plastic is bent at the top and fixed on the frame of the spectacles, so that the bent part can slide along the frame. The plastic is tapered inward so that the distance between the plastic and the cheek is balanced. A slit is made at the tapered part, and the sensor is attached to the slit by inserting a wire through the sensor hole; the same wire is tied on the opposite side of the slit so that the sensor is held on the plastic. The slit height is greater than the sensor height, so the sensor can slide along the slitted part; thus the vertical adjustment is achieved. If more adjustment range is needed, the slit can be lengthened.
This vertical adjustment reduces the distance between the sensor and the cheek, so that the patient can touch the sensor easily. The front and left side views of the proposed sensor system module are shown in Figure 3 and Figure 4, and Figure 5 shows the sensor module placed on the frame of the spectacles.

Figure 6. Virtual Scanning Keyboard

Figure 3. Front view of the proposed sensor system module
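The scanning selection described in Section II can be sketched as follows. This is an assumed reading of the paper's description (the authors' Java implementation is not shown): one key is highlighted at a time for a fixed 10 s dwell, the highlight wraps around, and a received high (1) selects whichever key is highlighted at that instant. The key ordering here is purely illustrative.

```python
# Sketch of the 10 s scanning cycle (assumed semantics, not the authors'
# Java code): keys are highlighted one at a time, wrapping around, and a
# touch event selects the key highlighted at that moment.

DWELL_SECONDS = 10  # each character/image is scanned for 10 s

def highlighted_key(keys, elapsed_seconds, dwell=DWELL_SECONDS):
    """Key highlighted `elapsed_seconds` after scanning started."""
    index = int(elapsed_seconds // dwell) % len(keys)
    return keys[index]

# Keys ordered by frequency of occurrence, as the paper suggests, so
# common letters are reached sooner (this particular order is illustrative).
keys = list("ETAOINSHRDLU")
print(highlighted_key(keys, 0))    # -> 'E'
print(highlighted_key(keys, 25))   # -> 'A' (third key, 20-30 s window)
print(highlighted_key(keys, 130))  # -> 'T' (one full 120 s cycle has wrapped)
```

Arranging keys by frequency shortens the average wait: with a fixed dwell per key, the expected selection time is the dwell multiplied by the average position of the letters actually typed, so placing frequent letters first directly reduces typing time.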

Figure 4. Left side view of the proposed sensor system placement

Figure 5. Sensor module on the frame of the spectacles

After the cheek movement is sensed, the sensed value is transmitted through the RF transmitter. At the receiver side, the received value is passed to the Java GUI over UART serial communication. If the received value is high (1), the current character is displayed in the text field. When a word is completely framed, it is converted into speech by the text-to-speech converter integrated with the GUI. Figure 7 and Figure 8 show the flow diagrams of the transmitter and receiver modules respectively.

Figure 7. Flow diagram of Transmitter module

Figure 8. Flow diagram of Receiver module

V. CONCLUSION

In this paper, we present an assistive communication tool for patients. Even though they have a speech disorder and a neurodegenerative condition, they can be

able to communicate with the people around them. Since the system is cost-effective, even a person of modest means can afford it. The work can be extended to reduce the scanning time, and further effort may focus on obtaining the input from the patient more effectively to improve accuracy.

VI. REFERENCES

[1]. Chung-Min Wu, Chueh-Yu Chuang, Ming-Che Hsieh: "An Eyes Text Input Device for Persons with the Motor Neurone Diseases". Second International Conference on Biomedical Engineering and Technology, 34.
[2]. Watcharin Tangsuksant, Chittaphon Aekmunkhongpaisal: "Directional Eye Movement Detection System for Virtual Keyboard Controller". Biomedical Engineering International Conference, 2012.
[3]. Rui Zhang, Qihong Wang, Kai Li et al.: "A BCI-Based Environmental Control System for Patients with Severe Spinal Cord Injuries". IEEE Transactions on Biomedical Engineering, Vol. 64, No. 8, August 2017.
[4]. Zheng Li, Ryan Robucci, Nilanjan Banerjee: "Tongue-n-Cheek: Non-Contact Tongue Gesture Recognition". IPSN '15, April 13-17, 2015, Seattle, WA, USA.
[5]. Ameer Mohammed, Majid Zamani: "Towards On-Demand Deep Brain Stimulation Using Online Parkinson's Disease Prediction Driven by Dynamic Detection". IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2017.
[6]. Amira Salkar, Seema Helawar, Shaine Mendes: "Wireless Communication Using RF Module". International Journal of Science Technology and Engineering (IJSTE), January 2017.
[7]. Nalini A, Thennarasu K: "Clinical Characteristics and Survival Pattern of 1153 Patients with ALS". Journal of Neurological Sciences, January 2008.
[8]. Pawan K Gupta, Sudesh Prabhakar, Suresh Sharma: "Vascular Endothelial Growth Factor-A (VEGF-A) and Chemokine Ligand-2 (CCL2) in ALS Patients". Journal of Neuroinflammation, 2011.
[9]. Mandaville Gourie-Devi, Reema Gupta, Vibhor Sharma: "An Insight into Death Wish Among Patients with ALS in India Using the Wish-to-Die Questionnaire". Neurology India, 2017.
[10].
Pradipta Biswas, Debasis Samanta: "FRIEND: A Communication Aid for Persons with Disabilities". IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol. 16, No. 2.
[11]. Cheng-San Yang, Cheng-Hui Yang, Li-Yeh Chuang: "A Wireless Internet Interface for Persons with Physical Disability". 2008 IET 4th International Conference on Intelligent Environments.
[12]. Al-Chalabi Ammar, Pearce Neil: "Mapping the Human Exposome: Without It, How Can We Find Environmental Risk Factors for ALS?". Epidemiology, November 2015.