Neural Network Classifier and Filtering for EEG Detection in Brain-Computer Interface Device


Mr. CHOI NANG SO, Email: cnso@excite.com
Prof. J. GODFREY LUCAS, Email: jglucas@optusnet.com.au
School of Mechatronics, Computer & Electrical Engineering, University of Western Sydney, P.O. Box 10, Kingswood, NSW 2747, Australia

ABSTRACT
In this paper, a neural network classifier and two backpropagation filtering models (fast backpropagation and backpropagation-Levenberg-Marquardt) are used to process simulated brain wave signals (electroencephalogram, EEG), which are similar to the signals obtained from in-house developed BCI equipment. The simulated raw data sets are used to train the multi-layer perceptron (MLP) neural network, which then classifies the EEG wave bands. The output of the neural classifier is not only applied to show the status of the brain, but is also used to classify the filtered EEG signals from the backpropagation neural filtering models investigated in this paper.

Index Terms: Brain, Brain-Computer Interface (BCI), electroencephalogram (EEG), neural network, multilayer perceptrons (MLP), recognition, classifier, backpropagation.

1. Introduction
A comprehensive review of research and development into Brain-Computer Interface (BCI) devices has been reported [1], concluding that the key technical challenge is to reliably and accurately detect and classify features of an electroencephalogram (EEG). Since different BCIs differ greatly in their inputs, translation algorithms, outputs, and other characteristics, it is often difficult to make a direct comparison of results [3]. Nevertheless, in all cases the goal is to maximize performance and practicability for the chosen application. The goal of this paper is to classify and filter EEG raw data with neural network translation algorithms in order to obtain the best performance for the BCI prototype which has been developed.
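The paper does not describe how its simulated raw data sets were generated. As an illustrative sketch only, band-limited EEG-like signals with the additive Gaussian noise parameters quoted later (standard deviation 3x10^-4, mean 0) might be produced as follows; the sampling rate, amplitudes and band centre frequencies here are assumptions, not values from the paper:

```python
import numpy as np

# Conventional EEG band centre frequencies in Hz (assumed for illustration;
# the paper does not specify how its simulated band data were generated).
BANDS = {"delta": 2.0, "theta": 6.0, "alpha": 10.0, "beta": 20.0, "gamma": 40.0}

def simulate_band(band, n_samples=205, fs=128.0, noise_std=3e-4, seed=0):
    """Return a noisy sinusoid standing in for one simulated EEG wave band."""
    rng = np.random.default_rng(seed)
    t = np.arange(n_samples) / fs
    clean = 1e-3 * np.sin(2 * np.pi * BANDS[band] * t)  # microvolt-scale carrier (assumed)
    noise = rng.normal(0.0, noise_std, n_samples)       # sigma = 3e-4, mean = 0 (Sec. 2B)
    return clean + noise

alpha = simulate_band("alpha")
print(alpha.shape)  # (205,)
```

The data-set length of 205 matches the P = 205 input sets mentioned in Section 2B, though the paper does not state that each set is a 205-sample time series.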
It has been shown [2] that a neural network classifier can be used to distinguish the features of an EEG. This paper presents a neural network classifier, accompanied by two backpropagation filtering models, for processing the simulated brain wave signals (EEG) collected from the developed Brain-Computer Interface (BCI) device prototype. The simulated raw data sets generated are used to train the multi-layer perceptron (MLP) neural network to classify the EEG wave bands. The output of the neural classifier is not only applied to show the status of the brain's activity, it is also used to classify the filtered EEG signals from the backpropagation neural filtering models investigated in this paper.

2. Methods
A. Training for the Neural Classifier by MLP
The neural classifier model developed is a multilayer perceptron (MLP) structure that contains two hidden layers, as shown in Figure 1. There is no precise method for determining the number of neurons per hidden layer, so it must be estimated [6]. If there are too many neurons in the hidden layers, it is hard for the network to make generalizations; if there are too few, it is hard for the network to encode the significant features in the input data. Following comprehensive trial-and-error testing, it was found that the number of neurons used in the hidden layers in this work performs more efficiently than the number specified by Kolmogorov's theorem [6]. The first layer is fed with 30 (R) sets of simulated EEG raw data covering the delta, theta, alpha, beta and gamma bands. The numbers of neurons in the first and second hidden layers were optimised by trial and error to be 150 (S1) and 3 (S2). Since the output of the network is decoded in a binary scheme in the present design, the hard-limit transfer function (f1 and f2) is selected so that each neuron outputs a 1 if its result (a1 or a2) is equal to or greater than 0, and a 0 otherwise. In this neural classifier model, both hidden layers are therefore evaluated with a hard-limit transfer function. The output (a1) of the first hidden layer is fed to the input of the second hidden layer; the second hidden layer comprises three output neurons, which provide a combination of 3-bit binary values indicating which EEG wave band has been classified. The outputs of the neural classifier

corresponding to the EEG delta, theta, alpha, beta and gamma bands are 000, 001, 010, 011 and 110 respectively. For other unclassified bands, the output is shown as 111. With all the simulated raw data fed into the MLP classifier, the convergence of the network model can be presented as a spectrum result, or more specifically as a good indicator that the network converges to a result within the tolerance error (i.e. sum-squared network error), as shown in Figures 2a and 2b.

Figure 2a: Result for MLP neural classifier with S1=150 and S2=3
Figure 2b: Result for MLP neural classifier with S1=140 and S2=3

B. Training for the Neural Network Filtering of EEG Signals by BP
There are many training algorithms for the backpropagation (BP) neural network; however, at the moment, no single algorithm is universally better than the others [4]. For the present BCI prototype, two backpropagation methods are investigated, which filter out Gaussian noise with a standard deviation of 3x10^-4 and a mean of 0 from the different EEG wave bands. The first BP training algorithm is fast backpropagation (trainbpx); the second is Levenberg-Marquardt (trainlm). Both are feedforward algorithms [5]. They are constructed as two-hidden-layer BP models, as shown in Figure 1. However, the selected transfer functions in the first (f1) and second (f2) hidden layers must be differentiable [5]. Also, since most of the nonlinear part of the input features has been resolved in the first hidden layer, the second hidden layer transfer function can be simpler than the first and can provide any continuous value at the output [7]. With these criteria, the tangent-sigmoid function and the pure linear function are assigned to the first (f1) and second (f2) hidden layers respectively. With the same transfer functions in the corresponding layer of each model, there are 205 (P) input data sets for each EEG wave band (embedded with Gaussian noise) to be input to the networks.
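The forward pass of the hard-limit classifier described in Section 2A, together with the 3-bit band decoding, can be sketched as follows. This is a hedged reconstruction: the paper used the MATLAB Neural Network Toolbox, so the NumPy implementation, the random weights, and all variable names here are illustrative assumptions (the trained weight values are not given in the paper):

```python
import numpy as np

def hardlim(x):
    """Hard-limit transfer function: 1 if x >= 0, else 0 (used for f1 and f2)."""
    return (x >= 0).astype(int)

def mlp_classify(p, W1, b1, W2, b2):
    """Two-hidden-layer perceptron forward pass: R=30 inputs -> S1=150 -> S2=3."""
    a1 = hardlim(W1 @ p + b1)   # first hidden layer output (a1)
    a2 = hardlim(W2 @ a1 + b2)  # second hidden layer: 3-bit band code (a2)
    return a2

# 3-bit output codes for the EEG wave bands (Sec. 2A); 111 marks an unclassified band.
CODES = {(0, 0, 0): "delta", (0, 0, 1): "theta", (0, 1, 0): "alpha",
         (0, 1, 1): "beta", (1, 1, 0): "gamma", (1, 1, 1): "unclassified"}

# Illustrative random weights only -- stand-ins for the trained values.
rng = np.random.default_rng(1)
R, S1, S2 = 30, 150, 3
W1, b1 = rng.normal(size=(S1, R)), rng.normal(size=S1)
W2, b2 = rng.normal(size=(S2, S1)), rng.normal(size=S2)

code = tuple(int(v) for v in mlp_classify(rng.normal(size=R), W1, b1, W2, b2))
print(CODES.get(code, "unclassified"))
```

Any output pattern not in the table above is treated as unclassified, matching the 111 convention for unrecognised bands.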
The training efficiency of the two BP networks is investigated and compared through their sum-square error performance and the number of epochs necessary for training to be achieved.

3. Results
A. Results for Neural Classifier by MLP Training
The sum-square network error indicates whether the classifier can successfully separate the EEG wave bands from each of the simulated EEG raw data sets fed into the two-hidden-layer perceptron neural classifier. It is found that the sum-square network error can be minimised to 10^-2 within 122 epochs if S1=150 and S2=3. The same convergent sum-square network error can be obtained within 349 epochs if S1=140 and S2=3. These results are shown in Figure 2a and Figure 2b respectively.

B. Results for Neural Network Filtering of EEG Signals by BP Training
The training results for both BP models clearly show that the BP-Levenberg-Marquardt (trainlm) training, Figures 4a-4e, is much faster than the fast backpropagation (trainbpx) training, Figures 3a-3e. The target sum-square error and learning rate are set to 10^-2 in both cases. The graphs show that noise can be filtered out from the delta, theta, alpha, beta and gamma waves after 4862, 7188, 7713, 6251 and 5540 epochs respectively using the fast BP (trainbpx) training model. In the BP-Levenberg-Marquardt (trainlm) training model, noise can be filtered out from the different EEG wave bands within just 2 epochs, as shown in Figure 4.

4. Discussion
It is clear that the number of epochs needed for a successful result decreases as the number of neurons in the first layer of the MLP neural classifier increases. Moreover, long training times can also be caused by the presence of an outlier input vector whose length is much larger or smaller than that of the other input vectors: an input vector with large elements can lead to changes in the weights and biases that take a long time for a much smaller input vector to overcome [5]. The BP-Levenberg-Marquardt algorithm has been shown to be at least (S x Q) times faster than the fast BP gradient descent method, where S is the number of output neurons and Q is the number of input or target vectors [5]. The fast BP training model (trainbpx) has two modes of convergence: in momentum mode, it is used to find the optimal solution; in self-adaptive learning mode, the training time is shortened.
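The update rule behind the Levenberg-Marquardt speed-up is dw = (J'J + mu*I)^-1 J'e, where J is the Jacobian of the error vector e [5]. A minimal sketch on a tiny linear least-squares problem follows; this is an assumption-laden illustration (the paper applies trainlm to a two-hidden-layer filtering network, not to this toy problem, and the data here are invented):

```python
import numpy as np

def lm_fit(X, t, mu=1e-3, epochs=5):
    """Levenberg-Marquardt iterations for the linear model t ~ X @ w."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        e = t - X @ w                      # error vector
        J = -X                             # Jacobian of e with respect to w
        # dw = (J'J + mu*I)^-1 J'e, the Levenberg-Marquardt step
        dw = np.linalg.solve(J.T @ J + mu * np.eye(X.shape[1]), J.T @ e)
        w -= dw
        if np.sum(e**2) < 1e-2:            # target sum-square error, as in the paper
            break
    return w

rng = np.random.default_rng(2)
X = rng.normal(size=(205, 3))              # 205 samples, matching P in Sec. 2B
w_true = np.array([0.5, -1.0, 2.0])        # invented ground-truth parameters
t = X @ w_true
w = lm_fit(X, t)
print(np.round(w, 3))
```

On this near-quadratic problem the step is essentially a damped Gauss-Newton move, so the target sum-square error is reached within about 2 iterations, which is consistent in spirit with the 2-epoch convergence reported for trainlm above.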
However, compared with the BP-Levenberg-Marquardt (trainlm) training model, the convergence of the fast BP is much slower, and hence more epochs are required to reach the same target sum-square error.

5. Conclusions
An MLP neural classifier model has been demonstrated with the capability to classify EEG wave bands among the simulated data sets. The output of the MLP neural classifier model can be used in biofeedback training and EEG monitoring for the future application-specific development of the in-house BCI prototype. Comparing the speed of convergence of the fast BP and BP-Levenberg-Marquardt training algorithms, it is clearly preferable to choose the BP-Levenberg-Marquardt training to filter out the Gaussian noise from the EEG wave bands in the system under development.

6. Acknowledgements
The author would like to thank Prof. Godfrey Lucas and Mr. Howard Fu for their support. Special thanks go to Ms. Pamilla Gan for her spiritual support and encouragement during the preparation of this paper.

7. References
[1] J. R. Wolpaw et al., "Brain-Computer Interface Technology: A Review of the First International Meeting," IEEE Trans. Rehab. Eng., vol. 8, no. 2, pp. 164-173, June 2000.
[2] W. D. Penny, S. J. Roberts, E. A. Curran, and M. J. Stokes, "EEG-Based Communication: A Pattern Recognition Approach," IEEE Trans. Rehab. Eng., vol. 8, no. 2, pp. 214-215, June 2000.
[3] A. Kostov and M. Polak, "Parallel Man-Machine Training in Development of EEG-Based Cursor Control," IEEE Trans. Rehab. Eng., vol. 8, no. 2, pp. 203-205, June 2000.
[4] M. Sun, "The Forward EEG Solutions Can Be Computed Using Artificial Neural Networks," IEEE Trans. Biomed. Eng., vol. 47, no. 8, pp. 1044-1050, August 2000.
[5] H. Demuth and M. Beale, Neural Network Toolbox for Use with MATLAB, User's Guide Version 4, The MathWorks, Inc. [Online]. Available: http://www.mathworks.com
[6] L. H. Tsoukalas and R. E. Uhrig, Fuzzy and Neural Approaches in Engineering, John Wiley & Sons, Inc., Chapters 7 and 8, 1996.
[7] M. T. Hagan, H. B. Demuth, and M. Beale, Neural Network Design, PWS Publishing, 1996.

Figure 3a: Noise filtering in delta wave by fast BP
Figure 3b: Noise filtering in theta wave by fast BP
Figure 3c: Noise filtering in alpha wave by fast BP
Figure 3d: Noise filtering in beta wave by fast BP
Figure 3e: Noise filtering in gamma wave by fast BP
Figure 4a: Noise filtering in delta wave by BP-Levenberg-Marquardt
Figure 4b: Noise filtering in theta wave by BP-Levenberg-Marquardt
Figure 4c: Noise filtering in alpha wave by BP-Levenberg-Marquardt
Figure 4d: Noise filtering in beta wave by BP-Levenberg-Marquardt
Figure 4e: Noise filtering in gamma wave by BP-Levenberg-Marquardt