Approximation of One-Dimensional Functions Using Multilayer Perceptron and Radial Basis Function Networks

Huda Dheyauldeen Najeeb
Department of Public Relations, College of Media, University of Al Iraqia, Baghdad, Iraq

Abstract- We present two methods for approximating one-dimensional functions: the Multilayer Perceptron (MLP) and the Radial Basis Function network (RBF). Both are feedforward networks, but they differ in how they relate inputs to outputs. RBF networks have the advantage over MLPs that their training demands much less computation. In this paper we measure the number of training cycles each method needs to approximate a one-dimensional function. We found that the RBF network trains up to 65 times faster than the MLP and is also more accurate, even when the MLP uses more than one hidden layer. We therefore conclude that the MLP method is not appropriate for this approximation task and that the RBF method should be used instead to approximate one-dimensional functions.

Keywords: Multilayer Perceptron, Radial Basis Function networks, Exact interpolation, Gaussian Functions.

I. INTRODUCTION

The approximation of functions is one of the main problems addressed with artificial neural networks. To pose the approximation problem we must specify the inputs and outputs on which the approximation depends, select the details of the approximation, and grade the resulting approximations. The Multilayer Perceptron (MLP) and the Radial Basis Function network (RBF) are the most commonly used artificial neural networks (Werbos [1]; Rumelhart [2]). Structurally, an RBF network has exactly one hidden layer, whereas an MLP has one or more hidden layers, so the RBF network has a simpler structure and a faster training process than the MLP [3]. But how much faster is it? In other words, how many training cycles, at the very least, does an RBF network need to approximate a one-dimensional function compared with an MLP approximating the same function? And on the precision side: do we get a better result if we use more than one hidden layer in the MLP? To answer these questions we selected one-dimensional functions, approximated them using a Multilayer Perceptron and Radial Basis Functions, and then analysed and compared the results.

II. PROPOSED ALGORITHM

A. Multilayer Perceptron networks

MLP is an abbreviation of the term "Multilayer Perceptron". An MLP can approximate a function well when its hidden layer is large enough and its coefficients take suitable values [4]. Its main features are (see Figure 1):
a) It is a feedforward network.
b) It is able to learn nonlinear decision surfaces.
c) It is based on sigmoid units with a differentiable threshold function,

σ(x) = 1 / (1 + e^(−x))   (1)

which saturates at

σ(x) → 1 as x → +∞   (2)
σ(x) → 0 as x → −∞   (3)
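As a concrete illustration, here is a minimal sketch of the sigmoid unit of Eqs. (1)-(3) in base Matlab; the sample inputs are arbitrary illustration values, not values from the paper.

```matlab
% Minimal sketch of the sigmoid unit of Eqs. (1)-(3) in base Matlab.
sigmoid = @(x) 1 ./ (1 + exp(-x));      % Eq. (1)

% The saturation behaviour of Eqs. (2) and (3):
fprintf('sigmoid(+10) = %.5f (approaches 1)\n', sigmoid(10));
fprintf('sigmoid(-10) = %.5f (approaches 0)\n', sigmoid(-10));
```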

MLP structure and design

Despite the emergence of many other neural network models since the 1940s, the Multilayer Perceptron (MLP) is still in use and still being updated (Mata, 2011). The network contains three kinds of layer: an input layer, an output layer, and one or more hidden layers between them; every layer contains one or more neurons [5].

Figure 1. Representation of an MLP.

If the network has two neurons in its hidden layer (as shown in Figure 2), the output of the first neuron is

y₁ = σ(w₁x + b₁)   (4)

the output of the second neuron is

y₂ = σ(w₂x + b₂)   (5)

and the total output of the neural network is

y = v₁y₁ + v₂y₂ + b   (6)

Figure 2. Representation of two neurons in an MLP.

The goal is to map an input vector x into an output y(x). The layers can be described as follows:
a) The input layer accepts the data vector or pattern.
b) The hidden layers accept the outputs of the previous layer, weight them, and usually pass them through a nonlinear activation function.
c) The output layer takes the output of the final hidden layer and possibly passes it through an output nonlinearity to produce the required values [6].
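To make Eqs. (4)-(6) concrete, the following base-Matlab sketch computes the output of the two-neuron network of Figure 2. All weight and bias values are illustrative placeholders, not trained parameters.

```matlab
% Sketch of the two-neuron MLP of Figure 2 and Eqs. (4)-(6) in base Matlab.
sigmoid = @(x) 1 ./ (1 + exp(-x));

w1 = 1.5;  b1 = -0.5;    % first hidden neuron (hypothetical values)
w2 = -2.0; b2 =  0.8;    % second hidden neuron (hypothetical values)
v1 = 0.7;  v2 = 1.2;     % hidden-to-output weights (hypothetical values)
b  = 0.1;                % output bias (hypothetical value)

x  = 0.3;                      % a single scalar input
y1 = sigmoid(w1*x + b1);       % Eq. (4): output of the first neuron
y2 = sigmoid(w2*x + b2);       % Eq. (5): output of the second neuron
y  = v1*y1 + v2*y2 + b;        % Eq. (6): total output of the network
fprintf('network output y = %.4f\n', y);
```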

B. Radial Basis Function networks

Having explained the structure of Multilayer Perceptron (MLP) networks and seen how an MLP can learn to approximate functions, we now take a general look at a different method, Radial Basis Function neural networks (RBF). RBF networks are used extensively for approximation in various fields of science and engineering (Benghanem and Mellit, 2010; Mellit and Kalogirou, 2008) because of their high approximation capability, the simplicity of their structure, and their fast learning algorithms, and because they arise directly from the theory of function approximation [7]. The basic specifications of RBF networks are:
a) It is a feedforward network.
b) The hidden nodes implement a set of radial basis functions.
c) The output nodes, like the output nodes of an MLP, implement linear summation functions.
d) Training proceeds in two steps: first the weights from the input layer to the hidden layer are determined, then the weights from the hidden layer to the output layer.
e) The method is very fast to train and very good at interpolation.

Radial Basis Functions

Experimental and theoretical studies indicate that the quality of the interpolation is relatively insensitive to the exact shape of the basis functions Θ(x). Commonly used basis functions are [8]:

Multiquadric: Θ(x) = (x² + λ²)^(1/2), with parameter λ > 0
Gaussian: Θ(x) = exp(−x² / (2σ²)), with parameter σ > 0
Inverse multiquadric: Θ(x) = (x² + λ²)^(−1/2), with parameter λ > 0
Generalized multiquadric: Θ(x) = (x² + λ²)^α, with parameters λ > 0, 0 < α < 1
Generalized inverse multiquadric: Θ(x) = (x² + λ²)^(−β), with parameters λ > 0, β > 0
Thin plate spline: Θ(x) = x² ln x
Linear: Θ(x) = x
Cubic: Θ(x) = x³

RBF structure and design

The RBF neural network is a feedforward network [5]. Like the Multilayer Perceptron (MLP) it is divided into three layers: an input layer, an output layer, and a single hidden layer between them, each of whose units implements a radial basis function. The RBF network has its origin in performing exact interpolation of a set of data points in a multi-dimensional space. Exact interpolation means that the output function passes through every data point precisely [9].

Formally, exact interpolation of a set of N data points in a multi-dimensional space requires every M-dimensional input vector x^β = {x_j^β : j = 1, ..., M} to be mapped onto its corresponding output α^β; that is, we seek a function f(x) such that

f(x^β) = α^β, for β = 1, ..., N.

The Radial Basis Function (RBF) method introduces a set of N basis functions, one for each data point, of the form Θ(||x − c_j||), where Θ is a nonlinear function and the distance between x and c_j is usually taken to be Euclidean. The output of the mapping is a linear combination of the basis functions:

f(x) = Σ_j w_j Θ(||x − c_j||)   (7)

An RBF network is illustrated in Figure 3: given a set of input-output pairs (x_t, y_t), we must specify three kinds of parameter: the locations of the Gaussian kernels (c_i), the multiplicative factors (w_i), and their variances (σ_i) [10, 11]. The approximation of y by an RBF network, written f(x), is expressed as a weighted sum of N Gaussian kernels, and the number of Gaussian kernels determines the complexity of the method.

Figure 3. Representation of an RBF network.
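As an illustration of Eq. (7), the following base-Matlab sketch performs exact interpolation with Gaussian kernels centred on the data points themselves. The target function, kernel width, and evaluation grid are assumptions for the example; the implicit array expansion used below requires Matlab R2016b or later.

```matlab
% Sketch of exact RBF interpolation, Eq. (7), with Gaussian kernels.
xt = linspace(0, 2*pi, 10)';                 % N = 10 training inputs (assumed)
yt = sin(xt);                                % example 1-D target values
sigma = 0.8;                                 % hand-picked kernel width

Theta = exp(-(xt - xt').^2 / (2*sigma^2));   % N-by-N interpolation matrix
w = Theta \ yt;                              % solve Theta*w = yt exactly

xq = linspace(0, 2*pi, 200)';                % fine evaluation grid
fq = exp(-(xq - xt').^2 / (2*sigma^2)) * w;  % f(x) of Eq. (7)
plot(xt, yt, 'go', xq, fq, 'r-');            % data points vs. interpolant
```

Because there are as many kernels as data points, the interpolant passes through every (x_t, y_t) pair exactly, which is the "exact interpolation" situation described above.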

III. EXPERIMENTAL RESULTS

In Matlab we select two different one-dimensional functions and approximate them with a Multilayer Perceptron (MLP) and with Radial Basis Function neural networks (RBF); see Figure 4. We use three different configurations of both types of network, evaluate the error on the training and test sets as a function of the number of training cycles, and keep the neural networks and their results. The workflow shown in Figure 4 is:
1. Input the original one-dimensional function.
2. Input the number of neurons.
3. Input the number of hidden layers and the number of training cycles (epochs).
4. Apply the MLP algorithm and the RBF algorithm.
5. Obtain the approximating function and the best training process, once by MLP and once by RBF.

Figure 4. The proposed workflow in Matlab.
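The sketch below outlines one possible Matlab realisation of this workflow, assuming the Neural Network Toolbox functions fitnet, train, and newrb are available. The test function, neuron counts, epoch count, and spread are example settings, not the exact ones used in the experiments.

```matlab
% One possible Matlab realisation of the Figure 4 workflow (assumed setup).
x = linspace(-1, 1, 41);             % 1-D training inputs (row vector)
t = sin(2*pi*x);                     % an example one-dimensional function

% MLP branch: ten neurons in one hidden layer, 300 training cycles.
mlp = fitnet(10);
mlp.trainParam.epochs = 300;
mlp  = train(mlp, x, t);
yMlp = mlp(x);

% RBF branch: newrb adds neurons one at a time up to a maximum count.
goal = 0; spread = 0.5; maxNeurons = 10; displayFreq = 1;
rbf  = newrb(x, t, goal, spread, maxNeurons, displayFreq);
yRbf = sim(rbf, x);

% Compare the approximation errors of the two methods.
fprintf('MLP MSE: %g\nRBF MSE: %g\n', mean((t - yMlp).^2), mean((t - yRbf).^2));
```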

Approximation using Multilayer Perceptron

The function is approximated by the MLP method. In the plots of Figures 5 and 6 the green curve is the one-dimensional function to be approximated and the red curve is the result of the approximation.
a) At the beginning we create a simple network with two neurons in one hidden layer and 300 training cycles (epochs); a clear approximation error is visible.
b) We then increase the number of neurons in the hidden layer to ten, keeping the same number of cycles; the network approximates the function more accurately.
c) Gradually increasing the number of neurons, the number of hidden layers, and the number of training cycles, we achieve progressively better approximation properties; the deviations on the training and test samples gradually decrease.

(a) Approximation using MLP with two neurons in one hidden layer (300 cycles).
(b) Approximation using MLP with ten neurons in one hidden layer (300 cycles).

(c) Approximation using MLP with twenty neurons in four hidden layers (1300 cycles).
Figure 5. Approximation of one-dimensional function 1 using MLP.

(a) Approximation using MLP with two neurons in one hidden layer (300 cycles).
(b) Approximation using MLP with ten neurons in one hidden layer (300 cycles).

(c) Approximation using MLP with twenty neurons in four hidden layers (1300 cycles).
Figure 6. Approximation of one-dimensional function 2 using MLP.

Approximation using Radial Basis Function neural networks (RBF)

The function is approximated by the RBF network method, using the same numbers of neurons as in the previous method but with a single hidden layer; we will see that the approximation in this method is faster and gives better results.
a) At the beginning we choose an RBF network with two neurons; the network can approximate the original function, but with significant errors.
b) We then use ten neurons and ten training cycles (epochs); this network approximates the original function more precisely. Compared with the previous method, the RBF training process is about 30 times faster and gives a better result.
c) In the last experiment, conducted with twenty neurons, the network becomes more precise still, and the RBF training process is about 65 times faster. Despite the increased number of hidden layers in the MLP, the RBF result remains the best (see Figure 7 and Figure 8).

(a) Approximation using RBF network with two neurons.

(b) Approximation using RBF network with ten neurons.
(c) Approximation using RBF network with twenty neurons.
Figure 7. Approximation of one-dimensional function 1 using RBF network.

(a) Approximation using RBF network with two neurons.

(b) Approximation using RBF network with ten neurons.
(c) Approximation using RBF network with twenty neurons.
Figure 8. Approximation of one-dimensional function 2 using RBF network.

IV. CONCLUSION

Multilayer Perceptrons and Radial Basis Function networks are both universal approximators, and both are feedforward networks, but they differ in several respects:
1) An MLP has one or more hidden layers, whereas an RBF network (in its simplest form) has a single hidden layer.
2) In an MLP used as a pattern classifier, the hidden and output layers are usually both nonlinear; in an RBF network the hidden layer is nonlinear while the output layer is linear. When the MLP is used to solve nonlinear regression problems, a linear output layer is the preferred choice there as well.
3) An MLP constructs global approximations to a nonlinear input-output mapping, whereas an RBF network using Gaussian functions constructs local approximations to the nonlinear mapping.

The results of this work suggest that the RBF method provides more accurate results, even though the MLP used more than one hidden layer, and that its training process is up to 65 times faster than that of the MLP. The RBF network needs only a limited number of training cycles and a limited number of neurons in a single hidden layer to approximate a one-dimensional function.

This property makes the RBF method a suitable substitute for the MLP method for approximating one-dimensional functions.

REFERENCES
[1] Werbos P., "Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences", PhD thesis, Harvard University, 1974.
[2] Rumelhart D., Hinton G., and Williams R., "Learning Representations by Back-Propagating Errors", Nature, 323, pp. 533-536, 1986.
[3] Wu Y., Wang H., Zhang B., and Du K.-L., "Using Radial Basis Function Networks for Function Approximation and Classification", ISRN Applied Mathematics, 2012.
[4] Nieminen P., "Classification and Multilayer Perceptron Neural Networks", Data Mining Course (TIES445), 2012.
[5] Oludolapo O. A., Jimoh A. A., and Kholopane P. A., "Comparing Performance of MLP and RBF Neural Network Models for Predicting South Africa's Energy Consumption", Journal of Energy in Southern Africa, 23(3), pp. 42-45, August 2012.
[6] Gales M., "Multi-Layer Perceptrons", Module 4F10: Statistical Pattern Processing, University of Cambridge Engineering Part IIB, 2015.
[7] Lesage C. and Cottrell M. (eds.), "Approximation by Radial Basis Function Networks", Connectionist Approaches in Economics and Management Sciences, Kluwer Academic Publishers, pp. 203-214, 2003.
[8] Haykin S., "Neural Networks: A Comprehensive Foundation", NY: IEEE Press, 1994.
[9] Bullinaria J. A., "Radial Basis Function Networks: Introduction", Neural Computation: Lecture 13, 2015.
[10] Verleysen M. and Hlavackova K., "An Optimised RBF Network for Approximation of Functions", ESANN, European Symposium on Artificial Neural Networks, Brussels (Belgium), pp. 175-180, 1994.
[11] Benoudjit N., Archambeau C., Lendasse A., Lee J., and Verleysen M., "Width Optimization of the Gaussian Kernels in Radial Basis Function Networks", ESANN, European Symposium on Artificial Neural Networks, Bruges (Belgium), pp. 425-432, 2002.