Evolutionary Artificial Neural Networks For Medical Data Classification


Evolutionary Artificial Neural Networks For Medical Data Classification

GRADUATE PROJECT

Submitted to the Faculty of the Department of Computing Sciences
Texas A&M University-Corpus Christi
Corpus Christi, Texas

In Partial Fulfillment of the Requirements for the Degree of
Master of Science in Computer Science

By Narendra Kumar Reddy Obbineni
Spring 2017

Committee Members:
Dr. Ajay Katangur, Committee Chairperson
Dr. David Thomas, Committee Member

ABSTRACT

Artificial Neural Networks have been increasingly used for classifying medical data, as they consistently perform better than other techniques such as K-Nearest Neighbor, Decision Trees, and Support Vector Machines on plain data. This project uses the Biogeography Based Optimizer and Genetic Algorithm techniques to train a Multilayer Perceptron on the Pima Diabetes dataset, using factors such as Number of Pregnancies, Glucose, Blood Pressure, Triceps Skin Thickness, Insulin, Body Mass Index, Diabetes Pedigree Function, and Age to predict whether a female patient has diabetes. The user can interact with the application either through the GUI or the Matlab IDE [1]. The user can also run the application with the K-Fold cross validation technique to get a more precise estimate of the classification algorithm's predictive accuracy.

TABLE OF CONTENTS

1 INTRODUCTION
2 BACKGROUND & RATIONALE
3 MULTILAYER PERCEPTRON
3.1 Weights and Biases
3.2 Training
3.3 Deep Learning
4 BIOGEOGRAPHY BASED OPTIMIZER
4.1 Overview of the BBO Algorithm
5 GENETIC ALGORITHMS
5.1 Selection in Genetic Algorithm
5.2 Crossover in Genetic Algorithm
5.3 Mutation in Genetic Algorithm
6 NARRATIVE
6.1 Problem Statement
6.2 Project Objective
6.3 Scope of the Project
6.4 Project Functionality
7 PROPOSED SYSTEM DESIGN
7.1 Use Case Diagram

7.2 Class Diagram
7.3 Sequence Diagram
7.4 User Interface
7.5 BBO Options
7.6 GA Options
7.7 General Options
7.8 Output Window
7.9 Output of Convergence Curves and Classification Accuracies
8 IMPLEMENTATION OF THE APPLICATION MODULES
8.1 Normalizing the Dataset Provided
8.2 Using the GUI to Input the Algorithm & General Options
8.3 Using the Matlab IDE to Edit the Algorithm & Data Options
8.4 Running the Application with a K-Fold Cross Validation Technique
8.5 Initiating the Training of the MLP with BBO & GA
8.6 Initializing the MLP
8.7 Calculating the Output Using the Multilayer Perceptron
8.8 Calculating the Cost of a Generation
8.9 Training the MLP Using the Biogeography Based Optimizer
8.10 Training the MLP Using Genetic Algorithms
8.11 Testing the MLP on the Testing Dataset
8.12 Displaying the Convergence Curves & Classification Rates in a New GUI

9 TESTING AND EVALUATION
9.1 Normalizing Data Before the Application Is Run
9.2 Running the Application from a GUI
9.3 Running the Application from the Matlab IDE
9.4 Running the Application by Changing the Values of the BBO, GA and General Parameters
9.5 Testing the Application with a New Set of Parameters
9.6 Running the Application Using the K-Fold Cross Validation (K-Fold Value as 10)
9.7 Running the Application Using the K-Fold Cross Validation (K-Fold Value as 5)
9.8 The Comparison of Convergence Curves for the K-Fold Run
9.9 The Comparison of Convergence Curves for the Multiple Populations Run
10 CONCLUSION
11 FUTURE WORK
Bibliography

LIST OF FIGURES

Figure: How the species move between habitats
Figure: The probability that a habitat contains s species
Figure: Single Point Crossover
Figure: Two Point Crossover
Figure: Uniform Crossover
Figure: Mutation example
Figure: Use Case diagram for Training of Multilayer Perceptron with BBO & GA
Figure: Class Diagram for training the Multilayer Perceptron using BBO and GA
Figure: Sequence diagram for training the Multilayer Perceptron using BBO and GA
Figure: The user interface for entering the Algorithm and Data Options
Figure: BBO Options
Figure: Genetic Algorithm Options
Figure: General Options
Figure: Output Window
Figure: Cost Convergence curve and Classification Accuracies bar chart
Figure: Function to normalize diabetes dataset
Figure: Code Snippet for the GUI
Figure: Code Snippet of the Main Class
Figure: Code Snippet for setting the General Options
Figure: Code Snippet for saving the training and testing data options into a Struct
Figure: Code Snippet for saving the BBO options into a Struct
Figure: Code Snippet for saving the GA options into a Struct
Figure: Visual representation of how the K-Fold Cross Validation happens
Figure: Code snippet that will display all the points where the data array will be split
Figure: Code snippet that performs the K-Fold Cross validation splitting and starts the application
Figure: Code snippet where the training of the MLP will be initiated
Figure: Code snippet where the weights and hidden node biases are initialized
Figure: Code snippet where the output of the multilayer perceptron is calculated
Figure: Code snippet where the cost for every member of the population is calculated
Figure: Code snippet for calculating the species count of all the islands
Figure: Code snippet for calculating the immigration and emigration rates
Figure: Code snippet for exchanging species between islands
Figure: Code snippet to mutate worst half of the habitats
Figure: Code snippet to replace worst performers of current generation with best performers of the previous generation
Figure: Code snippet for Selection
Figure: Code snippet for Crossover
Figure: Code snippet for Mutation
Figure: Code snippet for determining the classification rates
Figure: Code snippet for displaying the cost convergence curves and classification rates
Figure: Loading the GUI
Figure: Matlab Canvas Editor
Figure: The application GUI
Figure: Cost Convergence Curve and Classification Accuracies
Figure: Classification Accuracies displayed in Matlab Console
Figure: Cost Convergence Curve and Classification Accuracies
Figure: Options GUI
Figure: Cost Convergence Curves and Classification Accuracies
Figure: Options GUI
Figure: Cost Convergence Curve and Classification Accuracies
Figure: Classification Accuracies displayed in the console
Figure: Classification accuracies displayed at the console
Figure: BBO Cost Convergence Curves
Figure: GA Cost Convergence Curves
Figure: Cost Convergence Curves for Multiple Population Runs

LIST OF TABLES

Table: Default Parameters for the application
Table: Parameters that have been changed for the run
Table: Parameters that have been changed for the run
Table: Parameters that have been changed for the run
Table: Parameters changed for the run
Table: K-Fold Cross Validation Results

1 INTRODUCTION

In this project, a Matlab application is developed for training a multilayer perceptron [2] using techniques such as the Biogeography Based Optimizer (BBO) and Genetic Algorithms (GA). The application has a GUI developed on the Matlab platform. The GUI allows the user to input the BBO algorithm options, such as Habitat Modification Probability, Initial Mutation Probability, Maximum Immigration Rate for each Island, Maximum Emigration Rate for each Island, Lower Bound for the immigration probability per gene, Upper Bound for the immigration probability per gene, and Number of best habitats to persist between generations, as well as the GA options, such as Crossover Type (Single Point, Two Point, Uniform), Crossover Probability, Mutation Probability, and Number of best individuals to keep from one generation to the next. The GUI also allows the user to select the number of records used for training and testing, the number of input parameters in the dataset, the number of hidden nodes in the multilayer perceptron, the population size, and the number of generations the BBO and GA algorithms will run to train the multilayer perceptron. The GUI also displays the output of the classification program. Alternatively, the user can run the program directly from the Matlab IDE; in this case, the program reads the BBO and GA algorithm options as well as the General options from a text file. The user also has the option to run the program using the K-Fold Cross Validation technique [3]. The program has been trained and tested with a total of 770 data records. If the program is run just once, then by default 693 records are used for training and 77 records for testing. If the K-Fold cross validation option is chosen, the dataset is divided into 10 equal parts by default and the program is run 10 times. During each run, one of the blocks is used for testing, cycling through all the blocks over the 10 runs, while the other 9 blocks are used for training.

2 BACKGROUND & RATIONALE

Artificial neural networks (ANNs) [4] are mathematical models that use artificial neurons inspired by the connectivity of neurons and axons in the human brain. The reasoning behind ANNs is that if computers were more like the brain, they could be good at some of the things humans are good at, such as pattern recognition and classification. The basic building block of an ANN is the artificial neuron. Each artificial neuron can have one or more weighted inputs feeding into it and generates its output using an activation function. ANNs are increasingly being used for machine learning because they can be trained from an existing dataset rather than programmed with detailed logic. ANNs are good universal approximators, as they can learn over multiple generations how to map the data they train on to the expected output. A basic ANN might consist of multiple layers (a layer is a set of neurons arranged in a row) of artificial neurons, wherein each neuron in the current layer connects to each neuron in the next layer, thereby forming a network. One type of network relevant to the current topic is the feedforward neural network, in which the connections between the individual neurons/layers do not form a cycle. A feedforward neural network typically consists of an input layer followed by one or more hidden layers and a final output layer. A set of inputs is passed to the first hidden layer, and the activations from that layer are passed to the next layer, and so on, until the output layer is reached, where the results of the classification are determined by the scores at each node. This happens for each set of inputs. This series of events, starting from the input, where each activation is sent to the next layer, and then the next, all the way to the output, is known as forward propagation.

3 MULTILAYER PERCEPTRON

The first neural nets were born out of the need to address the inaccuracy of an early classifier, the perceptron. It was shown that by using a layered web of perceptrons [4], the accuracy of predictions could be improved. As a result, this new breed of neural nets is called the Multilayer Perceptron, or MLP.

3.1 Weights and Biases

Each node in an MLP has the same classifier, and none of them fire randomly: if we repeat the input, we get the same output. Each edge, which connects a neuron in one layer to a neuron in the next layer, has a unique weight. Each neuron that receives its input from neurons in earlier layers has a unique bias. This means that the combination used for each activation is also unique, which is why the neurons fire differently.

3.2 Training

The process of improving a neural net's accuracy is called training. The prediction accuracy of a neural net depends on its weights and biases. For the neural network to predict a value as close to the actual output as possible, the weights and biases need to be changed slightly until the desired output is achieved. In training the neural net, the output from forward propagation is compared to the output that is known to be correct, and the cost is the difference between the two. The point of training is to make that cost as small as possible, across hundreds of training examples. To do this, the neural network tweaks the weights and biases step by step until the prediction closely matches the correct output. Once trained well, a neural network has the potential to make accurate predictions each time.

3.3 Deep Learning

To analyze simple patterns, a basic classification tool like an SVM or Logistic

Regression is typically good enough. But when the data has tens of different inputs or more, neural nets start to win out over other methods. Still, as the patterns get even more complex, shallow neural networks with a small number of layers can become unusable; the only practical choice is a deep net [5]. One of the key reasons deep nets can recognize these complex patterns is that they are able to break the complex patterns down into a series of simpler patterns. For example, let's say that a neural network had to decide whether an image contained a human face. The deep net would first use edges to detect the different parts of the face, such as the lips, nose, eyes, and ears, and would then combine the results to form the whole face. This ability to use simpler patterns as building blocks to detect complex patterns is what gives deep nets their strength. The accuracy of these deep nets has been very impressive.

4 BIOGEOGRAPHY BASED OPTIMIZER

The Biogeography-based optimizer (BBO) is an optimization algorithm [6] inspired by an aspect of biological evolution: how organisms migrate between islands (also known as habitats) and evolve. It arrives at the global optimum of a function by randomly generating possible solutions and then iteratively improving them (by creating new solutions or moving solution features between habitats) based on how close the proposed solutions are to the expected solution.

Figure: How the species move between habitats

The Habitat Suitability Index (HSI) denotes how supportive of life an island is, based on suitability index variables (SIVs) such as rainfall, vegetation, topography, and temperature. Islands with a high HSI have larger populations because of the factors supporting survival and reproduction. Species move from high-HSI islands to low-HSI islands, as islands with a high HSI hold too many species. In the BBO algorithm, when species migrate from an island, the species are assumed to become extinct on that island.

P_s, the probability that a habitat contains s species, is calculated as shown in the figure.

Figure: The probability that a habitat contains s species

4.1 Overview of the BBO Algorithm

1. Compute the initial set of parameters (weights and biases).
2. Calculate the Habitat Suitability Index (or fitness) for each of the habitats.
3. Compute the immigration rate, emigration rate, and number of species for each solution.
4. Move species between habitats based on the immigration and emigration rates.
5. Mutate the worst half of the population based on the fitness function.
6. Replace the worst habitats of the current generation with the best habitats of the previous generation.
7. Repeat from step 2 until the desired number of generations is reached.
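The steps above can be condensed into a short sketch of a single BBO generation. The project itself is implemented in Matlab; the following is only an illustrative Python sketch, and the function name, the linear rank-based migration rates, and the ±1 mutation range are assumptions for the example rather than the project's exact code:

```python
import random

def bbo_generation(habitats, costs, mutation_prob=0.01):
    """One illustrative BBO generation: migration, then mutation of the worst half.

    habitats: list of candidate solutions (lists of floats).
    costs: matching list of costs, lower is better.
    """
    n = len(habitats)
    # Steps 2-3: rank habitats by cost; rank 0 (best) holds the most species,
    # hence gets the lowest immigration rate and the highest emigration rate.
    order = sorted(range(n), key=lambda i: costs[i])
    pop = [habitats[i][:] for i in order]
    immigration = [(k + 1) / n for k in range(n)]
    emigration = [1 - (k + 1) / n for k in range(n)]
    # Step 4: probabilistically replace features with features from emigrating habitats.
    for k in range(n):
        for j in range(len(pop[k])):
            if random.random() < immigration[k]:
                donor = random.choices(range(n), weights=emigration)[0]
                pop[k][j] = pop[donor][j]
    # Step 5: mutate the worst half of the population.
    for k in range(n // 2, n):
        for j in range(len(pop[k])):
            if random.random() < mutation_prob:
                pop[k][j] = random.uniform(-1, 1)
    return pop
```

Repeating this function for the configured number of generations, recomputing the costs each time and carrying over the best habitats, completes the loop described in steps 6 and 7.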

5 GENETIC ALGORITHMS

The Genetic Algorithm [7] is an evolutionary algorithm inspired by the evolutionary process of natural selection. Genetic algorithms approach the solution of an optimization problem by initially creating a random set of solutions and evolving it toward better solutions using techniques such as Selection, Crossover (how parents are combined to form children), and Mutation (changes to parts of the parents to form children).

5.1 Selection in Genetic Algorithm

In this stage, individuals of the current generation are selected and passed on to the crossover stage to produce the individuals of the next generation. One way to do this is to sort the individuals by their fitness values and select two sets of individuals in order, starting from the first individual.

5.2 Crossover in Genetic Algorithm

In this stage, the individuals passed on from the selection process (the parents) are combined to form new individuals (the children). Each child shares at least some characteristics of both parents. Techniques that can be used to perform the crossover include Single-Point, Two-Point, and Uniform crossover.

Single Point: A single point is selected on each of the parents, and the parts on one side of the point are swapped between the parents, producing the children. An example of single-point crossover is shown in the figure.

Figure: Single Point Crossover

Two Point: Two points are selected on each of the parents, and the part between them is swapped, producing the children. An example of two-point crossover is shown in the figure.

Figure: Two Point Crossover

Uniform: In this technique, a random value is generated for each variable and compared to an agreed-upon crossover probability. If the random value is greater than the probability, no swapping occurs for that variable; if it is less than the probability, the parents' variables are swapped and assigned to the children. An example of uniform crossover is shown in the figure.

Figure: Uniform Crossover

5.3 Mutation in Genetic Algorithm

In this stage, a set of the individual's variables is replaced by new variables. The number of variables replaced is based on the mutation probability. An example of mutation is shown in the figure.

Figure: Mutation example
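The three crossover techniques and the mutation step described above can be sketched as follows. The project is written in Matlab; this is a hedged Python illustration, and the function names and the ±1 mutation range are invented for the example:

```python
import random

def single_point(p1, p2):
    # Swap everything after one random cut point.
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def two_point(p1, p2):
    # Swap the segment between two random cut points.
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return p1[:a] + p2[a:b] + p1[b:], p2[:a] + p1[a:b] + p2[b:]

def uniform(p1, p2, crossover_prob=0.5):
    # Swap each variable independently when the random draw falls below the probability.
    c1, c2 = p1[:], p2[:]
    for i in range(len(p1)):
        if random.random() < crossover_prob:
            c1[i], c2[i] = p2[i], p1[i]
    return c1, c2

def mutate(individual, prob=0.1, low=-1.0, high=1.0):
    # Replace each variable with a fresh random value with probability prob.
    return [random.uniform(low, high) if random.random() < prob else v
            for v in individual]
```

Note that all three crossover variants only rearrange the parents' variables between the two children; no new values are introduced until mutation.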

6 NARRATIVE

6.1 Problem Statement

The Arizona Pima Indians as a group have the highest rate of diabetes in the United States, compared to the genetically similar Pima Indians in Mexico. Around 1890, when the water supply of the Arizona Pima Indians changed and their diet shifted from traditional foods to one laden with fat and sugar, the percentage of the population with obesity and diabetes rose. A subset of Pima Indians who moved to Mexico and maintained a traditional diet has remained healthy. The diabetes dataset covers Arizona Pima Indian women 21 years and older and provides 8 attributes: number of times pregnant, plasma glucose concentration, diastolic blood pressure, triceps skinfold thickness, 2-hour serum insulin, body mass index, diabetes pedigree function, and age. The outcome is also provided: the value is 1 if the woman has diabetes and 0 otherwise.

6.2 Project Objective

The main objective of the proposed system is to train a multilayer perceptron [8] using techniques such as BBO [9] and GA on a subset of the data provided and to test it on the remaining data to see how viable the techniques are. At the end of the run, the proposed system reports how well each of BBO and GA performed on the test data.

6.3 Scope of the Project

With the explosion of medical data being generated, tools are badly needed to make sense of the data. Neural networks have been increasingly used on medical data because they can be trained from data alone and their classification is reliable. The network developed in this project can be used to train on other types of medical data in the future.

6.4 Project Functionality

The project has been developed in Matlab. The application provides the following functionality:

1. Ability to run the application from a GUI.
2. Ability to input the BBO and GA algorithm options as well as the General Training and Testing data options on the GUI before the run.
3. Ability to run the application from the Matlab IDE.
4. Ability to read/edit the BBO and GA algorithm variables as well as the General Training and Testing options from a text file when the application is run from the Matlab IDE.
5. Ability to run the application using K-Fold cross validation from the Matlab IDE.
6. Ability to set the number of K-Fold cross validation runs.

7 PROPOSED SYSTEM DESIGN

7.1 Use Case Diagram

The use case diagram for training the Multilayer Perceptron using the BBO and GA algorithms is shown in the figure. The ovals represent the functionality available to the user. The user can run the application through the GUI or from the Matlab IDE. When the user runs the application through the GUI, the General Data Options, BBO algorithm options, and GA options are sent to the Matlab program, the program starts training the multilayer perceptron, and the program output is sent back to the GUI. All output sent back from the program is displayed in a new tab of the Matlab GUI called Output. If the user chooses to edit the data options in a text file and run the program from the Matlab IDE, the output of the training is displayed in the Matlab IDE console. Whichever way the user runs the application, the convergence curves and the classification rates for BBO and GA are displayed in a new GUI at the end of the run.

Figure: Use Case diagram for Training of Multilayer Perceptron with BBO & GA

7.2 Class Diagram

The class diagram shown in the figure displays the most important parts of the application, such as the Main class, which initiates the training with the BBO and GA algorithms. The BBO and GA algorithms in turn call the MLP Trainer class [10] to train the multilayer perceptron over multiple generations. After the training is done, the Main class calls the MLP Trainer class to determine how well the trained multilayer perceptron performs on the testing data.

Figure: Class Diagram for training the Multilayer Perceptron using BBO and GA

7.3 Sequence Diagram

The sequence diagram displays how communication happens between the user and the application. Shown in the figure is the sequence diagram for the case where the user inputs the General data options as well as the BBO and GA options in the Matlab GUI and runs the application. As shown in the figure, BBO and GA train the Multilayer Perceptron over multiple generations, and the output is displayed back in the GUI window at the end of each generation. After the MLP has been trained and tested, the convergence and classification data are displayed in a GUI window.

Figure: Sequence diagram for training the Multilayer Perceptron using BBO and GA

7.4 User Interface

The user interface shown in the figure has been developed using the Matlab GUI editor. It initially loads with a default set of BBO, GA, and General options.

Figure: The user interface for entering the Algorithm and Data Options

7.5 BBO Options

The BBO options that can be changed are Habitat Modification Probability, Initial Mutation Probability, Maximum Immigration Rate for each island, Maximum Emigration Rate for each island, Lower Bound for the immigration probability per gene, Upper Bound for the immigration probability per gene, and Number of best habitats to persist from one generation to the next. The GUI for the BBO options is shown in the figure.

Figure: BBO Options

7.6 GA Options

The GA options that can be changed on the GUI are Crossover Type (one of single point, two point, or uniform), Crossover Probability, Initial Mutation Probability, and Number of best individuals to persist from one generation to the next. The GUI for the GA options is shown in the figure.

Figure: Genetic Algorithm Options

7.7 General Options

The General options that can be changed on the GUI are Number of Data Records, Number of Records for Training, Number of Input Parameters, Number of Hidden Nodes, Population Size, and Number of Generations. The General options GUI is shown in the figure.

Figure: General Options

7.8 Output Window

After the user clicks Run, all the options on the GUI are sent to the application. Once the application starts running, it sends its output back to the GUI. The sample output displayed in the output window is shown in the figure.

Figure: Output Window

7.9 Output of Convergence Curves and Classification Accuracies

After the application finishes training and testing, the resulting convergence curves and classification accuracies are displayed in a new window. In a convergence curve, the cost of each generation is plotted against the generation number. The classification bar chart plots the classification accuracy, denoted as a percentage, of each algorithm used to train the MLP. A sample GUI of Convergence Curves and Classification Accuracies is shown in the figure.

Figure: Cost Convergence curve and Classification Accuracies bar chart

8 IMPLEMENTATION OF THE APPLICATION MODULES

8.1 Normalizing the Dataset Provided

The Arizona Pima Indian diabetes dataset is normalized in this step. The original diabetes dataset is saved in the diabetes.csv file. All columns in the dataset except the outcome are normalized using the Min-Max normalization technique. Some of the input parameters are recorded as 0 in the dataset; those values are set to NaN before normalization. To compute the min-max normalized value of an entry, the minimum value of the column is subtracted from the entry, and the result is divided by the difference between the maximum and minimum values of the column. The code snippet for the normalization function is shown in the figure.
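As a sketch of the normalization just described (the project's actual function is a Matlab script; this Python version, with zeros treated as missing values, is only illustrative and the function name is an assumption):

```python
import math

def min_max_normalize(column, zero_means_missing=True):
    """Min-max normalize one dataset column; optionally treat 0 as missing (NaN)."""
    # Replace 0 entries with NaN, mirroring the preprocessing described above.
    vals = [float('nan') if zero_means_missing and v == 0 else float(v)
            for v in column]
    present = [v for v in vals if not math.isnan(v)]
    lo, hi = min(present), max(present)
    # (value - min) / (max - min); NaN entries pass through unchanged.
    return [(v - lo) / (hi - lo) if not math.isnan(v) else v for v in vals]
```

For example, the column [2, 4, 6, 0] normalizes to [0.0, 0.5, 1.0, NaN], since the 0 is treated as a missing measurement rather than a true minimum.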

Figure: Function to normalize diabetes dataset

8.2 Using the GUI to Input the Algorithm & General Options

The GUI has been developed using the Matlab drag and drop editor [11]. All the data options entered on the interface become part of the app. When the user clicks Run, the

app is sent to the Main class, where the training of the MLP starts. In the Main class, the app variables are read and assigned to the local variables of the class, and are then used by the Main class and the functions it calls. The code snippet for the GUI is shown in the figure.

Figure: Code Snippet for the GUI

The code snippet for the Main class, where the app is read for the algorithm and data options, is shown in the figure.

Figure: Code Snippet of the Main Class

8.3 Using the Matlab IDE to Edit the Algorithm & Data Options

The second way to start the application is from the Matlab IDE. The user can edit the BBO, GA, and General options and call the Main class, where the training of the MLP begins. Global variables in Matlab can be used to edit and fetch the value of a variable across the entire project. A global struct called OPTIONS is declared here and filled with the Algorithm & Data options; the struct OPTIONS is then read across the project to fetch those options. The code snippet showing how the General options can be set when the application is run from the Matlab IDE is shown in the figure.

Figure: Code Snippet for setting the General Options

The code snippet showing how the General data options are saved to the global struct OPTIONS is shown in the figure.

Figure: Code Snippet for saving the training and testing data options into a Struct

The code snippet showing how the BBO algorithm options are saved to the

global struct OPTIONS is shown in the figure.

Figure: Code Snippet for saving the BBO options into a Struct

The code snippet showing how the GA algorithm options are saved to the global struct OPTIONS is shown in the figure.

Figure: Code Snippet for saving the GA options into a Struct

8.4 Running the Application with a K-Fold Cross Validation Technique

With the K-Fold Cross Validation technique, the dataset of 770 records is split

into 10 groups [3]. The application is run 10 times. During each run, one of the groups, starting from the first, is selected as the testing group, and the other 9 groups are used to train the MLP. A visual representation of how the K-Fold Cross Validation happens is shown in the figure.

Figure: Visual representation of how the K-Fold Cross Validation happens

The code snippet showing all the positions where the data array will be split is shown in the figure.

Figure: Code snippet that will display all the points where the data array will be split
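The fold boundaries and the train/test split for each run can be sketched as follows (a Python illustration with assumed function names; the project performs the equivalent splitting in Matlab):

```python
def k_fold_splits(n_records, k):
    """Half-open (start, end) index pairs of the k contiguous folds."""
    fold = n_records // k
    # The last fold absorbs any remainder when n_records is not divisible by k.
    return [(i * fold, (i + 1) * fold if i < k - 1 else n_records)
            for i in range(k)]

def k_fold_runs(n_records, k):
    """For each run, one fold is held out for testing; the rest train the MLP."""
    for lo, hi in k_fold_splits(n_records, k):
        train = list(range(0, lo)) + list(range(hi, n_records))
        test = list(range(lo, hi))
        yield train, test
```

With 770 records and k = 10, each run holds out one block of 77 records for testing and trains on the remaining 693, matching the default split described in the introduction.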

The code snippet that performs the K-Fold cross validation splitting and starts the application is shown in the figure.

Figure: Code snippet that performs the K-Fold Cross validation splitting and starts the application

8.5 Initiating the Training of the MLP with BBO & GA

After the algorithm and general options have been loaded into the Matlab workspace, the training of the MLP using BBO and GA is initiated. The code snippet where the training is initiated is shown in the figure.

Figure: Code snippet where the training of the MLP will be initiated

8.6 Initializing the MLP

The MLP has three layers of nodes and is initialized according to the values given in the General data options. The first layer consists of the input nodes [ref: No. of Input Params in the GUI]. The second layer consists of the hidden nodes [ref: No. of Hidden Nodes in the GUI], and the third layer consists of a single output node [matching the Diabetes dataset]. Every input node connects to each hidden node, and every hidden node connects to the output node. Each hidden node also has a bias term. Connections are represented as weights in the application. All the weights and bias terms are initialized during the creation of the MLP for each member of the population [ref: Population Size in the GUI]. The number of variables to be initialized for each member of the population is (no. of input nodes × no. of hidden nodes) input-to-hidden weights, plus no. of hidden nodes bias terms, plus no. of hidden nodes hidden-to-output weights. The code snippet where the weights and hidden node biases are initialized is shown in the figure.
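The variable count and random initialization just described can be sketched as follows (a Python illustration; the project initializes the population in Matlab, and the ±1 initialization range and function name here are assumptions):

```python
import random

def init_population(n_inputs, n_hidden, pop_size):
    """One random variable vector per population member: the input-to-hidden
    weights, the hidden node biases, and the hidden-to-output weights."""
    n_vars = n_inputs * n_hidden + n_hidden + n_hidden
    return [[random.uniform(-1, 1) for _ in range(n_vars)]
            for _ in range(pop_size)]
```

For the diabetes dataset with 8 inputs and, say, 15 hidden nodes, each member carries 8 × 15 + 15 + 15 = 150 variables.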

Figure: Code snippet where the weights and hidden node biases are initialized

8.7 Calculating the Output Using the Multilayer Perceptron

The output of the MLP is calculated as follows. A single row from the dataset forms the set of inputs. The linear combination of the inputs and the weights from the input layer to a hidden node is calculated; the input to the hidden node is this linear combination plus the bias term. The output of the hidden node is the sigmoid function of its input, 1/(1 + e^(-x)). The linear combination of all the hidden node outputs and the weights from the hidden layer to the output node is then calculated and forms the input to the output node. The output of the output node is again the sigmoid function of its input, 1/(1 + e^(-x)). The code snippet where the output of the multilayer perceptron is calculated from the input values, the weights, and the bias terms is shown in the figure.
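The forward pass just described can be sketched as follows. This Python illustration follows the variable layout of the initialization section, so only the hidden nodes carry bias terms, and all names are assumptions rather than the project's Matlab code:

```python
import math

def sigmoid(x):
    # Output function described above: 1 / (1 + e^(-x)).
    return 1.0 / (1.0 + math.exp(-x))

def mlp_output(inputs, w_ih, hidden_bias, w_ho):
    """Forward pass of the three-layer MLP.

    w_ih[j] holds the weights from every input to hidden node j,
    hidden_bias[j] is hidden node j's bias, and w_ho[j] is the weight
    from hidden node j to the single output node."""
    hidden_out = [
        sigmoid(sum(w * x for w, x in zip(w_ih[j], inputs)) + hidden_bias[j])
        for j in range(len(w_ih))
    ]
    return sigmoid(sum(w * h for w, h in zip(w_ho, hidden_out)))
```

Because both layers apply the sigmoid, the final output always lies in (0, 1) and can be compared directly to the 0/1 diabetes outcome.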

Figure: Code snippet where the output of the multilayer perceptron is calculated

8.8 Calculating the Cost of a Generation

The goal of training the MLP is to bring the output of the MLP as close to the expected output as possible. The cost variable tells us how close the output of the MLP is to the expected output. In a generation, the cost is calculated across all the training records and for every member of the population. The error is the difference between the expected output and the output of the MLP. The mean squared error will be calculated for all

the training records and for all members of the population. The code snippet for the cost calculation for every member of the population is shown in the figure.

Figure: Code snippet where the cost for every member of the population is calculated
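The per-member mean squared error cost can be sketched as follows (a Python illustration; `predict` stands in for the MLP forward pass of the previous section, and the function name is an assumption):

```python
def population_costs(population, records, targets, predict):
    """Mean squared error of each population member over the training set.

    predict(member, record) returns the MLP output for that member's weights."""
    costs = []
    for member in population:
        sq_err = sum((predict(member, record) - target) ** 2
                     for record, target in zip(records, targets))
        # Average the squared error over all training records.
        costs.append(sq_err / len(records))
    return costs
```

Both BBO and GA minimize this cost: BBO uses it directly as the (inverse) Habitat Suitability Index, and GA takes its inverse as the fitness score.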

8.9 Training the MLP Using the Biogeography Based Optimizer

In the context of BBO, the Population Size given in the GUI is mapped to the number of islands. In the first step, a set of variables (the weights and biases of the MLP) is initialized for every island. The cost of each island is then calculated using the fitness function. An island with a lower cost has a higher Habitat Suitability Index (HSI), and hence more species exist on that island. If the islands are sorted by cost in ascending order, their respective species counts are in descending order. The code snippet for calculating the species count of all the islands is shown in the figure.

Figure: Code snippet for calculating the species count of all the islands

After the species counts have been initialized, the immigration rate and emigration rate of every island are calculated. The code snippet for calculating the immigration and emigration rates is shown in the figure.

Figure: Code snippet for calculating the immigration and emigration rates

The immigration and emigration rates are used to move species between islands. The code snippet for exchanging species between islands is shown in the figure.

Figure: Code snippet for exchanging species between islands
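These two steps can be sketched in Python following Simon's linear BBO model; the function names, and the choice of maximum species count equal to the number of islands, are assumptions for illustration:

```python
def species_counts(costs, s_max=None):
    # Sort islands by cost ascending; the best island gets the largest
    # species count, so counts run in descending order of cost rank.
    n = len(costs)
    s_max = n if s_max is None else s_max
    order = sorted(range(n), key=lambda i: costs[i])
    counts = [0] * n
    for rank, island in enumerate(order):
        counts[island] = s_max - rank
    return counts

def migration_rates(counts, i_max=1.0, e_max=1.0):
    # Linear model: lambda_k = I * (1 - S_k / S_max), mu_k = E * S_k / S_max.
    # Islands rich in species emigrate a lot and immigrate little.
    s_max = max(counts)
    lam = [i_max * (1.0 - s / s_max) for s in counts]
    mu = [e_max * s / s_max for s in counts]
    return lam, mu
```

The GUI's "Max. Immigration rate" and "Max. Emigration rate" options correspond to the `i_max` and `e_max` factors in this model.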

Based on the fitness function, the worst half of the habitats will be mutated. The code snippet to mutate the worst half of the habitats is shown in the figure.

Figure: Code snippet to mutate the worst half of the habitats

The variables of the two worst performing islands of the current generation will be replaced by the variables of the two best performing islands from the previous generation. The code snippet for replacing the two worst performing islands of the current generation with the two best performing islands from the previous generation is shown in the figure.

Figure: Code snippet to replace worst performers of current generation with best performers of the previous generation

All the above steps will be repeated until the desired number of generations is reached.
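The per-generation steps described above (migration driven by the rates, mutation of the worst half, elitism) might look like this in Python; the population is assumed to be sorted best-to-worst, and all names and bounds are illustrative:

```python
import random

def migrate(population, lam, mu, rng=random):
    # Each gene of island i is replaced, with probability lam[i], by the
    # same gene from an emigrating island chosen in proportion to mu.
    total = sum(mu)
    for i, habitat in enumerate(population):
        for g in range(len(habitat)):
            if rng.random() < lam[i]:
                pick, acc = rng.random() * total, 0.0
                for j, m in enumerate(mu):
                    acc += m
                    if pick <= acc:
                        habitat[g] = population[j][g]
                        break
    return population

def mutate_worst_half(population, p_mut, lower=-1.0, upper=1.0, rng=random):
    # Resample genes of the worst half (population sorted best-to-worst).
    for habitat in population[len(population) // 2:]:
        for g in range(len(habitat)):
            if rng.random() < p_mut:
                habitat[g] = rng.uniform(lower, upper)
    return population

def apply_elitism(population, previous_best):
    # Overwrite the worst members with the elites kept from last generation.
    population[-len(previous_best):] = [h[:] for h in previous_best]
    return population
```

With the GUI's default of two top habitats persisting between generations, `previous_best` would hold the two best habitats of the previous generation.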

8.10 Training the MLP using Genetic Algorithms

During the first stage, the MLP is initialized for all members of the population with random weights and biases. After the initialization, the cost is calculated for each member of the population; the fitness score of a member is the inverse of its cost. The MLP is trained with the GA over multiple generations. In every generation, the current population goes through selection, crossover and mutation to generate the new population. During selection, two members of the population (called parents) are chosen using roulette-wheel selection based on their inverse costs. The code snippet for selecting two members from the current population based on their inverse costs is shown in the figure.

Figure: Code snippet for Selection

The selected members of the population are passed on to the crossover stage. Crossover can be done using one of several techniques, such as single point crossover, multipoint crossover or uniform crossover. The code snippet for single point crossover is

shown in the figure.

Figure: Code snippet for Crossover

The parent population, except for the top performing members (the best performing members that persist from one generation to the next), is replaced with the new members obtained by the crossover. After the crossover, the members of the population, again except for the top performers carried over from the previous generation, are mutated based on the mutation probability. The code snippet for mutating the members of the population is shown in the figure.

Figure: Code snippet for Mutation
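The three GA operators can be sketched in Python (the project's own snippets are Matlab; these function names and gene bounds are assumptions):

```python
import random

def roulette_select(costs, rng=random):
    # Fitness is the inverse of cost; slice sizes on the wheel are
    # proportional to inverse cost, so cheaper members are picked more often.
    fitness = [1.0 / c for c in costs]
    pick, acc = rng.random() * sum(fitness), 0.0
    for i, f in enumerate(fitness):
        acc += f
        if pick <= acc:
            return i
    return len(costs) - 1

def single_point_crossover(a, b, rng=random):
    # Cut both parents at one random point and swap the tails.
    point = rng.randrange(1, len(a))
    return a[:point] + b[point:], b[:point] + a[point:]

def mutate(member, p_mut, lower=-1.0, upper=1.0, rng=random):
    # Resample each gene independently with probability p_mut.
    return [rng.uniform(lower, upper) if rng.random() < p_mut else g
            for g in member]
```

Two calls to `roulette_select` yield the pair of parents; the two children from `single_point_crossover` then pass through `mutate` before joining the new population.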

Mutation finishes the process of generating the new population. All the above steps are performed until the desired number of generations is reached.

8.11 Testing the MLP on the testing dataset

After the training of the MLP has finished, the MLP is tested to determine the classification accuracies of BBO and GA. The classification accuracies are determined on the testing dataset. The input parameters of each row in the testing dataset are sent to the MLP; if the output of the MLP is close to the desired output (the absolute difference between the expected and generated outputs is less than 0.1), the classification is a success, otherwise it is a failure. The code snippet for determining the classification accuracies is shown in the figure.
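The success test described above (output within 0.1 of the expected label) can be sketched as follows; the function name is illustrative:

```python
def classification_rate(outputs, expected, tol=0.1):
    # A row counts as correctly classified when the MLP output lies within
    # tol of the expected label; the rate is reported as a percentage.
    hits = sum(1 for out, exp in zip(outputs, expected) if abs(out - exp) < tol)
    return 100.0 * hits / len(outputs)
```

For outputs [0.05, 0.95, 0.5, 0.99] against labels [0, 1, 1, 1], three of the four rows fall within the 0.1 tolerance, giving a rate of 75%.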

Figure: Code snippet for determining the classification rates

8.12 Displaying the convergence curves & classification rates in a new GUI

After the testing of the MLP has been done and the classification rates have been determined, the cost of the MLP with BBO and GA across the generations is displayed as a convergence curve. The classification rates are displayed in a bar chart for BBO and GA. The code snippet that displays the cost convergence curves and classification rates in the new GUI is shown in the figure.

Figure: Code snippet for displaying the cost convergence curves and classification rates

9 TESTING AND EVALUATION

In this chapter, the functional evaluation of the product is discussed. The application has been tested using Matlab version R2017a.

9.1 Normalizing data before the application is run

The Arizona Pima Indian diabetes dataset is normalized before it can be used to train an MLP. The original dataset is saved as diabetes.csv, and the normalized dataset is saved as diabetesprocessed.csv. All the columns of the dataset are normalized using the Min-Max normalization function. Running the NormalizeData class from the Matlab IDE normalizes the data.

9.2 Running the application from a GUI

The GUI canvas designer is opened by selecting Open from the Matlab window and choosing the file MLP_BBO_GA.mlapp, as shown in the figure.

Figure: Loading the GUI

The file MLP_BBO_GA.mlapp is displayed in the Matlab GUI canvas designer as shown in the figure.

Figure: Matlab Canvas Editor

Clicking Run displays the window where the user inputs the Algorithm and Data Options, as shown in the figure.

Figure: The application GUI

The default parameters of the application are shown in the table.

Table: Default Parameters for the application

Parameter: Value
Habitat Modification Probability (BBO): 1
Initial Mutation Probability (BBO):
Max. Immigration rate for island (BBO): 1
Max. Emigration rate for island (BBO): 1
Lower Bound for immigration probability per gene (BBO): 0
Upper Bound for immigration probability per gene (BBO): 1
No. of top habitats to persist between generations (BBO): 2
Crossover Type (GA): Single Point
Initial Mutation Probability (GA): 0.01
No. of Top individuals to persist between Generations (GA): 2
No. of Data Records (General): 770
No. of Records for Training (General): 693
No. of Input Params (General): 8
No. of Hidden Params (General): 19
Population Size (General): 50
Generations (General): 249

Upon clicking Run, the application starts training the MLP using BBO and GA. After the MLP has been trained with BBO and GA on the training dataset, the remaining records are used to test the MLP. After the testing is done, the classification accuracies are calculated. The cost convergence curves along with the classification accuracies are displayed in a new window as shown in the figure.

Figure: Cost Convergence Curve and Classification Accuracies (MSE vs. generation for BBO and GA, with a bar chart of classification rates per algorithm)

9.3 Running the application from Matlab IDE

The user can also run the application from the Matlab IDE. The starting point of the application is the RunFromTextFile class, and the user can edit the algorithm and data options in this file directly. After editing the options and saving the file, the user can start the application by clicking Run in the toolbar of the Matlab IDE. The application starts, and the MLP is trained using the BBO and GA algorithms on the training dataset. After the MLP has been trained, it is tested on the testing dataset to calculate the classification accuracy. After the testing has finished, the classification accuracies are displayed in the console as shown in the figure.

Figure: Classification Accuracies displayed in Matlab Console

The cost convergence curves and the classification rates for BBO and GA are displayed in a new GUI as shown in the figure.

Figure: Cost Convergence Curve and Classification Accuracies

9.4 Running the application by changing the values of BBO, GA and General parameters

The parameters that have been changed on the GUI are shown in the table.

Table: Parameters that have been changed for the run

Parameter: Value
Habitat Modification Probability (BBO): 0.5 (Default: 1)
Crossover Type (GA): Two Point (Default: Single Point)
No. of Hidden Nodes (General Options): 29 (Default: 19)

The options GUI with the parameter values changed is shown in the figure.

Figure: Options GUI

After the application has finished running with these parameters, the cost convergence curves and classification rates are as shown in the figure.

Figure: Cost Convergence Curves and Classification Accuracies

9.5 Testing the Application with a new set of parameters

The parameters that have been changed on the GUI are displayed in the table.

Table: Parameters that have been changed for the run

Parameter: Value
Max. Immigration rate for island (BBO): 0.5 (Default: 1)
Max. Emigration rate for island (BBO): 0.5 (Default: 1)
Initial Mutation Probability (GA): 0.02 (Default: 0.01)
No. of Generations (General Options): 349 (Default: 249)

The options GUI with the parameter values changed is shown in the figure.

Figure: Options GUI

After the application has finished running with these parameters, the cost convergence curve and classification rates are as shown in the figure.

Figure: Cost Convergence Curve and Classification Accuracies

9.6 Running the application using K-Fold Cross Validation (K-Fold Value as 10)

To run the application using K-Fold cross validation, the variable iskfoldon in the class RunFromTextFile needs to be set to 1. The parameters that need to be set before the run are shown in the table.

Table: Parameters that have been changed for the run

Parameter: Value
iskfoldon: 1 (Default: 0)
numberofkfolds: 10 (Default: 10)

The application is started using Run in the toolbar of the Matlab IDE. The

application starts training the MLP using BBO and GA on the training dataset. After the training has finished, the MLP is tested on the testing dataset to get the classification accuracies. At the end of the K-Fold run, the classification accuracies for the 10 distinct testing datasets, along with the average classification rate, are displayed at the console. The classification accuracies of BBO are displayed in the first column and those of GA in the second column, as shown in the figure.

Figure: Classification Accuracies displayed in the console

9.7 Running the application using K-Fold Cross Validation (K-Fold Value as 5)

To run the application using k-fold cross validation with the dataset split into 5 groups, the parameters that need to be changed in the RunFromTextFile file are shown in the table.

Table: Parameters changed for the run

Parameter: Value
iskfoldon: 1 (Default: 0)

numberofkfolds: 5 (Default: 10)

At the end of the K-Fold run, the classification accuracies for the 5 distinct testing datasets, along with the average classification rate, are displayed at the console as shown in the figure.

Figure: Classification accuracies displayed at the console

9.8 The Comparison of Convergence Curves for the K-Fold Run

The convergence curves have been compared for the K-Fold run with the value of K being 10. The convergence curves for BBO are shown in the figure.

Figure: BBO Cost Convergence Curves (four panels of MSE vs. generation, including MSE - Average and MSE - Best)

The convergence curves for GA are shown in the figure.

Figure: GA Cost Convergence Curves (four panels of MSE vs. generation, including MSE - Average and MSE - Best)

In the images, the top left plot displays the convergence curves for the first 5 runs, and the top right plot displays the convergence curves for the next 5 runs. The bottom left plot displays the average of the convergence curves and the bottom right plot displays the best of the convergence curves.

9.9 The Comparison of Convergence Curves for the Multiple Populations Run

The comparison of convergence curves for runs with population sizes of 50, 100, 200 and 400 is shown in the figure.

Figure: Cost Convergence Curves for Multiple Population Runs

In the image, the top left plot is for the run with population size 50, the top right plot for population size 100, the bottom left plot for population size 200 and the bottom right plot for population size 400.

10 CONCLUSION

We obtained classification results averaging about 71.5% over multiple runs. The results of the K-Fold cross validation are displayed in the table.

Table: K-Fold Cross Validation Results

K-Fold No.  BBO  GA
Average (%)

The best classification accuracies we received are % and % for BBO and GA respectively. For the K-Fold cross validation run, the average classification accuracies are % and % for BBO and GA respectively. The results show that ANNs are indeed a viable method for classification.

11 FUTURE WORK

The optimal values of the BBO options, the GA options and the number of hidden nodes can be found by varying these variables across multiple runs and seeing which values yield the best classification rates for the dataset. The architecture of the ANN is one of the crucial factors that can change the classification accuracies, so multiple architectures can be developed and tested to see if they improve the classification rate.

12 Bibliography

[1] Mathworks, "Matlab," [Online].
[2] Alex, "Neural Networks (Part 2) - Training," [Online].
[3] "k-fold cross-validation," [Online].
[4] Alex, "Neural Networks (Part 1)," [Online].
[5] M. Nielsen, "Neural Networks and Deep Learning," [Online].
[6] D. Simon, "Biogeography-Based Optimization," [Online].
[7] Wikipedia, "Genetic Algorithms," [Online].
[8] H. Temurtas, "A comparative study on diabetes disease diagnosis using neural networks," [Online].
[9] S. Mirjalili, "Let a biogeography-based optimizer train your Multi-Layer Perceptron," [Online].
[10] P. S. Sengupta, "Multi-Class Classification Using Multi-layered Perceptrons," [Online].
[11] Mathworks, "MATLAB App Designer," [Online].
[12] S. Mirjalili, "Neural Networks Projects," [Online].


CSC 396 : Introduction to Artificial Intelligence CSC 396 : Introduction to Artificial Intelligence Exam 1 March 11th - 13th, 2008 Name Signature - Honor Code This is a take-home exam. You may use your book and lecture notes from class. You many not use

More information

Generating an appropriate sound for a video using WaveNet.

Generating an appropriate sound for a video using WaveNet. Australian National University College of Engineering and Computer Science Master of Computing Generating an appropriate sound for a video using WaveNet. COMP 8715 Individual Computing Project Taku Ueki

More information

Meta-Heuristic Approach for Supporting Design-for- Disassembly towards Efficient Material Utilization

Meta-Heuristic Approach for Supporting Design-for- Disassembly towards Efficient Material Utilization Meta-Heuristic Approach for Supporting Design-for- Disassembly towards Efficient Material Utilization Yoshiaki Shimizu *, Kyohei Tsuji and Masayuki Nomura Production Systems Engineering Toyohashi University

More information

Behaviour Patterns Evolution on Individual and Group Level. Stanislav Slušný, Roman Neruda, Petra Vidnerová. CIMMACS 07, December 14, Tenerife

Behaviour Patterns Evolution on Individual and Group Level. Stanislav Slušný, Roman Neruda, Petra Vidnerová. CIMMACS 07, December 14, Tenerife Behaviour Patterns Evolution on Individual and Group Level Stanislav Slušný, Roman Neruda, Petra Vidnerová Department of Theoretical Computer Science Institute of Computer Science Academy of Science of

More information

Vol. 2, No. 6, July 2012 ISSN ARPN Journal of Science and Technology All rights reserved.

Vol. 2, No. 6, July 2012 ISSN ARPN Journal of Science and Technology All rights reserved. Vol., No. 6, July 0 ISSN 5-77 0-0. All rights reserved. Artificial Neuron Based Models for Estimating Shelf Life of Burfi Sumit Goyal, Gyanendra Kumar Goyal, National Dairy Research Institute, Karnal-300

More information

DETECTION AND CLASSIFICATION OF POWER QUALITY DISTURBANCES

DETECTION AND CLASSIFICATION OF POWER QUALITY DISTURBANCES DETECTION AND CLASSIFICATION OF POWER QUALITY DISTURBANCES Ph.D. THESIS by UTKARSH SINGH INDIAN INSTITUTE OF TECHNOLOGY ROORKEE ROORKEE-247 667 (INDIA) OCTOBER, 2017 DETECTION AND CLASSIFICATION OF POWER

More information

Wire Layer Geometry Optimization using Stochastic Wire Sampling

Wire Layer Geometry Optimization using Stochastic Wire Sampling Wire Layer Geometry Optimization using Stochastic Wire Sampling Raymond A. Wildman*, Joshua I. Kramer, Daniel S. Weile, and Philip Christie Department University of Delaware Introduction Is it possible

More information

A comparative study of different feature sets for recognition of handwritten Arabic numerals using a Multi Layer Perceptron

A comparative study of different feature sets for recognition of handwritten Arabic numerals using a Multi Layer Perceptron Proc. National Conference on Recent Trends in Intelligent Computing (2006) 86-92 A comparative study of different feature sets for recognition of handwritten Arabic numerals using a Multi Layer Perceptron

More information

Artificial neural networks in forecasting tourists flow, an intelligent technique to help the economic development of tourism in Albania.

Artificial neural networks in forecasting tourists flow, an intelligent technique to help the economic development of tourism in Albania. Artificial neural networks in forecasting tourists flow, an intelligent technique to help the economic development of tourism in Albania. Dezdemona Gjylapi, MSc, PhD Candidate University Pavaresia Vlore,

More information

Application Of Artificial Neural Network In Fault Detection Of Hvdc Converter

Application Of Artificial Neural Network In Fault Detection Of Hvdc Converter Application Of Artificial Neural Network In Fault Detection Of Hvdc Converter Madhuri S Shastrakar Department of Electrical Engineering, Shree Ramdeobaba College of Engineering and Management, Nagpur,

More information

MATLAB/GUI Simulation Tool for Power System Fault Analysis with Neural Network Fault Classifier

MATLAB/GUI Simulation Tool for Power System Fault Analysis with Neural Network Fault Classifier MATLAB/GUI Simulation Tool for Power System Fault Analysis with Neural Network Fault Classifier Ph Chitaranjan Sharma, Ishaan Pandiya, Dipak Swargari, Kusum Dangi * Department of Electrical Engineering,

More information

Biologically Inspired Embodied Evolution of Survival

Biologically Inspired Embodied Evolution of Survival Biologically Inspired Embodied Evolution of Survival Stefan Elfwing 1,2 Eiji Uchibe 2 Kenji Doya 2 Henrik I. Christensen 1 1 Centre for Autonomous Systems, Numerical Analysis and Computer Science, Royal

More information

2. Simulated Based Evolutionary Heuristic Methodology

2. Simulated Based Evolutionary Heuristic Methodology XXVII SIM - South Symposium on Microelectronics 1 Simulation-Based Evolutionary Heuristic to Sizing Analog Integrated Circuits Lucas Compassi Severo, Alessandro Girardi {lucassevero, alessandro.girardi}@unipampa.edu.br

More information

SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY

SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY SMARTPHONE SENSOR BASED GESTURE RECOGNITION LIBRARY Sidhesh Badrinarayan 1, Saurabh Abhale 2 1,2 Department of Information Technology, Pune Institute of Computer Technology, Pune, India ABSTRACT: Gestures

More information

Creating a Dominion AI Using Genetic Algorithms

Creating a Dominion AI Using Genetic Algorithms Creating a Dominion AI Using Genetic Algorithms Abstract Mok Ming Foong Dominion is a deck-building card game. It allows for complex strategies, has an aspect of randomness in card drawing, and no obvious

More information

COMP SCI 5401 FS2015 A Genetic Programming Approach for Ms. Pac-Man

COMP SCI 5401 FS2015 A Genetic Programming Approach for Ms. Pac-Man COMP SCI 5401 FS2015 A Genetic Programming Approach for Ms. Pac-Man Daniel Tauritz, Ph.D. November 17, 2015 Synopsis The goal of this assignment set is for you to become familiarized with (I) unambiguously

More information

CONSTRUCTION COST PREDICTION USING NEURAL NETWORKS

CONSTRUCTION COST PREDICTION USING NEURAL NETWORKS ISSN: 9-9 (ONLINE) ICTACT JOURNAL ON SOFT COMPUTING, OCTOBER 7, VOLUME: 8, ISSUE: DOI:.97/ijsc.7. CONSTRUCTION COST PREDICTION USING NEURAL NETWORKS Smita K. Magdum and Amol C. Adamuthe Department of Computer

More information

Representation Learning for Mobile Robots in Dynamic Environments

Representation Learning for Mobile Robots in Dynamic Environments Representation Learning for Mobile Robots in Dynamic Environments Olivia Michael Supervised by A/Prof. Oliver Obst Western Sydney University Vacation Research Scholarships are funded jointly by the Department

More information

A Numerical Approach to Understanding Oscillator Neural Networks

A Numerical Approach to Understanding Oscillator Neural Networks A Numerical Approach to Understanding Oscillator Neural Networks Natalie Klein Mentored by Jon Wilkins Networks of coupled oscillators are a form of dynamical network originally inspired by various biological

More information

ARTIFICIAL INTELLIGENCE IN POWER SYSTEMS

ARTIFICIAL INTELLIGENCE IN POWER SYSTEMS ARTIFICIAL INTELLIGENCE IN POWER SYSTEMS Prof.Somashekara Reddy 1, Kusuma S 2 1 Department of MCA, NHCE Bangalore, India 2 Kusuma S, Department of MCA, NHCE Bangalore, India Abstract: Artificial Intelligence

More information

A Novel Multistage Genetic Algorithm Approach for Solving Sudoku Puzzle

A Novel Multistage Genetic Algorithm Approach for Solving Sudoku Puzzle A Novel Multistage Genetic Algorithm Approach for Solving Sudoku Puzzle Haradhan chel, Deepak Mylavarapu 2 and Deepak Sharma 2 Central Institute of Technology Kokrajhar,Kokrajhar, BTAD, Assam, India, PIN-783370

More information

Available online Journal of Scientific and Engineering Research, 2018, 5(5): Review Article

Available online   Journal of Scientific and Engineering Research, 2018, 5(5): Review Article Available online www.saer.com, 2018, 5(5):471-479 Review Article ISSN: 2394-2630 CODEN(USA): JSERBR BBO Tuned PI Control for Three Phase Rectifier Salam Waley Shneen Energy and Renewable Energies Technology

More information

Solving and Analyzing Sudokus with Cultural Algorithms 5/30/2008. Timo Mantere & Janne Koljonen

Solving and Analyzing Sudokus with Cultural Algorithms 5/30/2008. Timo Mantere & Janne Koljonen with Cultural Algorithms Timo Mantere & Janne Koljonen University of Vaasa Department of Electrical Engineering and Automation P.O. Box, FIN- Vaasa, Finland timan@uwasa.fi & jako@uwasa.fi www.uwasa.fi/~timan/sudoku

More information

A Hybrid Neuro Genetic Approach for Analyzing Dissolved Gases in Power Transformers

A Hybrid Neuro Genetic Approach for Analyzing Dissolved Gases in Power Transformers A Hybrid Neuro Genetic Approach for Analyzing Dissolved Gases in Power Transformers Alamuru Vani. 1, Dr. Pessapaty Sree Rama Chandra Murthy 2 Associate Professor, Department of Electrical Engineering,

More information

Optimizing the State Evaluation Heuristic of Abalone using Evolutionary Algorithms

Optimizing the State Evaluation Heuristic of Abalone using Evolutionary Algorithms Optimizing the State Evaluation Heuristic of Abalone using Evolutionary Algorithms Benjamin Rhew December 1, 2005 1 Introduction Heuristics are used in many applications today, from speech recognition

More information

LANDSCAPE SMOOTHING OF NUMERICAL PERMUTATION SPACES IN GENETIC ALGORITHMS

LANDSCAPE SMOOTHING OF NUMERICAL PERMUTATION SPACES IN GENETIC ALGORITHMS LANDSCAPE SMOOTHING OF NUMERICAL PERMUTATION SPACES IN GENETIC ALGORITHMS ABSTRACT The recent popularity of genetic algorithms (GA s) and their application to a wide range of problems is a result of their

More information

Shuffled Complex Evolution

Shuffled Complex Evolution Shuffled Complex Evolution Shuffled Complex Evolution An Evolutionary algorithm That performs local and global search A solution evolves locally through a memetic evolution (Local search) This local search

More information

IMPLEMENTATION OF NEURAL NETWORK IN ENERGY SAVING OF INDUCTION MOTOR DRIVES WITH INDIRECT VECTOR CONTROL

IMPLEMENTATION OF NEURAL NETWORK IN ENERGY SAVING OF INDUCTION MOTOR DRIVES WITH INDIRECT VECTOR CONTROL IMPLEMENTATION OF NEURAL NETWORK IN ENERGY SAVING OF INDUCTION MOTOR DRIVES WITH INDIRECT VECTOR CONTROL * A. K. Sharma, ** R. A. Gupta, and *** Laxmi Srivastava * Department of Electrical Engineering,

More information

Coevolution and turnbased games

Coevolution and turnbased games Spring 5 Coevolution and turnbased games A case study Joakim Långberg HS-IKI-EA-05-112 [Coevolution and turnbased games] Submitted by Joakim Långberg to the University of Skövde as a dissertation towards

More information

Adaptive Hybrid Channel Assignment in Wireless Mobile Network via Genetic Algorithm

Adaptive Hybrid Channel Assignment in Wireless Mobile Network via Genetic Algorithm Adaptive Hybrid Channel Assignment in Wireless Mobile Network via Genetic Algorithm Y.S. Chia Z.W. Siew A. Kiring S.S. Yang K.T.K. Teo Modelling, Simulation and Computing Laboratory School of Engineering

More information

MAGNT Research Report (ISSN ) Vol.6(1). PP , Controlling Cost and Time of Construction Projects Using Neural Network

MAGNT Research Report (ISSN ) Vol.6(1). PP , Controlling Cost and Time of Construction Projects Using Neural Network Controlling Cost and Time of Construction Projects Using Neural Network Li Ping Lo Faculty of Computer Science and Engineering Beijing University China Abstract In order to achieve optimized management,

More information

Contents 1 Introduction Optical Character Recognition Systems Soft Computing Techniques for Optical Character Recognition Systems

Contents 1 Introduction Optical Character Recognition Systems Soft Computing Techniques for Optical Character Recognition Systems Contents 1 Introduction.... 1 1.1 Organization of the Monograph.... 1 1.2 Notation.... 3 1.3 State of Art.... 4 1.4 Research Issues and Challenges.... 5 1.5 Figures.... 5 1.6 MATLAB OCR Toolbox.... 5 References....

More information

Evolutionary Design of Multilayer and Radial Basis Function Neural Network Classifiers: an Empirical Comparison

Evolutionary Design of Multilayer and Radial Basis Function Neural Network Classifiers: an Empirical Comparison 86 IJCSNS International Journal of Computer Science and Network Security, VOL.16 No.6, June 2016 Evolutionary Design of Multilayer and Radial Basis Function Neural Network Classifiers: an Empirical Comparison

More information

Prediction of Breathing Patterns Using Neural Networks

Prediction of Breathing Patterns Using Neural Networks Virginia Commonwealth University VCU Scholars Compass Theses and Dissertations Graduate School 2008 Prediction of Breathing Patterns Using Neural Networks Pavani Davuluri Virginia Commonwealth University

More information

FINANCIAL TIME SERIES FORECASTING USING A HYBRID NEURAL- EVOLUTIVE APPROACH

FINANCIAL TIME SERIES FORECASTING USING A HYBRID NEURAL- EVOLUTIVE APPROACH FINANCIAL TIME SERIES FORECASTING USING A HYBRID NEURAL- EVOLUTIVE APPROACH JUAN J. FLORES 1, ROBERTO LOAEZA 1, HECTOR RODRIGUEZ 1, FEDERICO GONZALEZ 2, BEATRIZ FLORES 2, ANTONIO TERCEÑO GÓMEZ 3 1 Division

More information

Transient stability Assessment using Artificial Neural Network Considering Fault Location

Transient stability Assessment using Artificial Neural Network Considering Fault Location Vol.6 No., 200 مجلد 6, العدد, 200 Proc. st International Conf. Energy, Power and Control Basrah University, Basrah, Iraq 0 Nov. to 2 Dec. 200 Transient stability Assessment using Artificial Neural Network

More information

Session 124TS, A Practical Guide to Machine Learning for Actuaries. Presenters: Dave M. Liner, FSA, MAAA, CERA

Session 124TS, A Practical Guide to Machine Learning for Actuaries. Presenters: Dave M. Liner, FSA, MAAA, CERA Session 124TS, A Practical Guide to Machine Learning for Actuaries Presenters: Dave M. Liner, FSA, MAAA, CERA SOA Antitrust Disclaimer SOA Presentation Disclaimer A practical guide to machine learning

More information

Introduction to Genetic Algorithms

Introduction to Genetic Algorithms Introduction to Genetic Algorithms Peter G. Anderson, Computer Science Department Rochester Institute of Technology, Rochester, New York anderson@cs.rit.edu http://www.cs.rit.edu/ February 2004 pg. 1 Abstract

More information

DRILLING RATE OF PENETRATION PREDICTION USING ARTIFICIAL NEURAL NETWORK: A CASE STUDY OF ONE OF IRANIAN SOUTHERN OIL FIELDS

DRILLING RATE OF PENETRATION PREDICTION USING ARTIFICIAL NEURAL NETWORK: A CASE STUDY OF ONE OF IRANIAN SOUTHERN OIL FIELDS 21 UDC 622.244.6.05:681.3.06. DRILLING RATE OF PENETRATION PREDICTION USING ARTIFICIAL NEURAL NETWORK: A CASE STUDY OF ONE OF IRANIAN SOUTHERN OIL FIELDS Mehran Monazami MSc Student, Ahwaz Faculty of Petroleum,

More information

Exercise 4 Exploring Population Change without Selection

Exercise 4 Exploring Population Change without Selection Exercise 4 Exploring Population Change without Selection This experiment began with nine Avidian ancestors of identical fitness; the mutation rate is zero percent. Since descendants can never differ in

More information

Genetic Algorithms with Heuristic Knight s Tour Problem

Genetic Algorithms with Heuristic Knight s Tour Problem Genetic Algorithms with Heuristic Knight s Tour Problem Jafar Al-Gharaibeh Computer Department University of Idaho Moscow, Idaho, USA Zakariya Qawagneh Computer Department Jordan University for Science

More information

Classification Experiments for Number Plate Recognition Data Set Using Weka

Classification Experiments for Number Plate Recognition Data Set Using Weka Classification Experiments for Number Plate Recognition Data Set Using Weka Atul Kumar 1, Sunila Godara 2 1 Department of Computer Science and Engineering Guru Jambheshwar University of Science and Technology

More information

DISTRIBUTED BIOGEOGRAPHY BASED OPTIMIZATION FOR MOBILE ROBOTS

DISTRIBUTED BIOGEOGRAPHY BASED OPTIMIZATION FOR MOBILE ROBOTS DISTRIBUTED BIOGEOGRAPHY BASED OPTIMIZATION FOR MOBILE ROBOTS Arpit Shah Bachelor of Science in Electronics and Communication Engineering HNGU North Gujarat University May, 2009 Submitted in partial fulfillment

More information

Use of Neural Networks in Testing Analog to Digital Converters

Use of Neural Networks in Testing Analog to Digital Converters Use of Neural s in Testing Analog to Digital Converters K. MOHAMMADI, S. J. SEYYED MAHDAVI Department of Electrical Engineering Iran University of Science and Technology Narmak, 6844, Tehran, Iran Abstract:

More information

CandyCrush.ai: An AI Agent for Candy Crush

CandyCrush.ai: An AI Agent for Candy Crush CandyCrush.ai: An AI Agent for Candy Crush Jiwoo Lee, Niranjan Balachandar, Karan Singhal December 16, 2016 1 Introduction Candy Crush, a mobile puzzle game, has become very popular in the past few years.

More information

Available online at ScienceDirect. Procedia Computer Science 85 (2016 )

Available online at   ScienceDirect. Procedia Computer Science 85 (2016 ) Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 85 (2016 ) 263 270 International Conference on Computational Modeling and Security (CMS 2016) Proposing Solution to XOR

More information