**Commenced:** January 2007

**Frequency:** Monthly

**Edition:** International

**Paper Count:** 2903

# Search results for: neural network

##### 2903 A Combined Neural Network Approach to Soccer Player Prediction

**Authors:**
Wenbin Zhang,
Hantian Wu,
Jian Tang

**Abstract:**

An artificial neural network is a mathematical model inspired by biological neural networks. There are several kinds of neural networks, and they are widely used in many areas, such as prediction, detection, and classification. Meanwhile, in day-to-day life, people must often make difficult decisions. For example, the coach of a soccer club has to decide which offensive player to select for a given game. This work describes a novel neural network that combines the General Regression Neural Network and the Probabilistic Neural Network to help a soccer coach make an informed decision.
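The combination described above can be sketched in a few lines. The following is a hedged illustration only, not the authors' implementation: a GRNN (kernel-weighted regression) estimates a player rating while a PNN (per-class average kernel activation) predicts the selection decision, sharing one Gaussian kernel. The player features, ratings, and labels are invented for the example.

```python
import numpy as np

def gaussian_kernel(x, X, sigma):
    # Squared Euclidean distance from the query x to every stored sample in X.
    d2 = np.sum((X - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def grnn_predict(x, X, y, sigma=0.5):
    """General Regression Neural Network: kernel-weighted average of targets."""
    w = gaussian_kernel(x, X, sigma)
    return float(np.dot(w, y) / (np.sum(w) + 1e-12))

def pnn_predict(x, X, labels, sigma=0.5):
    """Probabilistic Neural Network: average kernel activation per class."""
    scores = {c: gaussian_kernel(x, X[labels == c], sigma).mean()
              for c in np.unique(labels)}
    return max(scores, key=scores.get)

# Hypothetical player features: (goals per game, pass accuracy), scaled to [0, 1].
X = np.array([[0.9, 0.8], [0.8, 0.9], [0.2, 0.3], [0.1, 0.4]])
rating = np.array([9.0, 8.5, 4.0, 3.5])     # invented past match ratings
selected = np.array([1, 1, 0, 0])           # 1 = selected, 0 = benched

query = np.array([0.85, 0.85])              # candidate player
print(pnn_predict(query, X, selected))      # predicted selection class
print(round(grnn_predict(query, X, rating), 2))  # predicted rating
```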

**Keywords:**
General Regression Neural Network,
Probabilistic Neural Networks,
Neural function.

##### 2902 A Literature Survey of Neural Network Applications for Shunt Active Power Filters

**Authors:**
S. Janpong,
K-L. Areerak,
K-N. Areerak

**Abstract:**

**Keywords:**
Active power filter,
neural network,
harmonic distortion,
harmonic detection and compensation,
non-linear load.

##### 2901 Applications of Cascade Correlation Neural Networks for Cipher System Identification

**Authors:**
B. Chandra,
P. Paul Varghese

**Abstract:**

Cryptosystem identification is one of the challenging tasks in cryptanalysis. This paper discusses the possibility of employing neural networks for the identification of cipher systems from ciphertexts. A Cascade Correlation Neural Network and a Back Propagation Network were employed for the identification task. A very large collection of ciphertexts was generated using a block cipher (Enhanced RC6) and a stream cipher (SEAL). Promising accuracy was obtained with both neural network models, but the Cascade Correlation Neural Network performed better than the Back Propagation Network.

**Keywords:**
Back Propagation Neural Networks,
Cascade Correlation Neural Network,
Crypto systems,
Block Cipher,
Stream Cipher.

##### 2900 Spline Basis Neural Network Algorithm for Numerical Integration

**Authors:**
Lina Yan,
Jingjing Di,
Ke Wang

**Abstract:**

A new basis function neural network algorithm is proposed for numerical integration. The main idea is to construct a neural network model based on spline basis functions, which approximates the integrand by training the network weights. The convergence theorem of the neural network algorithm, the theorem for numerical integration, and one corollary are presented and proved. Numerical examples, compared with other methods, show that the algorithm is effective, offers high precision, and does not require the integrand to be known in closed form. Thus, the algorithm presented in this paper can be widely applied in many engineering fields.

**Keywords:**
Numerical integration,
Spline basis function,
Neural network algorithm.

##### 2899 Investigation of Artificial Neural Networks Performance to Predict Net Heating Value of Crude Oil by Its Properties

**Authors:**
Mousavian,
M. Moghimi Mofrad,
M. H. Vakili,
D. Ashouri,
R. Alizadeh

**Abstract:**

The aim of this research is to use artificial neural network computing technology to estimate the net heating value (NHV) of crude oil from its properties. The approach is based on training a neural network simulator that uses back-propagation as the learning algorithm on a predefined range of analytically generated well-test responses. A network with 8 neurons in one hidden layer was selected, and its predictions show good agreement with experimental data.
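A hedged sketch of a one-hidden-layer, 8-neuron back-propagation network of the kind described; synthetic data stand in for the crude-oil properties and NHV values, which are not given here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: one input property mapped through a smooth nonlinear curve.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(2 * X) + 0.05 * rng.normal(size=(200, 1))

# One hidden layer of 8 tanh neurons, linear output, trained by back-propagation.
W1 = rng.normal(scale=0.5, size=(1, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

for epoch in range(3000):
    H = np.tanh(X @ W1 + b1)                          # forward pass
    err = (H @ W2 + b2) - y
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)  # output-layer gradients
    dH = (err @ W2.T) * (1 - H ** 2)                  # back-propagated error signal
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float((err ** 2).mean())
print(mse)  # mean squared error after training
```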

**Keywords:**
Neural Network,
Net Heating Value,
Crude Oil,
Experimental,
Modeling.

##### 2898 Avoiding Catastrophic Forgetting by a Dual-Network Memory Model Using a Chaotic Neural Network

**Authors:**
Motonobu Hattori

**Abstract:**

In neural networks, when new patterns are learned, the new information radically interferes with previously stored patterns. This drawback is called catastrophic forgetting or catastrophic interference. In this paper, we propose a biologically inspired neural network model which overcomes this problem. The proposed model consists of two distinct networks: one is a Hopfield-type chaotic associative memory and the other is a multilayer neural network. We consider that these networks correspond to the hippocampus and the neocortex of the brain, respectively. Incoming information is first stored in the hippocampal network with a fast learning algorithm. The stored information is then recalled through the chaotic behavior of each neuron in the hippocampal network. Finally, it is consolidated in the neocortical network by using pseudopatterns. Computer simulation results show that the proposed model has a much better ability to avoid catastrophic forgetting than conventional models.

**Keywords:**
catastrophic forgetting,
chaotic neural network,
complementary learning systems,
dual-network

##### 2897 Optimum Neural Network Architecture for Precipitation Prediction of Myanmar

**Authors:**
Khaing Win Mar,
Thinn Thu Naing

**Abstract:**

Nowadays, precipitation prediction is required for proper planning and management of water resources. Prediction with neural network models has received increasing interest in various research and application domains. However, it is difficult to determine the best neural network architecture for prediction, since it is not immediately obvious how many input or hidden nodes should be used in the model. In this paper, a neural network model is used as a forecasting tool. The major aim is to evaluate a suitable neural network model for monthly precipitation mapping of Myanmar. Using 3-layered neural network models, 100 cases are tested by varying the number of input and hidden nodes from 1 to 10, respectively, with a single output node. The optimum model is selected as the one with the minimum forecast error. Measuring network performance with Root Mean Square Error (RMSE), experimental results show that a 3-input, 10-hidden, 1-output architecture gives the best prediction result for monthly precipitation in Myanmar.
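The selection procedure — trying input and hidden node counts from 1 to 10 and keeping the minimum-RMSE architecture — can be sketched as follows. This is an illustration under stated assumptions: a synthetic standardized seasonal series stands in for the precipitation data, and a random-feature network with least-squares output weights stands in for the paper's trained 3-layered network.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a standardized monthly series: seasonal cycle plus noise.
t = np.arange(240)
series = np.sin(2 * np.pi * t / 12) + 0.1 * rng.normal(size=t.size)

def make_lagged(series, n_inputs):
    # Row j holds n_inputs consecutive values; the target is the next value.
    X = np.column_stack([series[i:len(series) - n_inputs + i] for i in range(n_inputs)])
    return X, series[n_inputs:]

def fit_predict(Xtr, ytr, Xte, n_hidden, rng):
    # Random-feature network: random tanh hidden layer, least-squares output weights.
    d = Xtr.shape[1]
    W = rng.normal(size=(d, n_hidden)) / np.sqrt(d)
    b = rng.normal(size=n_hidden)
    w, *_ = np.linalg.lstsq(np.tanh(Xtr @ W + b), ytr, rcond=None)
    return np.tanh(Xte @ W + b) @ w

best = None
for n_in in range(1, 11):           # 1..10 input nodes
    X, y = make_lagged(series, n_in)
    split = int(0.8 * len(y))
    for n_hid in range(1, 11):      # 1..10 hidden nodes -> 100 cases in total
        pred = fit_predict(X[:split], y[:split], X[split:], n_hid, rng)
        rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
        if best is None or rmse < best[0]:
            best = (rmse, n_in, n_hid)

print(best)  # (minimum RMSE, input nodes, hidden nodes)
```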

**Keywords:**
Precipitation prediction,
monthly precipitation,
neural network models,
Myanmar.

##### 2896 Some Remarkable Properties of a Hopfield Neural Network with Time Delay

**Authors:**
Kelvin Rozier,
Vladimir E. Bondarenko

**Abstract:**

**Keywords:**
Chaos,
Hopfield neural network,
noise,
synchronization

##### 2895 Development of Gas Chromatography Model: Propylene Concentration Using Neural Network

**Authors:**
Areej Babiker Idris Babiker,
Rosdiazli Ibrahim

**Abstract:**

**Keywords:**
Analyzer,
Levenberg-Marquardt,
Gas chromatography,
Neural network

##### 2894 Efficient System for Speech Recognition using General Regression Neural Network

**Authors:**
Abderrahmane Amrouche,
Jean Michel Rouvaen

**Abstract:**

**Keywords:**
Speech Recognition,
General Regression Neural Network,
Hidden Markov Model,
Recurrent Neural Network,
Arabic Digits.

##### 2893 Identify Features and Parameters to Devise an Accurate Intrusion Detection System Using Artificial Neural Network

**Authors:**
Saman M. Abdulla,
Najla B. Al-Dabagh,
Omar Zakaria

**Abstract:**

The aim of this article is to explain how attack features can be extracted from packets, how vectors can be built from them, and how those vectors are applied to the input of an analysis stage. For the analysis, the work deploys a feed-forward back-propagation neural network acting as a misuse intrusion detection system, using ten types of attacks as examples for training and testing. The work shows how selecting the right features, building correct vectors, and correctly choosing the training method and the number of nodes in the hidden layer affect the accuracy of the system. In addition, it shows how to obtain optimal weight values and use them to initialize the artificial neural network.
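As one hedged example of turning a packet into a fixed-length input vector (the article's actual feature set is not reproduced here; the histogram-plus-length scheme below is invented for illustration):

```python
import numpy as np

def packet_to_vector(payload: bytes, n_bins: int = 16) -> np.ndarray:
    """Turn raw packet bytes into a fixed-length, normalized feature vector.

    A simple illustrative scheme: a histogram of byte values folded into n_bins,
    plus the scaled payload length.
    """
    counts = np.zeros(n_bins)
    for byte in payload:                      # iterating bytes yields ints 0..255
        counts[byte * n_bins // 256] += 1
    hist = counts / max(len(payload), 1)      # normalize so features lie in [0, 1]
    length = min(len(payload) / 1500.0, 1.0)  # scale by a typical MTU
    return np.concatenate([hist, [length]])

vec = packet_to_vector(b"GET /index.html HTTP/1.1\r\nHost: example.com\r\n\r\n")
print(vec.shape)  # (17,)
```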

**Keywords:**
Artificial Neural Network,
Attack Features,
Misuse Intrusion Detection System,
Training Parameters.

##### 2892 Complex-Valued Neural Network in Image Recognition: A Study on the Effectiveness of Radial Basis Function

**Authors:**
Anupama Pande,
Vishik Goel

**Abstract:**

A complex-valued neural network is a neural network which has complex-valued inputs and/or weights and/or thresholds and/or activation functions. Complex-valued neural networks have been widening the scope of applications not only in electronics and informatics, but also in social systems. One of the most important applications of the complex-valued neural network is image and vision processing. In neural networks, radial basis functions are often used for interpolation in multidimensional space. A radial basis function is a function which has built into it a distance criterion with respect to a centre. Radial basis functions have often been applied in neural networks, where they may replace the sigmoid hidden-layer transfer characteristic in multi-layer perceptrons. This paper presents exhaustive results of using RBF units in a complex-valued neural network model that uses the back-propagation algorithm (called 'Complex-BP') for learning. Our experimental results demonstrate the effectiveness of a radial basis function in a complex-valued neural network for image recognition over a real-valued neural network. We study and report various observations, such as the effect of learning rates, the ranges of the randomly selected initial weights, the error functions used, and the number of iterations needed for the error to converge on a neural network model with RBF units. Some inherent properties of this complex back-propagation algorithm are also studied and discussed.

**Keywords:**
Complex valued neural network,
Radial Basis Function,
Image recognition.

##### 2891 Application of Neural Networks in Financial Data Mining

**Authors:**
Defu Zhang,
Qingshan Jiang,
Xin Li

**Abstract:**

This paper deals with the application of a well-known neural network technique, the multilayer back-propagation (BP) neural network, to financial data mining. A modified neural network forecasting model is presented, and an intelligent mining system is developed. The system forecasts buying and selling signals according to the predicted future trend of the stock market, and provides decision support for stock investors. A simulation over seven years of the Shanghai Composite Index shows that the return achieved by this mining system is about three times that achieved by a buy-and-hold strategy. It is therefore advantageous to apply neural networks to forecasting financial time series, and different investors could benefit from it.
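The comparison against buy-and-hold can be sketched as a simple backtest. Everything below is illustrative: the price series is synthetic and the "forecasts" are a placeholder with a fixed hit rate, not the paper's BP network.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic daily closing prices (random walk with drift) as a stand-in series.
daily = rng.normal(loc=0.0003, scale=0.01, size=1500)
prices = 100 * np.cumprod(1 + daily)

def backtest(prices, predicted_up):
    """Hold the index only on days the model predicts an up move; otherwise stay in cash."""
    rets = np.diff(prices) / prices[:-1]
    strategy = np.where(predicted_up, rets, 0.0)
    return float(np.prod(1 + strategy)), float(np.prod(1 + rets))

# Placeholder forecasts: a hypothetical model that calls the sign correctly 55% of the time.
truth = np.diff(prices) > 0
predicted_up = np.where(rng.random(truth.size) < 0.55, truth, ~truth)

strategy_growth, buy_hold_growth = backtest(prices, predicted_up)
print(strategy_growth, buy_hold_growth)  # growth factor of each approach
```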

**Keywords:**
Data mining,
neural network,
stock forecasting.

##### 2890 Complex-Valued Neural Network in Signal Processing: A Study on the Effectiveness of Complex Valued Generalized Mean Neuron Model

**Authors:**
Anupama Pande,
Ashok Kumar Thakur,
Swapnoneel Roy

**Abstract:**

**Keywords:**
Complex valued neural network,
Generalized Mean neuron model,
Signal processing.

##### 2889 Bayesian Deep Learning Algorithms for Classifying COVID-19 Images

**Authors:**
I. Oloyede

**Abstract:**

**Keywords:**
BCNN,
CNN,
Images,
COVID-19,
Deep Learning.

##### 2888 Accelerating Integer Neural Networks On Low Cost DSPs

**Authors:**
Thomas Behan,
Zaiyi Liao,
Lian Zhao,
Chunting Yang

**Abstract:**

**Keywords:**
Digital Signal Processor (DSP),
Integer Neural Network (INN),
Low Cost Neural Network,
Integer Neural Network DSP Implementation.

##### 2887 Optimizing the Probabilistic Neural Network Training Algorithm for Multi-Class Identification

**Authors:**
Abdelhadi Lotfi,
Abdelkader Benyettou

**Abstract:**

In this work, a training algorithm for probabilistic neural networks (PNN) is presented. The algorithm addresses one of the major drawbacks of PNN, which is the size of the hidden layer in the network. By using a cross-validation training algorithm, the number of hidden neurons is shrunk to a smaller number consisting of the most representative samples of the training set. This is done without affecting the overall architecture of the network. Performance of the network is compared against performance of standard PNN for different databases from the UCI database repository. Results show an important gain in network size and performance.
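A hedged sketch of the core idea — shrinking a PNN's hidden layer to the most representative training samples without changing the architecture. A simple greedy validation-based pruning stands in for the paper's cross-validation algorithm, and the two-blob data are synthetic.

```python
import numpy as np

def pnn_scores(x, protos, labels, sigma):
    # Gaussian kernel activation of every hidden neuron (one per stored sample).
    k = np.exp(-np.sum((protos - x) ** 2, axis=1) / (2 * sigma ** 2))
    return {c: k[labels == c].mean() for c in np.unique(labels)}

def pnn_classify(X, protos, labels, sigma=0.3):
    preds = []
    for x in X:
        scores = pnn_scores(x, protos, labels, sigma)
        preds.append(max(scores, key=scores.get))
    return np.array(preds)

def prune_prototypes(Xtr, ytr, Xval, yval, sigma=0.3):
    """Greedily drop hidden neurons whose removal does not hurt validation accuracy."""
    keep = np.ones(len(Xtr), dtype=bool)
    base = np.mean(pnn_classify(Xval, Xtr, ytr, sigma) == yval)
    for i in range(len(Xtr)):
        keep[i] = False
        acc = np.mean(pnn_classify(Xval, Xtr[keep], ytr[keep], sigma) == yval)
        if acc < base:
            keep[i] = True   # this sample is representative; restore it
    return keep

rng = np.random.default_rng(2)
X0 = rng.normal([0.0, 0.0], 0.3, size=(20, 2))   # synthetic class-0 blob
X1 = rng.normal([2.0, 2.0], 0.3, size=(20, 2))   # synthetic class-1 blob
Xtr = np.vstack([X0[:15], X1[:15]]); ytr = np.array([0] * 15 + [1] * 15)
Xval = np.vstack([X0[15:], X1[15:]]); yval = np.array([0] * 5 + [1] * 5)

keep = prune_prototypes(Xtr, ytr, Xval, yval)
print(int(keep.sum()), "of", len(keep), "hidden neurons kept")
```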

**Keywords:**
Classification,
probabilistic neural networks,
network optimization,
pattern recognition.

##### 2886 Inverse Problem Methodology for the Measurement of the Electromagnetic Parameters Using MLP Neural Network

**Authors:**
T. Hacib,
M. R. Mekideche,
N. Ferkha

**Abstract:**

**Keywords:**
Inverse problem,
MLP neural network,
parameters identification,
FEM.

##### 2885 A Cognitive Model for Frequency Signal Classification

**Authors:**
Rui Antunes,
Fernando V. Coito

**Abstract:**

**Keywords:**
Neural Networks,
Signal Classification,
Adaptive Filters,
Cognitive Neuroscience

##### 2884 Comparison between Beta Wavelets Neural Networks, RBF Neural Networks and Polynomial Approximation for 1D, 2D Functions Approximation

**Authors:**
Wajdi Bellil,
Chokri Ben Amar,
Adel M. Alimi

**Abstract:**

This paper proposes a comparison between wavelet neural networks (WNN), RBF neural networks, and polynomial approximation in terms of 1-D and 2-D function approximation. We present a novel wavelet neural network, based on Beta wavelets, for 1-D and 2-D function approximation. Our purpose is to approximate an unknown function f: R^n → R from scattered samples (x_i, y_i = f(x_i)), i = 1, …, n, where, first, we have little a priori knowledge of the unknown function f: it lives in some infinite-dimensional smooth function space; and second, the function approximation process is performed iteratively: each new measurement of the function (x_i, f(x_i)) is used to compute a new estimate f̂ as an approximation of the function f. Simulation results are demonstrated to validate the generalization ability and efficiency of the proposed Beta wavelet network.

**Keywords:**
Beta wavelets networks,
RBF neural network,
training algorithms,
MSE,
1-D, 2-D function approximation.

##### 2883 Margin-Based Feed-Forward Neural Network Classifiers

**Authors:**
Han Xiao,
Xiaoyan Zhu

**Abstract:**

**Keywords:**
Max-Margin Principle,
Feed-Forward Neural Network,
Classifier.

##### 2882 Facial Emotion Recognition with Convolutional Neural Network Based Architecture

**Authors:**
Koray U. Erbas

**Abstract:**

Neural networks are appealing for many applications since they are able to learn complex non-linear relationships between input and output data. As the number of neurons and layers in a neural network increases, it becomes possible to represent more complex relationships with automatically extracted features. Nowadays, Deep Neural Networks (DNNs) are widely used in computer vision problems such as classification, object detection, segmentation, and image editing. In this work, the facial emotion recognition task is performed by the proposed Convolutional Neural Network (CNN)-based DNN architecture using the FER2013 dataset. Moreover, the effects of different hyperparameters (activation function, kernel size, initializer, batch size, and network size) are investigated, and ablation study results for the pooling layer, dropout, and batch normalization are presented.

**Keywords:**
Convolutional Neural Network,
Deep Learning,
Deep Learning Based FER,
Facial Emotion Recognition.

##### 2881 Neural Network Controller for Mobile Robot Motion Control

**Authors:**
Jasmin Velagic,
Nedim Osmic,
Bakir Lacevic

**Abstract:**

**Keywords:**
Mobile robot,
kinematic model,
neural network,
motion control,
adaptive learning rate.

##### 2880 Application of Functional Network to Solving Classification Problems

**Authors:**
Yong-Quan Zhou,
Deng-Xu He,
Zheng Nong

**Abstract:**

**Keywords:**
Functional network,
neural network,
XOR problem,
classification,
numerical analysis method.

##### 2879 Sociological Impact on Education: An Analytical Approach through Artificial Neural Network

**Authors:**
P. R. Jayathilaka,
K.L. Jayaratne,
H.L. Premaratne

**Abstract:**

**Keywords:**
Education,
Fuzzy,
neural network,
prediction,
Sociology

##### 2878 Neural Network Based Predictive DTC Algorithm for Induction Motors

**Authors:**
N. Vahdatifar,
Ss. Mortazavi,
R. Kianinezhad

**Abstract:**

**Keywords:**
Neural Networks,
Predictive DTC

##### 2877 Research on Reservoir Lithology Prediction Based on Residual Neural Network and Squeeze-and-Excitation Neural Network

**Authors:**
Li Kewen,
Su Zhaoxin,
Wang Xingmou,
Zhu Jian Bing

**Abstract:**

Conventional reservoir prediction methods are not sufficient to explore the implicit relations between seismic attributes, so data utilization is low. In order to improve the classification accuracy of reservoir lithology prediction, this paper proposes a deep learning lithology prediction method based on ResNet (Residual Neural Network) and SENet (Squeeze-and-Excitation Neural Network). The neural network model is built and trained on seismic attribute data and lithology data from the Shengli oilfield, and a nonlinear mapping between seismic attributes and lithology markers is established. The experimental results show that this method can significantly improve the classification of reservoir lithology, with a classification accuracy close to 70%. This study can effectively predict the lithology of undrilled areas and provide support for exploration and development.
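For reference, the channel-recalibration step that gives SENet its name can be sketched as a plain forward pass. This is a NumPy stand-in for illustration only; the weights are random, not a trained model.

```python
import numpy as np

def se_block(feature_maps, W1, W2):
    """Squeeze-and-Excitation: reweight channels by globally pooled statistics.

    feature_maps: (C, H, W) activations; W1: (C, C//r); W2: (C//r, C).
    """
    squeeze = feature_maps.mean(axis=(1, 2))          # global average pool -> (C,)
    hidden = np.maximum(0.0, squeeze @ W1)            # bottleneck FC + ReLU
    scale = 1.0 / (1.0 + np.exp(-(hidden @ W2)))      # FC + sigmoid gate -> (C,)
    return feature_maps * scale[:, None, None]        # channel-wise recalibration

rng = np.random.default_rng(3)
C, r = 8, 4
x = rng.normal(size=(C, 6, 6))                        # dummy feature maps
out = se_block(x, rng.normal(size=(C, C // r)), rng.normal(size=(C // r, C)))
print(out.shape)  # (8, 6, 6)
```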

**Keywords:**
Convolutional neural network,
lithology,
prediction of reservoir lithology,
seismic attributes.

##### 2876 Nonlinear Adaptive PID Control for a Semi-Batch Reactor Based On an RBF Network

**Authors:**
Magdi M. Nabi,
Ding-Li Yu

**Abstract:**

Control of a semi-batch polymerization reactor using an adaptive radial basis function (RBF) neural network method is investigated in this paper. A neural network inverse model is used to estimate the valve position of the reactor; this method identifies the controlled system with an RBF neural network identifier. The weights of the adaptive PID controller are adjusted in a timely manner based on the identification of the plant and the self-learning capability of the RBFNN. A PID controller is used in the feedback loop to regulate the actual temperature by compensating the output of the neural network inverse model. Simulation results show that the proposed control has strong adaptability and robustness, and that satisfactory control of the nonlinear system is achieved.
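The identification half of such a scheme — an RBF network adapting its weights online to a plant from input-output data — can be sketched as follows. A toy first-order plant stands in for the Chylla-Haase reactor, and the gradient update is a standard LMS-style rule, not necessarily the authors'.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy nonlinear plant as a stand-in for the reactor dynamics.
def plant(y, u):
    return 0.8 * y + 0.3 * np.tanh(u)

# RBF identifier: fixed Gaussian centres over the (y, u) plane, adaptive weights.
centres = np.array([[a, b] for a in np.linspace(-1, 1, 5) for b in np.linspace(-1, 1, 5)])
weights = np.zeros(len(centres))
sigma, lr = 0.5, 0.2

def rbf_out(z):
    phi = np.exp(-np.sum((centres - z) ** 2, axis=1) / (2 * sigma ** 2))
    return phi @ weights, phi

y, errs = 0.0, []
for k in range(3000):
    u = np.sin(0.05 * k) + 0.3 * rng.normal()     # persistently exciting input
    y_next = plant(y, u)
    y_hat, phi = rbf_out(np.array([y, u]))
    e = y_next - y_hat
    weights += lr * e * phi                       # gradient step on the squared error
    errs.append(e * e)
    y = y_next

print(np.mean(errs[:100]), np.mean(errs[-100:]))  # error shrinks as the model adapts
```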

**Keywords:**
Chylla-Haase polymerization reactor,
RBF neural networks,
feed-forward and feedback control.

##### 2875 Performance Evaluation of Complex Valued Neural Networks Using Various Error Functions

**Authors:**
Anita S. Gangal,
P. K. Kalra,
D. S. Chauhan

**Abstract:**

**Keywords:**
Complex backpropagation algorithm,
complex error functions,
complex valued neural network,
split activation function.

##### 2874 Prediction of Natural Gas Viscosity using Artificial Neural Network Approach

**Authors:**
E. Nemati Lay,
M. Peymani,
E. Sanjari

**Abstract:**

Prediction of the viscosity of natural gas is important in energy industries such as natural gas storage and transportation. In this study, the viscosity of different natural gas compositions is modeled using an artificial neural network (ANN) based on the back-propagation method. A reliable database of more than 3841 experimental viscosity data points is used for training and testing the ANN. The designed neural network predicts natural gas viscosity from pseudo-reduced pressure and pseudo-reduced temperature with an AARD% of 0.221. The accuracy of the designed ANN has been compared to other published empirical models; the comparison indicates that the proposed method provides accurate results.
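The reported error measure, AARD% (average absolute relative deviation in percent), is straightforward to compute; the viscosity values below are invented for illustration.

```python
import numpy as np

def aard_percent(predicted, experimental):
    """Average absolute relative deviation, in percent."""
    predicted = np.asarray(predicted, dtype=float)
    experimental = np.asarray(experimental, dtype=float)
    return 100.0 * np.mean(np.abs(predicted - experimental) / np.abs(experimental))

# Hypothetical viscosity values (e.g. in cP), for illustration only.
exp_visc = [0.0120, 0.0135, 0.0150, 0.0180]
pred_visc = [0.0121, 0.0133, 0.0151, 0.0179]
print(round(aard_percent(pred_visc, exp_visc), 3))
```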

**Keywords:**
Artificial neural network,
Empirical correlation,
Natural gas,
Viscosity