Search results for: Recurrent neural networks
2425 A Hybrid System of Hidden Markov Models and Recurrent Neural Networks for Learning Deterministic Finite State Automata
Authors: Pavan K. Rallabandi, Kailash C. Patidar
Abstract:
In this paper, we present an optimization technique, or learning algorithm, for a hybrid architecture that combines two of the most popular sequence recognition models: Recurrent Neural Networks (RNNs) and Hidden Markov Models (HMMs). To improve sequence/pattern recognition and classification performance through a hybrid neural-symbolic approach, a gradient descent learning algorithm is developed using the Real-Time Recurrent Learning (RTRL) of RNNs to process the knowledge represented in trained HMMs. The developed hybrid algorithm uses automata theory as a sample test bed, and its performance is demonstrated and evaluated on learning deterministic finite state automata.
Keywords: Hybrid systems, Hidden Markov Models, Recurrent neural networks, Deterministic finite state automata.
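A minimal, hypothetical sketch of the kind of automata test bed the abstract describes: binary strings are labelled by a small deterministic finite state automaton (here, "accept if the string contains an even number of 1s"), producing the labelled sequences on which a hybrid HMM/RNN learner could be trained. The RTRL training itself is not shown, and the specific automaton is an assumption.

```python
import random

# Two-state DFA; state 0 is both the start state and the only accepting state.
TRANSITIONS = {(0, '0'): 0, (0, '1'): 1,
               (1, '0'): 1, (1, '1'): 0}

def dfa_accepts(string: str) -> bool:
    """Run the DFA over the string and report whether it ends in the accepting state."""
    state = 0
    for symbol in string:
        state = TRANSITIONS[(state, symbol)]
    return state == 0

def make_dataset(n_strings: int, max_len: int = 10, seed: int = 0):
    """Generate random binary strings together with their DFA acceptance labels."""
    rng = random.Random(seed)
    data = []
    for _ in range(n_strings):
        length = rng.randint(1, max_len)
        s = ''.join(rng.choice('01') for _ in range(length))
        data.append((s, int(dfa_accepts(s))))
    return data

if __name__ == "__main__":
    for string, label in make_dataset(5):
        print(string, label)
```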
2424 Periodic Solutions of Recurrent Neural Networks with Distributed Delays and Impulses on Time Scales
Authors: Yaping Ren, Yongkun Li
Abstract:
In this paper, by using the continuation theorem of coincidence degree theory, M-matrix theory, and by constructing some suitable Lyapunov functions, some sufficient conditions are obtained for the existence and global exponential stability of periodic solutions of recurrent neural networks with distributed delays and impulses on time scales. Without assuming the boundedness of the activation functions g_j and h_j, these results are less restrictive than those given in earlier references.
Keywords: Recurrent neural networks, global exponential stability, periodic solutions, distributed delays, impulses, time scales.
2423 Existence and Exponential Stability of Almost Periodic Solution for Recurrent Neural Networks on Time Scales
Abstract:
In this paper, a class of recurrent neural networks (RNNs) with variable delays is studied on almost periodic time scales, and some sufficient conditions are established for the existence and global exponential stability of the almost periodic solution. These results have important significance for the design and application of RNNs. Finally, two examples and numerical simulations are presented to illustrate the feasibility and effectiveness of the results.
Keywords: Recurrent neural network, Almost periodic solution, Global exponential stability, Time scale.
2422 Multi-Context Recurrent Neural Network for Time Series Applications
Authors: B. Q. Huang, Tarik Rashid, M-T. Kechadi
Abstract:
This paper presents a multi-context recurrent network for time series analysis. While simple recurrent networks (SRNs) are very popular among recurrent neural networks, they still have some shortcomings in terms of learning speed and accuracy that need to be addressed. To solve these problems, we propose a multi-context recurrent network (MCRN) with three different learning algorithms. The performance of this network is evaluated on real-world applications such as handwriting recognition and energy load forecasting. We study the performance of this network and compare it to a well-established SRN. The experimental results show that the MCRN is very efficient and well suited to time series analysis and its applications.
Keywords: Gradient descent method, recurrent neural network, learning algorithms, time series, BP
2421 Experimental Study of Hyperparameter Tuning a Deep Learning Convolutional Recurrent Network for Text Classification
Authors: Bharatendra Rai
Abstract:
Sequences of words in text data have long-term dependencies and are known to suffer from the vanishing gradient problem when developing deep learning models. Although recurrent networks such as long short-term memory networks help overcome this problem, achieving high text classification performance remains challenging. Convolutional recurrent networks, which combine the advantages of long short-term memory networks and convolutional neural networks, can be useful for improving text classification performance. However, arriving at suitable hyperparameter values for convolutional recurrent networks is still a challenging task, and fitting such a model requires significant computing resources. This paper illustrates the advantages of using convolutional recurrent networks for text classification with the help of statistically planned computer experiments for hyperparameter tuning.
Keywords: Convolutional recurrent networks, hyperparameter tuning, long short-term memory networks, Tukey honest significant differences
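An illustrative sketch, not the authors' exact model, of a convolutional recurrent text classifier in Keras whose tunable hyperparameters (filters, kernel size, LSTM units, dropout) are exposed as arguments so they can be varied in a designed experiment; all default values here are assumptions.

```python
import tensorflow as tf

def build_crnn(vocab_size=20000, seq_len=200, embed_dim=64,
               filters=64, kernel_size=5, lstm_units=64, dropout=0.3):
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(seq_len,)),
        tf.keras.layers.Embedding(vocab_size, embed_dim),
        tf.keras.layers.Conv1D(filters, kernel_size, activation="relu"),  # local n-gram features
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.LSTM(lstm_units),                                 # long-range dependencies
        tf.keras.layers.Dropout(dropout),
        tf.keras.layers.Dense(1, activation="sigmoid"),                   # binary label
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model

model = build_crnn()
model.summary()
```

Each hyperparameter in the argument list corresponds to a factor that could be assigned levels in a statistically planned experiment.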
2420 Deep Learning Based, End-to-End Metaphor Detection in Greek with Recurrent and Convolutional Neural Networks
Authors: Konstantinos Perifanos, Eirini Florou, Dionysis Goutsos
Abstract:
This paper presents and benchmarks a number of end-to-end deep learning based models for metaphor detection in Greek. We combine Convolutional Neural Networks and Recurrent Neural Networks with representation learning to bear on the metaphor detection problem for the Greek language. The models presented achieve exceptional accuracy scores, significantly improving the previous state-of-the-art results, which had achieved an accuracy of 0.82. Furthermore, no special preprocessing, feature engineering or linguistic knowledge is used in this work. The methods presented achieve an accuracy of 0.92 and an F-score of 0.92 with Convolutional Neural Networks (CNNs) and bidirectional Long Short-Term Memory networks (LSTMs). Comparable results of 0.91 accuracy and 0.91 F-score are also achieved with bidirectional Gated Recurrent Units (GRUs) and Convolutional Recurrent Neural Nets (CRNNs). The models are trained and evaluated only on the basis of training tuples, the related sentences and their labels. The outcome is a state-of-the-art collection of metaphor detection models, trained on limited labelled resources, which can be extended to other languages and similar tasks.
Keywords: Metaphor detection, deep learning, representation learning, embeddings.
2419 Auto-regressive Recurrent Neural Network Approach for Electricity Load Forecasting
Authors: Tarik Rashid, B. Q. Huang, M-T. Kechadi, B. Gleeson
Abstract:
This paper presents an auto-regressive network called the Auto-Regressive Multi-Context Recurrent Neural Network (AR-MCRN), which forecasts the daily peak load for two large power plant systems. The auto-regressive network is a combination of recurrent and non-recurrent networks. Weather variables are the key elements in forecasting because any change in these variables affects the demand for energy load. The AR-MCRN is therefore used to learn the relationship between past and future exogenous and endogenous variables. Experimental results show that using the change in weather components and the change in past load as inputs to the AR-MCRN, rather than the basic weather parameters and the past load itself, produces higher accuracy in the predicted load. Experimental results also show that using both exogenous and endogenous variables as inputs is better than using only the exogenous variables as inputs to the network.
Keywords: Daily peak load forecasting, neural networks, recurrent neural networks, auto regressive multi-context neural network.
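A minimal, hypothetical illustration of the input encoding discussed above: feeding the changes in weather variables and in past load to the forecaster rather than the raw values, with the next day's peak load as the target. Column names and the toy data are assumptions.

```python
import pandas as pd

def build_delta_features(df: pd.DataFrame) -> pd.DataFrame:
    """df has daily columns 'temperature', 'humidity', 'wind_speed', 'peak_load'.
    Returns day-to-day changes plus the change in past load, with the next day's
    peak load as the forecasting target."""
    feats = pd.DataFrame({
        "d_temperature": df["temperature"].diff(),      # exogenous changes
        "d_humidity": df["humidity"].diff(),
        "d_wind_speed": df["wind_speed"].diff(),
        "d_peak_load": df["peak_load"].diff(),          # endogenous change
        "target_next_load": df["peak_load"].shift(-1),  # value to forecast
    })
    return feats.dropna()

# Toy example:
toy = pd.DataFrame({
    "temperature": [21.0, 23.5, 22.0, 25.0],
    "humidity": [60, 55, 58, 50],
    "wind_speed": [3.1, 2.4, 4.0, 3.3],
    "peak_load": [480, 505, 490, 530],
})
print(build_delta_features(toy))
```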
2418 Predicting Global Solar Radiation Using Recurrent Neural Networks and Climatological Parameters
Authors: Rami El-Hajj Mohamad, Mahmoud Skafi, Ali Massoud Haidar
Abstract:
Several meteorological parameters were used for the prediction of monthly average daily global solar radiation on horizontal surfaces using recurrent neural networks (RNNs). Climatological data and measurements, mainly air temperature, humidity, sunshine duration, and wind speed between 1995 and 2007, were used to design and validate feed-forward and recurrent neural network based prediction systems. In this paper we present our reference system, based on a feed-forward multilayer perceptron (MLP), as well as the proposed approach based on an RNN model. The obtained results were promising and comparable to those obtained by other existing empirical and neural models. The experimental results showed the advantage of RNNs over simple MLPs when dealing with time series solar radiation predictions based on daily climatological data.
Keywords: Recurrent Neural Networks, Global Solar Radiation, Multi-layer perceptron, gradient, Root Mean Square Error.
2417 PTH Moment Exponential Stability of Stochastic Recurrent Neural Networks with Distributed Delays
Authors: Zixin Liu, Jianjun Jiao, Wanping Bai
Abstract:
In this paper, the issue of pth moment exponential stability of stochastic recurrent neural networks with distributed time delays is investigated. By using the method of variation of parameters, inequality techniques, and stochastic analysis, some sufficient conditions ensuring pth moment exponential stability are obtained. The method used in this paper does not resort to any Lyapunov function, and the results derived generalize some earlier criteria reported in the literature. One numerical example is given to illustrate the main results.
Keywords: Stochastic recurrent neural networks, pth moment exponential stability, distributed time delays.
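For orientation, a generic form of the class of systems discussed above (the paper's exact model and coefficients are not given in the abstract), together with the standard definition of pth moment exponential stability; both are stated here as assumptions about the usual setting rather than the paper's precise formulation.

```latex
% Stochastic recurrent neural network with distributed delays (generic form):
\begin{aligned}
dx_i(t) &= \Big[-c_i x_i(t) + \sum_{j=1}^{n} a_{ij}\, f_j\big(x_j(t)\big)
          + \sum_{j=1}^{n} b_{ij} \int_{0}^{\infty} k_{ij}(s)\, g_j\big(x_j(t-s)\big)\, ds \Big] dt \\
        &\quad + \sum_{j=1}^{n} \sigma_{ij}\big(x_j(t)\big)\, dw_j(t), \qquad i = 1,\dots,n .
\end{aligned}

% pth moment exponential stability of the trivial solution: there exist constants
% \lambda > 0 and C \ge 1 such that, for every initial function \xi,
E\,\|x(t;\xi)\|^{p} \;\le\; C\, e^{-\lambda t} \sup_{s \le 0} E\,\|\xi(s)\|^{p},
\qquad t \ge 0 .
```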
2416 Improved Exponential Stability Analysis for Delayed Recurrent Neural Networks
Authors: Miaomiao Yang, Shouming Zhong
Abstract:
This paper studies the problem of exponential stability analysis for recurrent neural networks with time-varying delay. By constructing a suitable augmented Lyapunov-Krasovskii functional, a novel sufficient condition is obtained to guarantee the exponential stability of the considered system. In order to obtain less conservative conditions, zero equalities and a reciprocally convex approach are employed. The exponential stability criteria proposed in this paper are simpler and more effective. A numerical example is provided to demonstrate the feasibility and effectiveness of our results.
Keywords: Exponential stability, Neural networks, Linear matrix inequality, Lyapunov-Krasovskii, Time-varying delay.
2415 Almost Periodicity in a Harvesting Lotka-Volterra Recurrent Neural Networks with Time-Varying Delays
Authors: Yongzhi Liao
Abstract:
By using the theory of exponential dichotomy and the Banach fixed point theorem, this paper is concerned with the existence and uniqueness of a positive almost periodic solution in a delayed Lotka-Volterra recurrent neural network with harvesting terms. To a certain extent, our work in this paper corrects some results from recent years. Finally, an example is given to illustrate the feasibility and effectiveness of the main result.
Keywords: positive almost periodic solution, Lotka-Volterra, neural networks, Banach fixed point theorem, harvesting
2414 Evolutionary Training of Hybrid Systems of Recurrent Neural Networks and Hidden Markov Models
Authors: Rohitash Chandra, Christian W. Omlin
Abstract:
We present a hybrid architecture of recurrent neural networks (RNNs) inspired by hidden Markov models (HMMs). We train the hybrid architecture using genetic algorithms to learn and represent dynamical systems. We train the hybrid architecture on a set of deterministic finite-state automata strings and observe its generalization performance when presented with a new set of strings that were not present in the training data set. In this way, we show that the hybrid system of HMM and RNN can learn and represent deterministic finite-state automata. We ran experiments with different population sizes in the genetic algorithm; we also ran experiments to find out which weight initializations were best for training the hybrid architecture. The results show that the hybrid architecture of recurrent neural networks inspired by hidden Markov models can be trained to represent dynamical systems. The best training and generalization performance is achieved when the hybrid architecture is initialized with random real weight values in the range -15 to 15.
Keywords: Deterministic finite-state automata, genetic algorithm, hidden Markov models, hybrid systems and recurrent neural networks.
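A minimal, hypothetical sketch of the evolutionary training loop described above: a real-valued genetic algorithm evolving flat weight vectors, initialized in the range -15 to 15 mentioned in the abstract. The fitness function here is a placeholder; in the paper it would score how well the weights make the hybrid HMM/RNN classify automata strings.

```python
import numpy as np

def fitness(weights: np.ndarray) -> float:
    """Placeholder fitness: replace with evaluation of the hybrid network on DFA strings."""
    return -float(np.sum(weights ** 2))   # toy objective: drive weights toward zero

def evolve(n_weights=50, pop_size=40, generations=100, mutation_std=0.5, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-15.0, 15.0, size=(pop_size, n_weights))   # random real-valued init
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        order = np.argsort(scores)[::-1]                         # best individuals first
        parents = pop[order[: pop_size // 2]]                    # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_weights)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child += rng.normal(0.0, mutation_std, size=n_weights)  # Gaussian mutation
            children.append(child)
        pop = np.vstack([parents, children])
    best = max(pop, key=fitness)
    return best, fitness(best)

best_weights, best_score = evolve()
print("best fitness:", best_score)
```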
2413 A New Technique for Solar Activity Forecasting Using Recurrent Elman Networks
Authors: Salvatore Marra, Francesco C. Morabito
Abstract:
In this paper we present an efficient approach for the prediction of two sunspot-related time series, namely the Yearly Sunspot Number and the IR5 Index, that are commonly used for monitoring solar activity. The method is based on exploiting partially recurrent Elman networks and can be divided into three main steps: the first consists of a "de-rectification" of the time series under study in order to obtain a new time series whose appearance, similar to a sum of sinusoids, can be modelled by our neural networks much better than the original dataset. After that, we normalize the de-rectified data so that they have zero mean and unit standard deviation and, finally, train an Elman network with only one input, a recurrent hidden layer and one output, using a back-propagation algorithm with variable learning rate and momentum. The achieved results show the efficiency of this approach, which, although very simple, can perform better than most existing solar activity forecasting methods.
Keywords: Elman neural networks, sunspot, solar activity, time series prediction.
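A minimal sketch, with assumed details, of two of the steps described above: normalizing the de-rectified series to zero mean and unit standard deviation, and running a tiny Elman-style network (one input, one recurrent hidden layer, one output) forward over the sequence. Training with back-propagation, variable learning rate and momentum is omitted, and the random series stands in for the real data.

```python
import numpy as np

def normalize(series: np.ndarray):
    """Return the zero-mean, unit-variance series plus the statistics needed to undo it."""
    mu, sigma = series.mean(), series.std()
    return (series - mu) / sigma, mu, sigma

def elman_forward(series, W_in, W_rec, W_out, b_h, b_o):
    """One-step-ahead outputs from a single-input, single-output Elman network."""
    h = np.zeros(W_rec.shape[0])
    preds = []
    for x in series:
        h = np.tanh(W_in * x + W_rec @ h + b_h)   # context units feed back into the hidden layer
        preds.append(float(W_out @ h + b_o))
    return np.array(preds)

rng = np.random.default_rng(0)
raw = rng.random(50) * 100.0                      # stand-in for the de-rectified series
series, mu, sigma = normalize(raw)
n_hidden = 8
preds = elman_forward(series,
                      W_in=rng.normal(size=n_hidden),
                      W_rec=rng.normal(size=(n_hidden, n_hidden)) * 0.1,
                      W_out=rng.normal(size=n_hidden),
                      b_h=np.zeros(n_hidden), b_o=0.0)
print(preds[:5] * sigma + mu)                     # outputs mapped back to the original scale
```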
2412 Neural Network Ensemble-based Solar Power Generation Short-Term Forecasting
Authors: A. Chaouachi, R.M. Kamel, R. Ichikawa, H. Hayashi, K. Nagasaka
Abstract:
This paper presents the applicability of artificial neural networks for 24-hour-ahead solar power generation forecasting of a 20 kW photovoltaic system; the developed forecasting system is suitable for reliable microgrid energy management. In total, four neural networks were proposed, namely: multi-layered perceptron, radial basis function, recurrent, and a neural network ensemble consisting of bagged networks. The forecasting reliability of the proposed neural networks was evaluated in terms of forecasting error performance based on statistical and graphical methods. The experimental results showed that all the proposed networks achieved an acceptable forecasting accuracy. In terms of comparison, the neural network ensemble gives the highest forecasting precision compared to the conventional networks. In fact, each network of the ensemble over-fits to some extent, leading to a diversity which enhances the noise tolerance and the forecasting generalization performance compared to the conventional networks.
Keywords: Neural network ensemble, Solar power generation, 24 hour forecasting, Comparative study
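An illustrative sketch, not the authors' exact configuration, of the bagged-network ensemble idea: several MLP regressors are trained on bootstrap resamples of the training data and their forecasts are averaged. The toy inputs and network sizes are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_bagged_ensemble(X, y, n_members=5, seed=0):
    rng = np.random.default_rng(seed)
    members = []
    for k in range(n_members):
        idx = rng.integers(0, len(X), size=len(X))       # bootstrap resample
        net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=k)
        net.fit(X[idx], y[idx])
        members.append(net)
    return members

def ensemble_predict(members, X):
    return np.mean([m.predict(X) for m in members], axis=0)  # average the member forecasts

# Toy example: predict a "power" output from two weather-like inputs.
rng = np.random.default_rng(1)
X = rng.random((200, 2))
y = 3.0 * X[:, 0] + np.sin(6 * X[:, 1]) + rng.normal(0, 0.05, 200)
members = train_bagged_ensemble(X, y)
print(ensemble_predict(members, X[:3]))
```

The diversity mentioned in the abstract comes from each member seeing a different resample of the data, so their individual errors partly cancel when averaged.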
2411 Crude Oil Price Prediction Using LSTM Networks
Authors: Varun Gupta, Ankit Pandey
Abstract:
The crude oil market is an immensely complex and dynamic environment, and thus the task of predicting changes in such an environment becomes challenging with regard to accuracy. A number of approaches have been adopted to take on that challenge, and machine learning has been at the core of many of them. There are plenty of examples of algorithms based on machine learning yielding satisfactory results for this type of prediction. In this paper, we have tried to predict crude oil prices using Long Short-Term Memory (LSTM) based recurrent neural networks. We have experimented with different types of models using different epochs, lookbacks and other tuning methods. The results obtained are promising and present a reasonably accurate prediction of the price of crude oil in the near future.
Keywords: Crude oil price prediction, deep learning, LSTM, recurrent neural networks.
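A minimal sketch, under assumed settings, of the "lookback" windowing the abstract refers to: each training sample is the previous `lookback` prices and the target is the next price, reshaped into the (samples, timesteps, features) layout an LSTM expects. The synthetic series and model sizes are stand-ins.

```python
import numpy as np
import tensorflow as tf

def make_windows(prices: np.ndarray, lookback: int):
    X, y = [], []
    for i in range(len(prices) - lookback):
        X.append(prices[i:i + lookback])      # the last `lookback` observations
        y.append(prices[i + lookback])        # the next observation to predict
    X = np.array(X)[..., np.newaxis]          # shape: (samples, lookback, 1)
    return X, np.array(y)

prices = np.cumsum(np.random.default_rng(0).normal(0, 1, 300)) + 60.0  # toy price series
X, y = make_windows(prices, lookback=10)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.predict(X[-1:], verbose=0))       # one-step-ahead forecast
```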
2410 A Combined Neural Network Approach to Soccer Player Prediction
Authors: Wenbin Zhang, Hantian Wu, Jian Tang
Abstract:
An artificial neural network is a mathematical model inspired by biological neural networks. There are several kinds of neural networks, and they are widely used in many areas, such as prediction, detection, and classification. Meanwhile, in day-to-day life, people always have to make many difficult decisions. For example, the coach of a soccer club has to decide which offensive player to select for a certain game. This work describes a novel neural network using a combination of the General Regression Neural Network and Probabilistic Neural Networks to help a soccer coach make an informed decision.
Keywords: General Regression Neural Network, Probabilistic Neural Networks, Neural function.
2409 Recurrent Neural Network Based Fuzzy Inference System for Identification and Control of Dynamic Plants
Authors: Rahib Hidayat Abiyev
Abstract:
This paper presents the development of a recurrent neural network based fuzzy inference system for the identification and control of dynamic nonlinear plants. The structure and algorithms of the fuzzy system based on a recurrent neural network are described. To train the unknown parameters of the system, a supervised learning algorithm is used. As a result of learning, the rules of the neuro-fuzzy system are formed. The neuro-fuzzy system is used for the identification and control of a nonlinear dynamic plant. The simulation results of identification and control systems based on the recurrent neuro-fuzzy network are compared with the simulation results of other neural systems. It is found that the recurrent neuro-fuzzy based system has better performance than the others.
Keywords: Fuzzy logic, neural network, neuro-fuzzy system, control system.
2408 The Multi-Layered Perceptrons Neural Networks for the Prediction of Daily Solar Radiation
Authors: Radouane Iqdour, Abdelouhab Zeroual
Abstract:
The Multi-Layered Perceptron (MLP) neural networks have been very successful in a number of signal processing applications. In this work we have studied the possibilities and the difficulties encountered in applying MLP neural networks to the prediction of daily solar radiation data. We have used the Polak-Ribière algorithm for training the neural networks. A comparison, in terms of statistical indicators, with a linear model commonly used in the literature is also performed, and the obtained results show that the neural networks are more efficient and give the best results.
Keywords: Daily solar radiation, Prediction, MLP neural networks, linear model
2407 A Fast Neural Algorithm for Serial Code Detection in a Stream of Sequential Data
Authors: Hazem M. El-Bakry, Qiangfu Zhao
Abstract:
In recent years, fast neural networks for object/face detection have been introduced, based on cross correlation in the frequency domain between the input matrix and the hidden weights of neural networks. In our previous papers [3,4], fast neural networks for certain code detection were introduced. It was proved in [10] that, for fast neural networks to give the same correct results as conventional neural networks, both the weights of the neural networks and the input matrix must be symmetric. This condition made those fast neural networks slower than conventional neural networks. Another symmetric form for the input matrix was introduced in [1-9] to speed up the operation of these fast neural networks. Here, corrections to the cross correlation equations (given in [13,15,16]) to compensate for the symmetry condition are presented. After these corrections, it is proved mathematically that the number of computation steps required by fast neural networks is less than that needed by classical neural networks. Furthermore, there is no need to convert the input data into symmetric form. Moreover, this new idea is applied to increase the speed of neural networks in the case of processing complex values. Simulation results after these corrections using MATLAB confirm the theoretical computations.
Keywords: Fast Code/Data Detection, Neural Networks, Cross Correlation, real/complex values.
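A small numerical illustration, independent of the papers' derivations, of the core idea: cross correlation of a long input stream with a short weight kernel computed in the frequency domain matches the direct sliding-window computation.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4096)          # input stream
w = rng.normal(size=64)            # "code" / hidden-layer weight kernel

# Direct (time-domain) valid cross correlation.
direct = np.correlate(x, w, mode="valid")

# Frequency-domain version: linear cross correlation as a convolution with the
# reversed kernel, computed via zero-padded FFTs.
L = len(x) + len(w) - 1
freq = np.fft.rfft(x, L) * np.fft.rfft(w[::-1], L)
full = np.fft.irfft(freq, L)
fft_based = full[len(w) - 1 : len(x)]

print(np.allclose(direct, fft_based))   # True: both methods agree
```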
2406 Analysis of Multilayer Neural Network Modeling and Long Short-Term Memory
Authors: Danilo López, Nelson Vera, Luis Pedraza
Abstract:
This paper analyzes fundamental ideas and concepts related to neural networks, providing the reader with a theoretical explanation of the operation of Long Short-Term Memory (LSTM) networks, classified as deep learning systems, and explicitly presenting the mathematical development of the backward-pass equations of the LSTM network model. This mathematical modeling, associated with software development, will provide the necessary tools to develop an intelligent system capable of predicting the behavior of licensed users in wireless cognitive radio networks.
Keywords: Neural networks, multilayer perceptron, long short-term memory, recurrent neural network, mathematical analysis.
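For reference, the standard LSTM forward-pass equations from which backward-pass derivations of the kind discussed above are obtained by differentiating the loss with respect to each gate; the notation here is the common textbook one and may differ from the paper's.

```latex
\begin{aligned}
f_t &= \sigma\!\left(W_f x_t + U_f h_{t-1} + b_f\right) &\quad &\text{(forget gate)}\\
i_t &= \sigma\!\left(W_i x_t + U_i h_{t-1} + b_i\right) &\quad &\text{(input gate)}\\
o_t &= \sigma\!\left(W_o x_t + U_o h_{t-1} + b_o\right) &\quad &\text{(output gate)}\\
\tilde{c}_t &= \tanh\!\left(W_c x_t + U_c h_{t-1} + b_c\right) &\quad &\text{(candidate cell state)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t, \qquad h_t = o_t \odot \tanh(c_t)
\end{aligned}
```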
2405 Efficient System for Speech Recognition using General Regression Neural Network
Authors: Abderrahmane Amrouche, Jean Michel Rouvaen
Abstract:
In this paper we present an efficient system for speaker-independent speech recognition based on a neural network approach. The proposed architecture comprises two phases: a preprocessing phase, which consists of segmental normalization and feature extraction, and a classification phase, which uses neural networks based on nonparametric density estimation, namely the General Regression Neural Network (GRNN). The relative performance of the proposed model is compared to similar recognition systems based on the Multilayer Perceptron (MLP), the Recurrent Neural Network (RNN) and the well-known Discrete Hidden Markov Model (HMM-VQ) that we have also implemented. Experimental results obtained with Arabic digits have shown that the use of nonparametric density estimation with an appropriate smoothing factor (spread) improves the generalization power of the neural network. The word error rate (WER) is reduced significantly over the baseline HMM method. GRNN computation is a successful alternative to the other neural networks and the DHMM.
Keywords: Speech Recognition, General Regression Neural Network, Hidden Markov Model, Recurrent Neural Network, Arabic Digits.
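A compact sketch of the General Regression Neural Network computation used above, showing how the smoothing factor ("spread") enters the prediction as the width of a Gaussian kernel; feature extraction and normalization are assumed to have been done already, and the toy data are stand-ins.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, spread=0.5):
    """GRNN output: a Gaussian-kernel weighted average of the training targets."""
    preds = []
    for x in X_query:
        d2 = np.sum((X_train - x) ** 2, axis=1)        # squared distances to all patterns
        w = np.exp(-d2 / (2.0 * spread ** 2))           # pattern-layer activations
        preds.append(np.dot(w, y_train) / np.sum(w))    # summation / division layers
    return np.array(preds)

# Toy example: two clusters of "feature vectors" with targets 0 and 1.
rng = np.random.default_rng(0)
X_train = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(2, 0.3, (20, 2))])
y_train = np.array([0.0] * 20 + [1.0] * 20)
print(grnn_predict(X_train, y_train, np.array([[0.1, 0.0], [1.9, 2.1]]), spread=0.5))
```

A smaller spread makes the estimate more local (and more sensitive to noise); a larger spread smooths the prediction, which is the trade-off the abstract alludes to.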
2404 Bi-lingual Handwritten Character and Numeral Recognition using Multi-Dimensional Recurrent Neural Networks (MDRNN)
Authors: Kandarpa Kumar Sarma
Abstract:
The key to the continued success of ANNs depends considerably on the use of hybrid structures implemented on cooperative frameworks. Hybrid architectures provide the ANN with the ability to validate heterogeneous learning paradigms. This work describes the implementation of a set of distributed and hybrid ANN models for character recognition applied to Anglo-Assamese scripts. The objective is to describe the effectiveness of hybrid ANN setups as innovative means of neural learning for an application like multilingual handwritten character and numeral recognition.
Keywords: Assamese, Feature, Recurrent.
2403 Vision-Based Collision Avoidance for Unmanned Aerial Vehicles by Recurrent Neural Networks
Authors: Yao-Hong Tsai
Abstract:
Due to advances in sensor technology, video surveillance has become the main means of security control in every big city in the world. Surveillance is usually used by governments for intelligence gathering, the prevention of crime, the protection of a process, person, group or object, or the investigation of crime. Many surveillance systems based on computer vision technology have been developed in recent years. Moving target tracking is the most common task for an Unmanned Aerial Vehicle (UAV) to find and track objects of interest in mobile aerial surveillance for civilian applications. This paper is focused on vision-based collision avoidance for UAVs using recurrent neural networks. First, images from cameras on the UAV were fused based on a deep convolutional neural network. Then, a recurrent neural network was constructed to obtain high-level image features for object tracking and to extract low-level image features for noise reduction. The system distributed the computation of the whole system across local and cloud platforms to efficiently perform object detection, tracking and collision avoidance based on multiple UAVs. The experiments on several challenging datasets showed that the proposed algorithm outperforms state-of-the-art methods.
Keywords: Unmanned aerial vehicle, object tracking, deep learning, collision avoidance.
2402 Unknown Environment Representation for Mobile Robot Using Spiking Neural Networks
Authors: Amir Reza Saffari Azar Alamdari
Abstract:
In this paper, a model of self-organizing spiking neural networks is introduced and applied to the mobile robot environment representation and path planning problem. A network of spike-response-model neurons with a recurrent architecture is used to create the robot's internal representation of the surrounding environment. The overall activity of the network simulates a self-organizing system with unsupervised learning. A modified A* algorithm is used to find the best path between the starting and goal points using this internal representation. This method can be used with good performance for both known and unknown environments.
Keywords: Mobile Robot, Path Planning, Self-organization, Spiking Neural Networks.
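A self-contained sketch of the A* step mentioned above, here in its plain textbook form on a 2-D occupancy grid; the paper's modification operating on the spiking network's internal representation is not detailed in the abstract, so the grid and heuristic below are assumptions.

```python
import heapq

def astar(grid, start, goal):
    """grid: list of lists, 0 = free, 1 = obstacle. Returns a list of cells or None."""
    def h(a, b):                               # Manhattan-distance heuristic
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    open_heap = [(h(start, goal), 0, start, None)]
    came_from, g_score = {}, {start: 0}
    while open_heap:
        _, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:
            continue
        came_from[node] = parent
        if node == goal:                       # reconstruct the path back to the start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_score.get((nr, nc), float("inf")):
                    g_score[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc), goal), ng, (nr, nc), node))
    return None

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```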
2401 Fast Complex Valued Time Delay Neural Networks
Authors: Hazem M. El-Bakry, Qiangfu Zhao
Abstract:
Here, a new idea to speed up the operation of complex-valued time delay neural networks is presented. The whole data are collected together in a long vector and then tested as one input pattern. The proposed fast complex-valued time delay neural networks use cross correlation in the frequency domain between the tested data and the input weights of the neural networks. It is proved mathematically that the number of computation steps required by the presented fast complex-valued time delay neural networks is less than that needed by classical time delay neural networks. Simulation results using MATLAB confirm the theoretical computations.
Keywords: Fast Complex Valued Time Delay Neural Networks, Cross Correlation, Frequency Domain
2400 Application of Wavelet Neural Networks in Optimization of Skeletal Buildings under Frequency Constraints
Authors: Mohammad Reza Ghasemi, Amin Ghorbani
Abstract:
The main goal of the present work is to decrease the computational burden of the optimum design of steel frames with frequency constraints using a new type of neural network called the Wavelet Neural Network. A suitable neural network is trained for frequency approximation to take the place of the analysis program. The combination of wavelet theory and neural networks (NNs) has led to the development of wavelet neural networks. Wavelet neural networks are feed-forward networks using wavelets as activation functions. Wavelets are mathematical functions with suitable inner parameters, which help them approximate arbitrary functions. The WNN was used to predict the frequency of the structures. In the WNN, a RAtional function with Second order Poles (RASP) wavelet was used as a transfer function. It is shown that the convergence speed was faster than that of other neural networks. Comparisons of the WNN with the embedded Artificial Neural Network (ANN), with approximate techniques, and with analytical solutions are also available in the literature.
Keywords: Weight Minimization, Frequency Constraints, Steel Frames, ANN, WNN, RASP Function.
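A minimal sketch of the wavelet-as-activation idea. The paper's RASP wavelet is not spelled out in the abstract, so a Mexican-hat wavelet is used here purely as a stand-in activation for a one-hidden-layer approximator; the translations, dilations and output weights are arbitrary.

```python
import numpy as np

def mexican_hat(z):
    """Mexican-hat (Ricker) wavelet used here as the hidden-unit activation."""
    return (1.0 - z ** 2) * np.exp(-0.5 * z ** 2)

def wnn_forward(x, centers, scales, w_out, b_out):
    """Hidden units apply the wavelet to translated/dilated copies of the input;
    the output is a linear combination of the hidden activations."""
    z = (x[:, None] - centers[None, :]) / scales[None, :]   # translate and dilate
    return mexican_hat(z) @ w_out + b_out

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
centers = np.linspace(-3, 3, 10)       # hidden-unit translations
scales = np.full(10, 0.8)              # hidden-unit dilations
y = wnn_forward(x, centers, scales, w_out=rng.normal(size=10), b_out=0.0)
print(y.shape)                          # (200,) network outputs over the input grid
```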
2399 Improved Robust Stability Criteria for Discrete-time Neural Networks
Authors: Zixin Liu, Shu Lü, Shouming Zhong, Mao Ye
Abstract:
In this paper, the robust exponential stability problem of uncertain discrete-time recurrent neural networks with time-varying delay is investigated. By constructing a new augmented Lyapunov-Krasovskii functional, some new improved stability criteria are obtained in the form of linear matrix inequalities (LMIs). Compared with some recent results in the literature, the conservatism of the new criteria is reduced notably. Two numerical examples are provided to demonstrate the reduced conservatism and effectiveness of the proposed results.
Keywords: Robust exponential stability, delay-dependent stability, discrete-time neural networks, time-varying delays.
2398 Diagnosis of Ovarian Cancer with Proteomic Patterns in Serum using Independent Component Analysis and Neural Networks
Authors: Simone C. F. Neves, Lúcio F. A. Campos, Ewaldo Santana, Ginalber L. O. Serra, Allan K. Barros
Abstract:
We propose a method for the discrimination and classification of ovarian tissue as benign, malignant or normal using independent component analysis and neural networks. The method was tested on a set of proteomic patterns in serum using radial basis function and probabilistic neural networks. The best performance was obtained with probabilistic neural networks, resulting in a 99% success rate, with 98% specificity and 100% sensitivity.
Keywords: Ovarian cancer, Proteomic patterns in serum, independent component analysis and neural networks.
2397 Reactive Neural Control for Phototaxis and Obstacle Avoidance Behavior of Walking Machines
Authors: Poramate Manoonpong, Frank Pasemann, Florentin Wörgötter
Abstract:
This paper describes reactive neural control used to generate phototaxis and obstacle avoidance behavior of walking machines. It utilizes discrete-time neurodynamics and consists of two main neural modules: neural preprocessing and modular neural control. The neural preprocessing network acts as a sensory fusion unit. It filters sensory noise and shapes sensory data to drive the corresponding reactive behavior. On the other hand, modular neural control based on a central pattern generator is applied for locomotion of walking machines. It coordinates leg movements and can generate omnidirectional walking. As a result, through a sensorimotor loop this reactive neural controller enables the machines to explore a dynamic environment by avoiding obstacles, turning toward a light source, and then stopping near it.
Keywords: Recurrent neural networks, Walking robots, Modular neural control, Phototaxis, Obstacle avoidance behavior.
2396 Analysis of Periodic Solution of Delay Fuzzy BAM Neural Networks
Authors: Qianhong Zhang, Lihui Yang, Daixi Liao
Abstract:
In this paper, by employing a new Lyapunov functional and an elementary inequality analysis technique, some sufficient conditions are derived to ensure the existence and uniqueness of a periodic oscillatory solution for fuzzy bidirectional associative memory (BAM) neural networks with time-varying delays, such that all other solutions of the fuzzy BAM neural networks converge to this unique periodic solution. These criteria are presented in terms of system parameters and have important significance in the design and applications of neural networks. Moreover, an example is given to illustrate the effectiveness and feasibility of the obtained results.
Keywords: Fuzzy BAM neural networks, Periodic solution, Global exponential stability, Time-varying delays