Search results for: Stochastic recurrent neural networks
2657 PTH Moment Exponential Stability of Stochastic Recurrent Neural Networks with Distributed Delays
Authors: Zixin Liu, Jianjun Jiao, Wanping Bai
Abstract:
In this paper, the issue of pth moment exponential stability of stochastic recurrent neural networks with distributed time delays is investigated. By using the method of variation of parameters, inequality techniques, and stochastic analysis, some sufficient conditions ensuring pth moment exponential stability are obtained. The method used in this paper does not resort to any Lyapunov function, and the results derived here generalize some earlier criteria reported in the literature. One numerical example is given to illustrate the main results.
Keywords: Stochastic recurrent neural networks, pth moment exponential stability, distributed time delays.
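As a point of reference for this and several of the related abstracts below, pth moment exponential stability is usually stated as follows (generic notation, not necessarily the authors' own symbols): there exist constants M >= 1 and lambda > 0 such that, for every admissible initial function phi,

E\|x(t; t_0, \varphi)\|^{p} \le M \, \|\varphi\|_{\tau}^{p} \, e^{-\lambda (t - t_0)}, \quad t \ge t_0,

with p = 2 recovering the familiar mean-square exponential stability.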
2656 Passivity Analysis of Stochastic Neural Networks With Multiple Time Delays
Authors: Biao Qin, Jin Huang, Jiaojiao Ren, Wei Kang
Abstract:
This paper deals with the problem of passivity analysis for stochastic neural networks with leakage, discrete and distributed delays. By using the delay partitioning technique, the free weighting matrix method and stochastic analysis, several sufficient conditions for the passivity of the addressed neural networks are established in terms of linear matrix inequalities (LMIs), in which both the time delay and its time derivative can be fully considered. A numerical example is given to show the usefulness and effectiveness of the obtained results.
Keywords: Passivity, Stochastic neural networks, Multiple time delays, Linear matrix inequalities (LMIs).
2655 Stability Analysis of Impulsive Stochastic Fuzzy Cellular Neural Networks with Time-varying Delays and Reaction-diffusion Terms
Authors: Xinhua Zhang, Kelin Li
Abstract:
In this paper, the problem of stability analysis for a class of impulsive stochastic fuzzy neural networks with time-varying delays and reaction-diffusion terms is considered. By utilizing a suitable Lyapunov-Krasovskii functional, the inequality technique and stochastic analysis, some sufficient conditions ensuring global exponential stability of the equilibrium point for impulsive stochastic fuzzy cellular neural networks with time-varying delays and diffusion are obtained. In particular, an estimate of the exponential convergence rate is also provided, which depends on the system parameters, the diffusion effect and the impulsive disturbance intensity. It is believed that these results are significant and useful for the design and application of fuzzy neural networks. An example is given to show the effectiveness of the obtained results.
Keywords: Exponential stability, stochastic fuzzy cellular neural networks, time-varying delays, impulses, reaction-diffusion terms.
2654 Mean Square Exponential Synchronization of Stochastic Neutral Type Chaotic Neural Networks with Mixed Delay
Authors: Zixin Liu, Huawei Yang, Fangwei Chen
Abstract:
This paper studies the mean square exponential synchronization problem of a class of stochastic neutral type chaotic neural networks with mixed delay. On the basis of Lyapunov stability theory, some sufficient conditions ensuring the mean square exponential synchronization of two identical chaotic neural networks are obtained by using stochastic analysis and inequality techniques. These conditions are expressed in the form of linear matrix inequalities (LMIs), whose feasibility can be easily checked by using the Matlab LMI Toolbox. The feedback controller used in this paper is more general than those used in the previous literature. One simulation example is presented to demonstrate the effectiveness of the derived results.
Keywords: Exponential synchronization, stochastic analysis, chaotic neural networks, neutral type system.
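For orientation, mean square exponential synchronization of the drive state x(t) and the response state y(t) is commonly understood in the following sense (generic notation, not necessarily the authors' symbols): there exist constants M >= 1 and lambda > 0 such that

E\|y(t) - x(t)\|^{2} \le M \, e^{-\lambda t} \sup_{-\tau \le s \le 0} E\|y(s) - x(s)\|^{2}, \quad t \ge 0,

i.e. the feedback controller drives the second moment of the synchronization error to zero exponentially fast.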
2653 New PTH Moment Stable Criteria of Stochastic Neural Networks
Authors: Zixin Liu, Huawei Yang, Fangwei Chen
Abstract:
In this paper, the issue of pth moment stability of a class of stochastic neural networks with mixed delays is investigated. By establishing two integro-differential inequalities, some new sufficient conditions ensuring pth moment exponential stability are obtained. Compared with some previous publications, our results generalize some earlier works reported in the literature, and remove some strict constraints of time delays and kernel functions. Two numerical examples are presented to illustrate the validity of the main results.
Keywords: Neural networks, stochastic, PTH moment stable, time varying delays, distributed delays.
2652 A Hybrid System of Hidden Markov Models and Recurrent Neural Networks for Learning Deterministic Finite State Automata
Authors: Pavan K. Rallabandi, Kailash C. Patidar
Abstract:
In this paper, we present an optimization technique, or learning algorithm, based on a hybrid architecture that combines two of the most popular sequence recognition models: Recurrent Neural Networks (RNNs) and Hidden Markov Models (HMMs). In order to improve sequence/pattern recognition and classification performance through a hybrid neural-symbolic approach, a gradient descent learning algorithm is developed using the Real Time Recurrent Learning of the Recurrent Neural Network to process the knowledge represented in trained Hidden Markov Models. The developed hybrid algorithm is implemented on automata theory as a sample test bed, and its performance is demonstrated and evaluated on learning deterministic finite state automata.
Keywords: Hybrid systems, Hidden Markov Models, Recurrent neural networks, Deterministic finite state automata.
2651 Periodic Solutions of Recurrent Neural Networks with Distributed Delays and Impulses on Time Scales
Authors: Yaping Ren, Yongkun Li
Abstract:
In this paper, by using the continuation theorem of coincidence degree theory, M-matrix theory and constructing some suitable Lyapunov functions, some sufficient conditions are obtained for the existence and global exponential stability of periodic solutions of recurrent neural networks with distributed delays and impulses on time scales. Without assuming the boundedness of the activation functions gj, hj , these results are less restrictive than those given in the earlier references.
Keywords: Recurrent neural networks, global exponential stability, periodic solutions, distributed delays, impulses, time scales.
2650 Existence and Exponential Stability of Almost Periodic Solution for Recurrent Neural Networks on Time Scales
Abstract:
In this paper, a class of recurrent neural networks (RNNs) with variable delays is studied on almost periodic time scales, and some sufficient conditions are established for the existence and global exponential stability of the almost periodic solution. These results have important significance for the design and application of RNNs. Finally, two examples and numerical simulations are presented to illustrate the feasibility and effectiveness of the results.
Keywords: Recurrent neural network, Almost periodic solution, Global exponential stability, Time scale.
2649 Multi-Context Recurrent Neural Network for Time Series Applications
Authors: B. Q. Huang, Tarik Rashid, M-T. Kechadi
Abstract:
This paper presents a multi-context recurrent network for time series analysis. While simple recurrent networks (SRNs) are very popular among recurrent neural networks, they still have some shortcomings in terms of learning speed and accuracy that need to be addressed. To solve these problems, we propose a multi-context recurrent network (MCRN) with three different learning algorithms. The performance of this network is evaluated on some real-world applications such as handwriting recognition and energy load forecasting. We study the performance of this network and compare it to a well-established SRN. The experimental results show that MCRN is very efficient and very well suited to time series analysis and its applications.
Keywords: Gradient descent method, recurrent neural network, learning algorithms, time series, BP
2648 Experimental Study of Hyperparameter Tuning a Deep Learning Convolutional Recurrent Network for Text Classification
Authors: Bharatendra Rai
Abstract:
Sequences of words in text data have long-term dependencies and are known to suffer from the vanishing gradient problem when developing deep learning models. Although recurrent networks such as long short-term memory networks help overcome this problem, achieving high text classification performance remains challenging. Convolutional recurrent networks, which combine the advantages of long short-term memory networks and convolutional neural networks, can be useful for improving text classification performance. However, arriving at suitable hyperparameter values for convolutional recurrent networks is still a challenging task, where fitting a model requires significant computing resources. This paper illustrates the advantages of using convolutional recurrent networks for text classification with the help of statistically planned computer experiments for hyperparameter tuning.
Keywords: Convolutional recurrent networks, hyperparameter tuning, long short-term memory networks, Tukey honest significant differences
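The abstract does not spell out the network itself; a minimal sketch of the kind of convolutional recurrent text classifier being tuned, written here with TensorFlow/Keras purely for illustration and with every size below (vocabulary, embedding dimension, filter count, kernel width, LSTM units) assumed rather than taken from the paper, could look like this:

import tensorflow as tf

# All hyperparameter values are illustrative -- they are exactly the kind of
# settings the paper selects through statistically designed experiments.
VOCAB_SIZE, EMBED_DIM = 20000, 128
FILTERS, KERNEL_SIZE, LSTM_UNITS = 64, 5, 64

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),                 # word embeddings
    tf.keras.layers.Conv1D(FILTERS, KERNEL_SIZE, activation="relu"),  # local n-gram features
    tf.keras.layers.MaxPooling1D(pool_size=2),                        # shorten the sequence
    tf.keras.layers.LSTM(LSTM_UNITS),                                 # long-range dependencies
    tf.keras.layers.Dense(1, activation="sigmoid"),                   # binary text label
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])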
2647 Deep Learning Based, End-to-End Metaphor Detection in Greek with Recurrent and Convolutional Neural Networks
Authors: Konstantinos Perifanos, Eirini Florou, Dionysis Goutsos
Abstract:
This paper presents and benchmarks a number of end-to-end Deep Learning based models for metaphor detection in Greek. We combine Convolutional Neural Networks and Recurrent Neural Networks with representation learning to bear on the metaphor detection problem for the Greek language. The models presented achieve exceptional accuracy scores, significantly improving the previous state-of-the-art results, which had already reached an accuracy of 0.82. Furthermore, no special preprocessing, feature engineering or linguistic knowledge is used in this work. The methods presented achieve accuracy of 0.92 and F-score of 0.92 with Convolutional Neural Networks (CNNs) and bidirectional Long Short-Term Memory networks (LSTMs). Comparable results of 0.91 accuracy and 0.91 F-score are also achieved with bidirectional Gated Recurrent Units (GRUs) and Convolutional Recurrent Neural Nets (CRNNs). The models are trained and evaluated only on the basis of training tuples, the related sentences and their labels. The outcome is a state-of-the-art collection of metaphor detection models, trained on limited labelled resources, which can be extended to other languages and similar tasks.
Keywords: Metaphor detection, deep learning, representation learning, embeddings.
2646 Auto-regressive Recurrent Neural Network Approach for Electricity Load Forecasting
Authors: Tarik Rashid, B. Q. Huang, M-T. Kechadi, B. Gleeson
Abstract:
This paper presents an auto-regressive network called the Auto-Regressive Multi-Context Recurrent Neural Network (AR-MCRN), which forecasts the daily peak load for two large power plant systems. The auto-regressive network is a combination of both recurrent and non-recurrent networks. Weather variables are the key elements in forecasting because any change in these variables affects the demand for energy load. The AR-MCRN is therefore used to learn the relationship between past, present, and future exogenous and endogenous variables. Experimental results show that using the change in weather components and the change that occurred in past load as inputs to the AR-MCRN, rather than the basic weather parameters and the past load itself, produces higher accuracy in the predicted load. Experimental results also show that using both exogenous and endogenous variables as inputs is better than using only the exogenous variables as inputs to the network.
Keywords: Daily peak load forecasting, neural networks, recurrent neural networks, auto regressive multi-context neural network.
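The input encoding the abstract describes (feeding the day-to-day change of each variable rather than its raw value) can be sketched as follows; the array names, the toy values and the choice of a one-day difference are illustrative assumptions, not taken from the paper:

import numpy as np

# Hypothetical daily series of equal length (one value per day).
temp = np.array([21.0, 23.5, 22.0, 25.0])
humidity = np.array([60.0, 55.0, 58.0, 50.0])
load = np.array([980.0, 1010.0, 995.0, 1040.0])

# First differences: the "change" inputs the abstract refers to.
d_temp = np.diff(temp)          # change in temperature between consecutive days
d_humidity = np.diff(humidity)  # change in humidity
d_load = np.diff(load)          # change in past load

# One row per day (from day 2 on), one column per differenced feature.
X = np.column_stack([d_temp, d_humidity, d_load])
y = load[1:]                    # target: the day's peak load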
2645 Predicting Global Solar Radiation Using Recurrent Neural Networks and Climatological Parameters
Authors: Rami El-Hajj Mohamad, Mahmoud Skafi, Ali Massoud Haidar
Abstract:
Several meteorological parameters were used for the prediction of monthly average daily global solar radiation on a horizontal surface using recurrent neural networks (RNNs). Climatological data and measurements, mainly air temperature, humidity, sunshine duration, and wind speed between 1995 and 2007, were used to design and validate feed-forward and recurrent neural network based prediction systems. In this paper we present our reference system based on a feed-forward multilayer perceptron (MLP) as well as the proposed approach based on an RNN model. The obtained results were promising and comparable to those obtained by other existing empirical and neural models. The experimental results showed the advantage of RNNs over simple MLPs when dealing with time series solar radiation predictions based on daily climatological data.
Keywords: Recurrent Neural Networks, Global Solar Radiation, Multi-layer perceptron, gradient, Root Mean Square Error.
2644 Improved Exponential Stability Analysis for Delayed Recurrent Neural Networks
Authors: Miaomiao Yang, Shouming Zhong
Abstract:
This paper studies the problem of exponential stability analysis for recurrent neural networks with time-varying delay. By establishing a suitable augmented Lyapunov-Krasovskii functional, a novel sufficient condition is obtained to guarantee the exponential stability of the considered system. In order to obtain less conservative conditions, zero equalities and the reciprocally convex approach are employed. The exponential stability criteria proposed in this paper are simpler and more effective. A numerical example is provided to demonstrate the feasibility and effectiveness of our results.
Keywords: Exponential stability, Neural networks, Linear matrix inequality, Lyapunov-Krasovskii, Time-varying delay.
2643 Augmented Lyapunov Approach to Robust Stability of Discrete-time Stochastic Neural Networks with Time-varying Delays
Authors: Shu Lü, Shouming Zhong, Zixin Liu
Abstract:
In this paper, the robust exponential stability problem of discrete-time uncertain stochastic neural networks with time-varying delays is investigated. By introducing a new augmented Lyapunov function, some delay-dependent stability results are obtained in terms of the linear matrix inequality (LMI) technique. Compared with some existing results in the literature, the conservatism of the new criteria is reduced notably. Three numerical examples are provided to demonstrate the reduced conservatism and effectiveness of the proposed method.
Keywords: Robust exponential stability, delay-dependent stability, discrete-time neural networks, stochastic, time-varying delays.
2642 pth Moment Exponential Synchronization of a Class of Chaotic Neural Networks with Mixed Delays
Authors: Zixin Liu, Shu Lü, Shouming Zhong, Mao Ye
Abstract:
This paper studies the pth moment exponential synchronization of a class of stochastic neural networks with mixed delays. Based on Lyapunov stability theory, by establishing a new integrodifferential inequality with mixed delays, several sufficient conditions have been derived to ensure the pth moment exponential stability for the error system. The criteria extend and improve some earlier results. One numerical example is presented to illustrate the validity of the main results.
Keywords: pth Moment Exponential synchronization, Stochastic, Neural networks, Mixed time delays
2641 Almost Periodicity in a Harvesting Lotka-Volterra Recurrent Neural Networks with Time-Varying Delays
Authors: Yongzhi Liao
Abstract:
By using the theory of exponential dichotomy and the Banach fixed point theorem, this paper is concerned with the existence and uniqueness of a positive almost periodic solution for delayed Lotka-Volterra recurrent neural networks with harvesting terms. To a certain extent, our work in this paper corrects some results from recent years. Finally, an example is given to illustrate the feasibility and effectiveness of the main result.
Keywords: positive almost periodic solution, Lotka-Volterra, neural networks, Banach fixed point theorem, harvesting
2640 Novel Delay-Dependent Stability Criteria for Uncertain Discrete-Time Stochastic Neural Networks with Time-Varying Delays
Authors: Mengzhuo Luo, Shouming Zhong
Abstract:
This paper investigates the problem of exponential stability for a class of uncertain discrete-time stochastic neural networks with time-varying delays. By constructing a suitable Lyapunov-Krasovskii functional and combining stochastic stability theory with the free-weighting matrix method, a delay-dependent exponential stability criterion is obtained in terms of LMIs. Compared with some previous results, the new conditions obtained in this paper are less conservative. Finally, two numerical examples are exploited to show the usefulness of the derived results.
Keywords: Delay-dependent stability, Neural networks, Time varying delay, Linear matrix inequality (LMI).
2639 Evolutionary Training of Hybrid Systems of Recurrent Neural Networks and Hidden Markov Models
Authors: Rohitash Chandra, Christian W. Omlin
Abstract:
We present a hybrid architecture of recurrent neural networks (RNNs) inspired by hidden Markov models (HMMs). We train the hybrid architecture using genetic algorithms to learn and represent dynamical systems. We train the hybrid architecture on a set of deterministic finite-state automata strings and observe its generalization performance when presented with a new set of strings that were not present in the training data set. In this way, we show that the hybrid system of HMM and RNN can learn and represent deterministic finite-state automata. We ran experiments with different population sizes in the genetic algorithm, and we also ran experiments to find out which weight initializations were best for training the hybrid architecture. The results show that the hybrid architecture of recurrent neural networks inspired by hidden Markov models can learn and represent dynamical systems. The best training and generalization performance is achieved when the hybrid architecture is initialized with random real weight values in the range -15 to 15.
Keywords: Deterministic finite-state automata, genetic algorithm, hidden Markov models, hybrid systems and recurrent neural networks.
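The genetic-algorithm training loop is not spelled out in the abstract; a toy sketch of the evolutionary step it describes, with a placeholder fitness function and made-up population size and chromosome length, reusing only the reported weight-initialization range of -15 to 15, could look like this:

import numpy as np

rng = np.random.default_rng(0)
POP_SIZE, N_WEIGHTS, N_GENERATIONS = 30, 50, 100   # illustrative values only

def fitness(weights):
    # Placeholder: in the paper this would be the string-classification
    # accuracy of the hybrid HMM/RNN decoded from `weights`.
    return -np.sum(weights ** 2)

# Initialize real-valued chromosomes in the reported range [-15, 15].
population = rng.uniform(-15.0, 15.0, size=(POP_SIZE, N_WEIGHTS))

for _ in range(N_GENERATIONS):
    scores = np.array([fitness(ind) for ind in population])
    parents = population[np.argsort(scores)[-POP_SIZE // 2:]]   # keep the fitter half
    # Recombine random parent pairs and add Gaussian mutation.
    idx = rng.integers(0, len(parents), size=(POP_SIZE, 2))
    alpha = rng.random((POP_SIZE, 1))
    population = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]
    population += rng.normal(0.0, 0.5, size=population.shape)

best = population[np.argmax([fitness(ind) for ind in population])]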
2638 A New Technique for Solar Activity Forecasting Using Recurrent Elman Networks
Authors: Salvatore Marra, Francesco C. Morabito
Abstract:
In this paper we present an efficient approach for the prediction of two sunspot-related time series, namely the Yearly Sunspot Number and the IR5 Index, that are commonly used for monitoring solar activity. The method is based on exploiting partially recurrent Elman networks and it can be divided into three main steps: the first consists in a "de-rectification" of the time series under study in order to obtain a new time series whose appearance, similar to a sum of sinusoids, can be modelled by our neural networks much better than the original dataset. After that, we normalize the de-rectified data so that they have zero mean and unit standard deviation and, finally, train an Elman network with only one input, a recurrent hidden layer and one output, using a back-propagation algorithm with variable learning rate and momentum. The achieved results have shown the efficiency of this approach which, although very simple, can perform better than most of the existing solar activity forecasting methods.
Keywords: Elman neural networks, sunspot, solar activity, time series prediction.
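A compressed sketch of the second and third steps (zero-mean, unit-variance normalization, then a one-input, one-output Elman-style network) is given below; it uses a Keras SimpleRNN as a stand-in for the Elman layer, the synthetic series, window length, hidden size and fixed learning rate are invented, and the paper's "de-rectification" step is not reproduced:

import numpy as np
import tensorflow as tf

series = np.sin(np.linspace(0, 20, 300))          # stand-in for the de-rectified sunspot series
series = (series - series.mean()) / series.std()  # zero mean, unit standard deviation

WINDOW = 12  # illustrative input window
X = np.array([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])[..., None]
y = series[WINDOW:]

model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW, 1)),      # one input feature per time step
    tf.keras.layers.SimpleRNN(10),          # Elman-style recurrent hidden layer
    tf.keras.layers.Dense(1),               # single output
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9), loss="mse")
model.fit(X, y, epochs=5, verbose=0)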
2637 Synchronization for Impulsive Fuzzy Cohen-Grossberg Neural Networks with Time Delays under Noise Perturbation
Authors: Changzhao Li, Juan Zhang
Abstract:
In this paper, we investigate a class of fuzzy Cohen-Grossberg neural networks with time delays and impulsive effects. By virtue of stochastic analysis and the Halanay inequality for stochastic differential equations, we find sufficient conditions for the global exponential square-mean synchronization of the FCGNNs under noise perturbation. In particular, the traditional assumption on the differentiability of the time-varying delays is no longer needed. Finally, a numerical example is given to show the effectiveness of the results in this paper.
Keywords: Fuzzy Cohen-Grossberg neural networks (FCGNNs), complete synchronization, time delays, impulsive, noise perturbation.
2636 Neural Network Ensemble-based Solar Power Generation Short-Term Forecasting
Authors: A. Chaouachi, R.M. Kamel, R. Ichikawa, H. Hayashi, K. Nagasaka
Abstract:
This paper presents the applicability of artificial neural networks for 24 hour ahead solar power generation forecasting of a 20 kW photovoltaic system; the developed forecasting is suitable for reliable microgrid energy management. In total four neural networks were proposed, namely: a multi-layered perceptron, a radial basis function network, a recurrent network and a neural network ensemble consisting of bagged networks. Forecasting reliability of the proposed neural networks was evaluated in terms of forecasting error performance based on statistical and graphical methods. The experimental results showed that all the proposed networks achieved an acceptable forecasting accuracy. In terms of comparison, the neural network ensemble gives the highest-precision forecasting compared to the conventional networks. In fact, each network of the ensemble over-fits to some extent, which leads to a diversity that enhances the noise tolerance and the forecasting generalization performance compared to the conventional networks.
Keywords: Neural network ensemble, Solar power generation, 24 hour forecasting, Comparative study
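The bagged ensemble described here (several networks, each fitted to a bootstrap resample of the training data, with their forecasts averaged) can be sketched in a few lines; the scikit-learn regressor, the member count and the synthetic data are placeholders, not the configuration used in the paper:

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.random((200, 6))                                   # stand-in forecasting inputs
y = X @ rng.random(6) + 0.1 * rng.standard_normal(200)     # stand-in 24-h-ahead PV output

N_MEMBERS = 10
ensemble = []
for _ in range(N_MEMBERS):
    idx = rng.integers(0, len(X), size=len(X))             # bootstrap resample (bagging)
    member = MLPRegressor(hidden_layer_sizes=(20,), max_iter=500).fit(X[idx], y[idx])
    ensemble.append(member)

# Ensemble forecast = average of the members' forecasts.
y_hat = np.mean([m.predict(X) for m in ensemble], axis=0)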
2635 Crude Oil Price Prediction Using LSTM Networks
Authors: Varun Gupta, Ankit Pandey
Abstract:
The crude oil market is an immensely complex and dynamic environment, and thus the task of predicting changes in such an environment becomes challenging with regard to accuracy. A number of approaches have been adopted to take on that challenge, and machine learning has been at the core of many of them. There are plenty of examples of machine learning algorithms yielding satisfactory results for this type of prediction. In this paper, we have tried to predict crude oil prices using Long Short-Term Memory (LSTM) based recurrent neural networks. We have experimented with different types of models using different epochs, lookbacks and other tuning methods. The results obtained are promising and provide a reasonably accurate prediction of the price of crude oil in the near future.
Keywords: Crude oil price prediction, deep learning, LSTM, recurrent neural networks.
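The "lookback" the abstract experiments with is simply the number of past prices fed to the network for each prediction; a small sketch of how such supervised samples are typically built from a price series (the toy prices and the lookback of 5 are illustrative, not from the paper) is:

import numpy as np

prices = np.array([61.2, 62.0, 60.8, 63.1, 64.5, 63.9, 65.2, 66.0])  # stand-in daily prices
LOOKBACK = 5  # number of past days the LSTM sees per sample

X = np.array([prices[i:i + LOOKBACK] for i in range(len(prices) - LOOKBACK)])
y = prices[LOOKBACK:]              # next-day price to predict
X = X[..., None]                   # reshape to (samples, timesteps, features) for an LSTM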
2634 Delay-Distribution-Dependent Stability Criteria for BAM Neural Networks with Time-Varying Delays
Authors: J.H. Park, S. Lakshmanan, H.Y. Jung, S.M. Lee
Abstract:
This paper is concerned with delay-distribution-dependent stability criteria for bidirectional associative memory (BAM) neural networks with time-varying delays. Based on the Lyapunov-Krasovskii functional and a stochastic analysis approach, a delay-probability-distribution-dependent sufficient condition is derived to achieve global asymptotic mean-square stability of the considered BAM neural networks. The criteria are formulated in terms of a set of linear matrix inequalities (LMIs), which can be checked efficiently by use of some standard numerical packages. Finally, a numerical example and its simulation are given to demonstrate the usefulness and effectiveness of the proposed results.
Keywords: BAM neural networks, Probabilistic time-varying delays, Stability criteria.
2633 A Combined Neural Network Approach to Soccer Player Prediction
Authors: Wenbin Zhang, Hantian Wu, Jian Tang
Abstract:
An artificial neural network is a mathematical model inspired by biological neural networks. There are several kinds of neural networks and they are widely used in many areas, such as prediction, detection, and classification. Meanwhile, in day-to-day life, people often have to make difficult decisions. For example, the coach of a soccer club has to decide which offensive player to select to play in a certain game. This work describes a novel neural network using a combination of the General Regression Neural Network and Probabilistic Neural Networks to help a soccer coach make an informed decision.
Keywords: General Regression Neural Network, Probabilistic Neural Networks, Neural function.
2632 Recurrent Neural Network Based Fuzzy Inference System for Identification and Control of Dynamic Plants
Authors: Rahib Hidayat Abiyev
Abstract:
This paper presents the development of a recurrent neural network based fuzzy inference system for identification and control of a dynamic nonlinear plant. The structure and algorithms of the fuzzy system based on a recurrent neural network are described. To train the unknown parameters of the system, a supervised learning algorithm is used. As a result of learning, the rules of the neuro-fuzzy system are formed. The neuro-fuzzy system is used for the identification and control of the nonlinear dynamic plant. The simulation results of identification and control systems based on the recurrent neuro-fuzzy network are compared with the simulation results of other neural systems. It is found that the recurrent neuro-fuzzy based system has better performance than the others.
Keywords: Fuzzy logic, neural network, neuro-fuzzy system, control system.
2631 The Multi-Layered Perceptrons Neural Networks for the Prediction of Daily Solar Radiation
Authors: Radouane Iqdour, Abdelouhab Zeroual
Abstract:
The Multi-Layered Perceptron (MLP) neural networks have been very successful in a number of signal processing applications. In this work we study the possibilities and the difficulties encountered in applying MLP neural networks to the prediction of daily solar radiation data. We used the Polak-Ribière algorithm for training the neural networks. A comparison, in terms of statistical indicators, with a linear model widely used in the literature is also performed, and the obtained results show that the neural networks are more efficient and give the best results.
Keywords: Daily solar radiation, Prediction, MLP neural networks, linear model
2630 A Fast Neural Algorithm for Serial Code Detection in a Stream of Sequential Data
Authors: Hazem M. El-Bakry, Qiangfu Zhao
Abstract:
In recent years, fast neural networks for object/face detection have been introduced based on cross correlation in the frequency domain between the input matrix and the hidden weights of neural networks. In our previous papers [3,4], fast neural networks for certain code detection were introduced. It was proved in [10] that for fast neural networks to give the same correct results as conventional neural networks, both the weights of the neural networks and the input matrix must be symmetric. This condition made those fast neural networks slower than conventional neural networks. Another symmetric form for the input matrix was introduced in [1-9] to speed up the operation of these fast neural networks. Here, corrections for the cross correlation equations (given in [13,15,16]) to compensate for the symmetry condition are presented. After these corrections, it is proved mathematically that the number of computation steps required by fast neural networks is less than that needed by classical neural networks. Furthermore, there is no need to convert the input data into symmetric form. Moreover, this new idea is applied to increase the speed of neural networks when processing complex values. Simulation results after these corrections using MATLAB confirm the theoretical computations.
Keywords: Fast Code/Data Detection, Neural Networks, Cross Correlation, real/complex values.
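The speed-up discussed here rests on computing cross correlation through the FFT rather than by direct sliding products; a one-dimensional illustration of that identity, with arbitrary example vectors that are unrelated to the corrected equations the paper derives, is:

import numpy as np

x = np.random.rand(64)   # input stream segment
w = np.random.rand(64)   # weight vector of one hidden neuron (same length here for simplicity)

# Direct circular cross-correlation.
direct = np.array([np.dot(np.roll(x, -k), w) for k in range(len(x))])

# Same result via the frequency domain: corr = IFFT( FFT(x) * conj(FFT(w)) ).
fast = np.real(np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(w))))

print(np.allclose(direct, fast))   # True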
2629 Analysis of Multilayer Neural Network Modeling and Long Short-Term Memory
Authors: Danilo López, Nelson Vera, Luis Pedraza
Abstract:
This paper analyzes fundamental ideas and concepts related to neural networks, providing the reader with a theoretical explanation of the operation of Long Short-Term Memory (LSTM) networks, classified as Deep Learning systems, and explicitly presents the mathematical development of the backward-pass equations of the LSTM network model. This mathematical modeling, together with software development, will provide the necessary tools to develop an intelligent system capable of predicting the behavior of licensed users in wireless cognitive radio networks.
Keywords: Neural networks, multilayer perceptron, long short-term memory, recurrent neural network, mathematical analysis.
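For orientation, the forward-pass (gate) equations of a standard LSTM cell, whose backward pass the paper derives, are the following (standard notation; the paper's own symbols may differ):

f_t = \sigma(W_f x_t + U_f h_{t-1} + b_f)
i_t = \sigma(W_i x_t + U_i h_{t-1} + b_i)
o_t = \sigma(W_o x_t + U_o h_{t-1} + b_o)
\tilde{c}_t = \tanh(W_c x_t + U_c h_{t-1} + b_c)
c_t = f_t \odot c_{t-1} + i_t \odot \tilde{c}_t
h_t = o_t \odot \tanh(c_t)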
2628 Efficient System for Speech Recognition using General Regression Neural Network
Authors: Abderrahmane Amrouche, Jean Michel Rouvaen
Abstract:
In this paper we present an efficient system for speaker-independent speech recognition based on a neural network approach. The proposed architecture comprises two phases: a preprocessing phase, which consists of segmental normalization and feature extraction, and a classification phase, which uses neural networks based on nonparametric density estimation, namely the General Regression Neural Network (GRNN). The relative performance of the proposed model is compared to similar recognition systems based on the Multilayer Perceptron (MLP), the Recurrent Neural Network (RNN) and the well-known discrete Hidden Markov Model (HMM-VQ), which we have also implemented. Experimental results obtained with Arabic digits have shown that the use of nonparametric density estimation with an appropriate smoothing factor (spread) improves the generalization power of the neural network. The word error rate (WER) is reduced significantly over the baseline HMM method. GRNN computation is a successful alternative to the other neural networks and the DHMM.
Keywords: Speech Recognition, General Regression Neural Network, Hidden Markov Model, Recurrent Neural Network, Arabic Digits.
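The general regression neural network mentioned here produces its output as a kernel-weighted average of the training targets, with the smoothing factor sigma (the "spread") controlling the kernel width; in its standard form (generic notation, not necessarily the authors' symbols):

\hat{y}(x) = \frac{\sum_{i=1}^{n} y_i \exp\left(-\|x - x_i\|^2 / 2\sigma^2\right)}{\sum_{i=1}^{n} \exp\left(-\|x - x_i\|^2 / 2\sigma^2\right)}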