Search results for: probability distributions
679 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy
Authors: Zviad Ghadua, Biswa Bhattacharya
Abstract:
The Flash Flood Guidance (FFG) provides the rainfall amount of a given duration necessary to cause flooding. The approach is based on the development of rainfall-runoff curves, from which the flood-producing rainfall amount is read off. An alternative approach, tested mostly on Italian Alpine catchments, is based on determining threshold discharges from past events and on checking whether an oncoming flood exceeds the critical discharge thresholds found beforehand. Both approaches suffer from large uncertainties in forecasting flash floods because, due to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty raises the question of whether a probabilistic model is preferable to a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach to flash flood forecasting. A prior probability of flooding is derived from historical data. Additional information, such as the antecedent moisture condition (AMC) and rainfall amounts above given thresholds, is used to compute the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed from the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy. From the promising results obtained, we conclude that the Bayesian approach provides more realistic flash flood forecasts than the FFG.
Keywords: Flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina.
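As an illustrative sketch of the Bayesian update the abstract describes (not the authors' implementation; all counts and likelihoods below are invented for illustration), the posterior probability of flooding given the two pieces of evidence can be computed as:

```python
# Hypothetical historical record (illustrative numbers, not Posina data).
n_events = 200                 # rainfall events on record
n_floods = 18                  # of which produced a flash flood
prior = n_floods / n_events    # P(flood) from history

# Evidence: rainfall above a threshold (R) and wet antecedent moisture (AMC).
# Assumed conditional probabilities P(evidence | flood) / P(evidence | no flood).
p_R_flood, p_R_no = 0.90, 0.25
p_AMC_flood, p_AMC_no = 0.70, 0.30

# Naive-Bayes style combination of the two observations.
like_flood = p_R_flood * p_AMC_flood
like_no = p_R_no * p_AMC_no
posterior = like_flood * prior / (like_flood * prior + like_no * (1 - prior))
print(f"P(flood | R above threshold, wet AMC) = {posterior:.3f}")
```

Varying the assumed likelihoods with rainfall depth and AMC would reproduce the kind of posterior-versus-rainfall curves the abstract refers to.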
678 Stochastic Risk Analysis Framework for Building Construction Projects
Authors: Abdulkadir Abu Lawal
Abstract:
The study was carried out to establish the probability density function of some selected building construction projects of similar complexity delivered using Bill of Quantities (BQ) and Lump Sum (LS) forms of contract, and to draw a reliability scenario for each form of contract. Thirty such delivered projects are analyzed for each contract form using Weibull analysis, and their Weibull parameters (α and β) are determined from their completion times. For projects delivered under the BQ form of contract, α is calculated as 1.6737E20 and β as +0.0115; for the LS form, α is found to be 5.6556E03 and β is determined as +0.4535. Using these values, the respective probability density functions are calculated and plotted as a handy tool for risk analysis of future projects of similar characteristics. By input of variables from other projects, decisions can be made for a whole project or its components using EVM analysis in project evaluation and review techniques. Although this framework, as a quantitative approach, depends on the assumption of normality in project completion times, it can help greatly in determining the completion-time probability for comparable projects using either of the contract forms under consideration. Project aspects that are not amenable to measurement, on the other hand, can be analyzed using fuzzy sets and fuzzy logic. This scenario can be drawn for different types of building construction projects, using different suitable forms of contract in project delivery.
Keywords: Building construction, Projects, Forms of contract, Probability density function, Reliability scenario.
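A minimal sketch of the kind of Weibull analysis the abstract describes, on synthetic completion times (the data, the 60-week deadline and the seed are assumptions; SciPy's shape and scale correspond to the paper's β and α):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
times = rng.weibull(2.0, 30) * 40.0   # hypothetical completion times (weeks)

# Two-parameter Weibull fit with location fixed at zero; "c" is the
# shape (the paper's beta) and "scale" is the paper's alpha.
beta, loc, alpha = stats.weibull_min.fit(times, floc=0)
print(f"shape beta = {beta:.3f}, scale alpha = {alpha:.3f}")

# Probability that a comparable project finishes within a 60-week deadline.
p60 = stats.weibull_min.cdf(60, beta, scale=alpha)
print(f"P(T <= 60 weeks) = {p60:.3f}")
```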
677 Performance of Soft Handover Algorithm in Varied Propagation Environments
Authors: N. P. Singh, Brahmjit Singh
Abstract:
CDMA cellular networks support soft handover, which guarantees the continuity of wireless services and enhanced communication quality. Cellular networks support multimedia services under varied propagation environmental conditions. In this paper, we show the effect of the characteristic parameters of the cellular environment on soft handover performance. We consider the path loss exponent, the standard deviation of shadow fading and the correlation coefficient of shadow fading as the characteristic parameters of the radio propagation environment. A very useful statistical measure for characterizing the performance of a mobile radio system is the probability of outage. It is shown through numerical results that the above parameters have a decisive effect on the probability of outage and hence on the overall performance of the soft handover algorithm.
Keywords: CDMA, Correlation coefficient, Path loss exponent, Probability of outage, Soft handover.
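As a hedged sketch of how the outage probability depends on the path loss exponent and shadow fading deviation (a generic log-distance model with invented link-budget numbers, not the paper's handover simulation):

```python
import numpy as np
from scipy.stats import norm

def outage_probability(d_m, n=3.5, sigma_db=8.0,
                       pt_dbm=30.0, pth_dbm=-100.0, pl0_db=40.0, d0=1.0):
    """P(received power < threshold) under log-distance path loss with
    log-normal shadowing; n is the path loss exponent, sigma_db the
    shadow fading standard deviation (all link-budget numbers assumed)."""
    mean_rx = pt_dbm - (pl0_db + 10.0 * n * np.log10(d_m / d0))
    return norm.sf((mean_rx - pth_dbm) / sigma_db)

for n, sigma in [(3.0, 6.0), (3.5, 8.0), (4.0, 10.0)]:
    p = outage_probability(800.0, n=n, sigma_db=sigma)
    print(f"n={n}, sigma={sigma} dB -> P_out = {p:.4f}")
```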
676 Electromagnetic Field Modeling in Human Tissue
Authors: Iliana Marinova, Valentin Mateev
Abstract:
For investigations of electromagnetic field distributions in biological structures by the Finite Element Method (FEM), a method for automatic 3D model building of human anatomical objects is developed. Models are built from meshed structures with specific electromagnetic material properties for each tissue type. The mesh is built according to specific FEM criteria for achieving good solution accuracy. Several FEM models of anatomical objects are built. A formulation using the magnetic vector potential and the scalar electric potential (A-V, A) is used for modeling electromagnetic fields in human tissue objects. The developed models are suitable for investigating electromagnetic field distributions in human tissues exposed to external fields during magnetic stimulation, defibrillation, impedance tomography, etc.
Keywords: electromagnetic field, finite element method, human tissue.
675 Analysis of GI/M(n)/1/N Queue with Single Working Vacation and Vacation Interruption
Authors: P. Vijaya Laxmi, V. Goswami, V. Suchitra
Abstract:
This paper presents a finite buffer renewal input single working vacation and vacation interruption queue with state dependent services and state dependent vacations, which has a wide range of applications in several areas, including manufacturing and wireless communication systems. Service times during the busy period and during the working vacation, as well as the vacation durations, are exponentially distributed and state dependent. As a result of the finite waiting space, state dependent services and state dependent vacation policies, the analysis of these queueing models needs special attention. We provide a recursive method, using the supplementary variable technique, to compute the stationary queue length distributions at pre-arrival and arbitrary epochs. An efficient computational algorithm for the model is presented, which is fast, accurate and easy to implement. Various performance measures are discussed. Finally, some special cases and numerical results are presented in the form of tables and graphs.
Keywords: State Dependent Service, Vacation Interruption, Supplementary Variable, Single Working Vacation, Blocking Probability.
674 Cognitive Relaying in Interference Limited Spectrum Sharing Environment: Outage Probability and Outage Capacity
Authors: Md Fazlul Kader, Soo Young Shin
Abstract:
In this paper, we consider a cognitive relay network (CRN) in which the primary receiver (PR) is protected by a peak transmit power (P̄ST) and/or peak interference power (Q) constraint. In addition, the interference effect from the primary transmitter (PT) is considered to show its impact on the performance of the CRN. We investigate the outage probability (OP) and outage capacity (OC) of the CRN by deriving closed-form expressions over the Rayleigh fading channel. Results show that both the OP and OC improve with an increasing number of cooperative relay nodes, as well as when the PT is far away from the secondary receiver (SR).
Keywords: Cognitive relay, outage, interference limited, decode-and-forward (DF).
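A Monte Carlo sketch of the outage trend the abstract reports (the closed-form expressions are the paper's contribution; here the powers, the target rate and the unit-mean Rayleigh gains are all assumptions, and best-relay decode-and-forward selection is simulated):

```python
import numpy as np

rng = np.random.default_rng(7)
N = 200_000                            # Monte Carlo trials
P_max, Q, N0, R = 1.0, 0.5, 1.0, 1.0   # powers, noise, target rate (assumed)
gamma_th = 2 ** (2 * R) - 1            # SNR threshold for half-duplex two-hop

def outage(K):
    """Outage estimate for K DF relays with relay selection; Rayleigh
    fading means exponential channel power gains."""
    g_sp = rng.exponential(1.0, N)          # source -> primary receiver
    P_s = np.minimum(P_max, Q / g_sp)       # interference-limited power
    g_sr = rng.exponential(1.0, (K, N))     # source -> relays
    g_rp = rng.exponential(1.0, (K, N))     # relays -> primary receiver
    g_rd = rng.exponential(1.0, (K, N))     # relays -> destination
    P_r = np.minimum(P_max, Q / g_rp)
    snr = np.minimum(P_s * g_sr, P_r * g_rd) / N0   # DF bottleneck per relay
    return np.mean(snr.max(axis=0) < gamma_th)      # best-relay selection

for K in (1, 2, 3):
    print(f"{K} relay(s): OP ~ {outage(K):.4f}")
```

The improvement with the number of relays matches the qualitative behaviour stated in the abstract.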
673 Establishing a Probabilistic Model of Extrapolated Wind Speed Data for Wind Energy Prediction
Authors: Mussa I. Mgwatu, Reuben R. M. Kainkwa
Abstract:
Wind is among the potential energy resources that can be harnessed to generate wind energy for conversion into electrical power. Due to the variability of wind speed with time and height, it is difficult to predict the generated wind energy optimally. In this paper, an attempt is made to establish a probabilistic model fitting the wind speed data recorded at the Makambako site in Tanzania. Wind speed and direction were measured using an anemometer (type AN1) and a wind vane (type WD1), both supplied by Delta-T Devices, at a measurement height of 2 m. Wind speeds were then extrapolated to a height of 10 m using the power law equation with an exponent of 0.47. Data were analysed using MINITAB statistical software to show the variability of wind speed with time and height, and to determine the underlying probability model of the extrapolated wind speed data. The results show that wind speeds at the Makambako site vary cyclically over time and conform to the Weibull probability distribution. From these results, the Weibull probability density function can be used to predict the wind energy.
Keywords: Probabilistic models, wind speed, wind energy.
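A minimal sketch of the extrapolation-and-fit pipeline, assuming synthetic 2 m wind speeds in place of the Makambako record (only the heights and the exponent 0.47 come from the abstract):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
v2 = rng.weibull(2.2, 1000) * 4.0        # hypothetical 2 m wind speeds (m/s)

# Power-law extrapolation from 2 m to 10 m with the paper's exponent 0.47.
v10 = v2 * (10.0 / 2.0) ** 0.47

# Weibull fit of the extrapolated speeds (shape k, scale c).
k, loc, c = stats.weibull_min.fit(v10, floc=0)
print(f"Weibull shape k = {k:.2f}, scale c = {c:.2f} m/s")

# Mean wind power density (W/m^2) via E[v^3], with air density 1.225 kg/m^3.
print(f"power density ~ {0.5 * 1.225 * np.mean(v10**3):.1f} W/m^2")
```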
672 Time Series Simulation by Conditional Generative Adversarial Net
Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto
Abstract:
Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN is capable of learning different types of normal and heavy-tailed distributions, as well as the dependence structures of different time series. It can also generate conditional predictive distributions consistent with the training data distributions. We also provide an in-depth discussion of the rationale behind GAN, interpreting the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of market risk factors. We present a real data analysis, including backtesting, to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis for calculating VaR. CGAN can also be applied to economic time series modeling and forecasting. In this regard, we include an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
Keywords: Conditional Generative Adversarial Net, market and credit risk management, neural network, time series.
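A compact sketch of the CGAN setup described above, written in PyTorch (an assumption; the paper does not name a framework). The toy data, AR(1) paths conditioned on their persistence coefficient, stands in for market time series; network sizes and the training schedule are likewise illustrative:

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, noise_dim=16, cond_dim=1, seq_len=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(noise_dim + cond_dim, 64),
                                 nn.ReLU(), nn.Linear(64, seq_len))
    def forward(self, z, c):
        return self.net(torch.cat([z, c], dim=1))

class Discriminator(nn.Module):
    def __init__(self, seq_len=32, cond_dim=1):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(seq_len + cond_dim, 64),
                                 nn.ReLU(), nn.Linear(64, 1))
    def forward(self, x, c):
        return self.net(torch.cat([x, c], dim=1))

def make_batch(n, T=32):
    """Toy AR(1) paths whose persistence phi is the conditioning variable."""
    phi = torch.rand(n, 1) * 0.9
    x = torch.zeros(n, T)
    for t in range(1, T):
        x[:, t] = phi[:, 0] * x[:, t - 1] + torch.randn(n)
    return x, phi

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()
ones, zeros = torch.ones(128, 1), torch.zeros(128, 1)

for step in range(500):
    x, c = make_batch(128)
    # discriminator step: real vs generated paths under the same condition
    fake = G(torch.randn(128, 16), c).detach()
    loss_d = bce(D(x, c), ones) + bce(D(fake, c), zeros)
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # generator step: try to fool the discriminator
    fake = G(torch.randn(128, 16), c)
    loss_g = bce(D(fake, c), ones)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# conditional predictive sampling: simulate 1000 paths for a chosen phi
with torch.no_grad():
    paths = G(torch.randn(1000, 16), torch.full((1000, 1), 0.8))
print(paths.shape)  # torch.Size([1000, 32])
```

Empirical quantiles of such conditionally generated paths are the ingredient a VaR or ES calculation would consume.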
671 The Research and Application of M/M/1/N Queuing Model with Variable Input Rates, Variable Service Rates and Impatient Customers
Authors: Quanru Pan
Abstract:
How to set service speeds so that a business makes the largest profit is a problem worthy of study, and it is discussed in this paper using queuing theory. An M/M/1/N queuing model with variable input rates, variable service rates and impatient customers is established, and the following results are derived: the stationary distribution of the model; the relationship between the stationary distribution and the probability that n customers remain in the system when a customer leaves (not counting the departing customer); the busy period of the system; the average operating cycle; the loss probability for customers who do not enter the system on arrival; the mean number of customers who leave the system due to impatience; the loss probability for customers not joining the queue due to the limited capacity of the system; and several other indicators. The paper also shows that the following conclusion is not correct: the more customers the business serves, the more profit it will make. Finally, the paper points out the appropriate service speeds the business should maintain to maximize profit.
Keywords: variable input rates, impatient customer, variable service rates, profit maximization.
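Under the model's Markovian assumptions, the stationary distribution of such a queue follows from the standard birth-death balance equations p_n λ_n = p_{n+1} μ_{n+1}. A minimal sketch with invented state-dependent rate functions (the paper's specific rate forms and profit model are not reproduced here):

```python
import numpy as np

def stationary_birth_death(lam, mu):
    """Stationary distribution of an M/M/1/N-type birth-death chain with
    state-dependent arrival rates lam[0..N-1] and service rates mu[1..N]
    (mu[0] unused): p_n = p_0 * prod_{i<n} lam_i / mu_{i+1}."""
    N = len(lam)
    p = np.ones(N + 1)
    for n in range(1, N + 1):
        p[n] = p[n - 1] * lam[n - 1] / mu[n]
    return p / p.sum()

N = 10
# impatience: effective arrival rate drops as the queue grows (assumed form)
lam = np.array([5.0 / (1 + 0.3 * n) for n in range(N)])
# service speed-up with congestion (assumed form)
mu = np.array([0.0] + [4.0 + 0.2 * n for n in range(1, N + 1)])

p = stationary_birth_death(lam, mu)
print(f"blocking probability p_N = {p[-1]:.4f}")
print(f"mean queue length = {(np.arange(N + 1) * p).sum():.3f}")
```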
670 Reliability Evaluation of Composite Electric Power System Based On Latin Hypercube Sampling
Authors: R. Ashok Bakkiyaraj, N. Kumarappan
Abstract:
This paper investigates the suitability of Latin Hypercube Sampling (LHS) for composite electric power system reliability analysis. Each sample generated in LHS is mapped into an equivalent system state and used for evaluating the annualized system and load point indices. A DC load-flow based state evaluation model is solved for each sampled contingency state. The indices evaluated are the loss of load probability, loss of load expectation, expected demand not served and expected energy not supplied. The application of LHS is illustrated through case studies carried out using the RBTS and IEEE-RTS test systems. The results obtained are compared with non-sequential Monte Carlo simulation and state enumeration analytical approaches. An error analysis is also carried out to check the ability of the LHS method to capture the distributions of the reliability indices. It is found that the LHS approach estimates indices closer to the actual values and gives tighter bounds on the indices than non-sequential Monte Carlo simulation.
Keywords: Composite power system, Latin Hypercube sampling, Monte Carlo simulation, Reliability evaluation, Variance analysis.
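A hedged toy comparison of LHS against plain Monte Carlo for a loss-of-load probability estimate (a two-variable stand-in for the paper's DC load-flow state evaluation; the capacity distributions and load level are invented). The smaller run-to-run spread of LHS is the effect the abstract's error analysis examines:

```python
import numpy as np
from scipy import stats
from scipy.stats import qmc

# Toy system: loss of load occurs if the combined available capacity of
# two areas, each modeled as a normal random variable, falls below the load.
load = 150.0
dist1 = stats.norm(100, 20)   # available capacity, area 1 (MW)
dist2 = stats.norm(80, 15)    # available capacity, area 2 (MW)

def lolp(u):                  # map uniform samples -> capacities -> LOLP
    cap = dist1.ppf(u[:, 0]) + dist2.ppf(u[:, 1])
    return np.mean(cap < load)

n, runs_mc, runs_lhs = 2000, [], []
for seed in range(30):
    rng = np.random.default_rng(seed)
    runs_mc.append(lolp(rng.random((n, 2))))
    sampler = qmc.LatinHypercube(d=2, seed=seed)
    runs_lhs.append(lolp(sampler.random(n)))

print(f"MC : mean {np.mean(runs_mc):.4f}  std {np.std(runs_mc):.5f}")
print(f"LHS: mean {np.mean(runs_lhs):.4f}  std {np.std(runs_lhs):.5f}")
```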
669 Forecast Based on an Empirical Probability Function with an Adjusted Error Using Propagation of Error
Authors: Oscar Javier Herrera, Manuel Ángel Camacho
Abstract:
This paper addresses a cutting-edge method of business demand forecasting based on an empirical probability function, applicable when the historical behavior of the data is random. Additionally, it presents error determination based on the numerical technique of propagation of errors. The methodology comprised a characterization and diagnosis of the demand planning process as part of production management; new ways to predict demand through probability techniques, and to calculate the associated error, were then investigated using numerical methods, all based on the behavior of the data. The analysis was carried out considering the specific business circumstances of a company in the communications sector, located in the city of Bogota, Colombia. In conclusion, this application made it possible to obtain the adequate stock of the products required by the company to provide its services, helping the company reduce its service time, increase the client satisfaction rate, reduce stock which had not been in rotation for a long time, code its inventory, and plan reorder points for the replenishment of stock.
Keywords: Demand Forecasting, Empirical Distribution, Propagation of Error.
668 An Optimal Unsupervised Satellite Image Segmentation Approach Based on Pearson System and k-Means Clustering Algorithm Initialization
Authors: Ahmed Rekik, Mourad Zribi, Ahmed Ben Hamida, Mohamed Benjelloun
Abstract:
This paper presents an optimal and unsupervised satellite image segmentation approach based on the Pearson system and k-means clustering algorithm initialization. The method can be considered original in that it utilises the k-means clustering algorithm for an optimal initialisation of the number of image classes on one hand, and exploits the Pearson system for an optimal assignment of statistical distributions to each considered class on the other hand. Satellite image exploitation requires the use of different approaches, especially those founded on the unsupervised statistical segmentation principle. Such approaches necessitate the definition of several parameters, such as the number of image classes, the estimation of class variables and generalised mixture distributions. The use of statistical image attributes gave convincing and promising results, on the condition of having an optimal initialisation step with an appropriate assignment of statistical distributions. The Pearson system, associated with a k-means clustering algorithm and the Stochastic Expectation-Maximization (SEM) algorithm, can be adapted to this problem. For each image class, the Pearson system assigns one distribution type according to different parameters, especially the skewness β1 and the kurtosis β2. The adapted algorithms (the k-means clustering algorithm, the SEM algorithm and the Pearson system algorithm) are then applied to the satellite image segmentation problem. The efficiency of these combined algorithms was validated firstly with the Mean Quadratic Error (MQE) evaluation, and secondly by visual inspection through several comparisons of the unsupervised image segmentations.
Keywords: Unsupervised classification, Pearson system, Satellite image, Segmentation.
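A brief sketch of the initialization idea: k-means provides the initial class partition, and each class's Pearson-system coordinates (β1 = squared skewness, β2 = non-excess kurtosis) indicate which distribution type to assign. The 1-D mixture below is an assumed stand-in for satellite image intensities:

```python
import numpy as np
from scipy import stats
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(3)
# Hypothetical intensity mixture: a Gaussian class and a skewed gamma class.
pixels = np.concatenate([rng.normal(60, 8, 4000),
                         rng.gamma(4.0, 10.0, 4000) + 120])

# Step 1: k-means supplies the initial partition (k assumed known here).
centers, labels = kmeans2(pixels.astype(float), 2, minit='++', seed=1)

# Step 2: per class, the (beta1, beta2) pair locates the class in the
# Pearson diagram; beta1 ~ 0 and beta2 ~ 3 would suggest a Gaussian.
for cls in range(2):
    x = pixels[labels == cls]
    b1 = stats.skew(x) ** 2
    b2 = stats.kurtosis(x, fisher=False)
    print(f"class {cls}: beta1 = {b1:.3f}, beta2 = {b2:.3f}")
```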
667 Creating Maintenance Cost Model for University Buildings
Authors: AbdulLateef A. Olanrewaju, Arazi Idrus, Mohd F. Khamidi
Abstract:
Maintenance costs incurred on buildings differ. The difference can be a result of the type, function, age, building health index, size, form, height, location and complexity of the building. These factors contribute to the difficulty of developing a deterministic maintenance cost model. This paper reports preliminary findings on the creation of building maintenance cost distributions for universities in Malaysia. The study is triggered by the need to provide guidance on maintenance cost distributions for decision making. For this purpose, a questionnaire survey was conducted to investigate the distribution of maintenance costs in the universities. Altogether, responses were received from twenty universities, comprising both privately and publicly owned institutions. The research found that engineering services, roofing and finishes were the elements contributing the larger segment of the maintenance costs. Furthermore, the study indicates the significance of the maintenance cost distribution as a decision-making tool for maintenance management.
Keywords: Performance matrix, university buildings, cost model, Malaysia.
666 Investigation and Calculation of Seismic Reliability of Structures
Authors: Panam Zarfam, Mohsen Javan Pour
Abstract:
Recently, the analysis and design of structures based on reliability theory has been the center of attention. The reason for this attention is the natural randomness of structural parameters such as the material specifications, external loads, geometric dimensions, etc. By means of reliability theory, uncertainties resulting from the statistical nature of the structural parameters can be expressed as mathematical equations, and safety and operational considerations can be taken into account in the design process. According to this theory, it is possible to study the failure probability of not only a specific element but also the entire system. Therefore, after being assured of the safety of every element, their reciprocal effects on the safety of the entire system can be investigated.
Keywords: Probability, Reliability, Statistics, Uncertainty.
665 Confidence Intervals for the Normal Mean with Known Coefficient of Variation
Authors: Suparat Niwitpong
Abstract:
In this paper we propose two new confidence intervals for the normal population mean with known coefficient of variation. This situation occurs commonly in environmental and agricultural experiments, where the scientist knows the coefficient of variation of the experiment. The two intervals are based on the recent work of Searls [5] and on a method proposed in this paper for the first time. We derive analytic expressions for the coverage probability and the expected length of each confidence interval. Monte Carlo simulation is used to assess the performance of these intervals based on their expected lengths.
Keywords: confidence interval, coverage probability, expected length, known coefficient of variation.
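A simulation sketch of how coverage probability and expected length are assessed for such intervals. The interval below is a naive plug-in (σ estimated as τ·x̄, with τ the known CV), not either of the paper's proposals; μ, τ and n are assumptions:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
mu, tau, n, M = 10.0, 0.3, 15, 100_000   # true mean, known CV, sample size
z = norm.ppf(0.975)                      # 95% two-sided critical value

xbar = rng.normal(mu, tau * mu, (M, n)).mean(axis=1)
half = z * tau * xbar / np.sqrt(n)       # plug-in: sigma ~ tau * xbar
cover = np.mean((xbar - half <= mu) & (mu <= xbar + half))
print(f"coverage ~ {cover:.4f}, expected length ~ {2 * np.mean(half):.3f}")
```

Replacing the plug-in interval by each proposed interval and comparing the resulting expected lengths at equal coverage is exactly the comparison the abstract describes.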
664 Performance Verification of Seismic Design Codes for RC Frames
Authors: Payam Asadi, Ali Bakhshi
Abstract:
In this study, a framework for the verification of well-known seismic codes is utilized. To verify seismic code performance, the damage quantity of RC frames is compared with the target performance. Because of the random nature of seismic design and earthquake load excitation, fragility curves are developed in this paper. These diagrams are utilized to evaluate the performance level of structures designed according to the seismic codes. They further illustrate the effect of the load combination and reduction factors of the codes on the probability of damage exceedance. Two types of structures are designed by different seismic codes: very important structures with high ductility, and moderately important structures with intermediate ductility. The results reveal that a lower damage ratio usually generates a lower probability of exceedance. In addition, the findings indicate that buildings with a higher quantity of reinforcement bars can have a higher probability of damage exceedance. Life-cycle cost analysis is utilized for comparison and the final decision-making process.
Keywords: RC frame, fragility curve, performance-base design, life-cycle cost analyses, seismic design codes.
663 Dempster-Shafer Evidence Theory for Image Segmentation: Application in Cells Images
Authors: S. Ben Chaabane, M. Sayadi, F. Fnaiech, E. Brassart
Abstract:
In this paper we propose a new knowledge model using Dempster-Shafer's evidence theory for image segmentation and fusion. The proposed method is composed essentially of two steps. First, mass distributions in Dempster-Shafer theory are obtained from the membership degrees of each pixel covering the three image components (R, G and B). Each membership degree is determined by applying Fuzzy C-Means (FCM) clustering to the gray levels of the three images. Second, the fusion process consists in defining three discernment frames, which are associated with the three images to be fused, and then combining them to form a new frame of discernment. The strategy used to define mass distributions in the combined framework is discussed in detail. The proposed fusion method is illustrated in the context of image segmentation. Experimental investigations and comparative studies with previous methods are carried out, showing the robustness and superiority of the proposed method in terms of image segmentation.
Keywords: Fuzzy C-means, Color image, data fusion, Dempster-Shafer's evidence theory.
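A self-contained sketch of Dempster's rule of combination, the fusion step at the heart of the method (the mass values below are invented; in the paper they would be built from FCM membership degrees on the R, G and B components):

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions given as dicts mapping
    frozensets of class labels (focal elements) to masses summing to 1."""
    combined, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if C:
                combined[C] = combined.get(C, 0.0) + a * b
            else:
                conflict += a * b            # mass on the empty intersection
    return {S: v / (1.0 - conflict) for S, v in combined.items()}, conflict

# Illustrative masses for one pixel from two channels over classes {0, 1, 2}.
m_R = {frozenset({0}): 0.6, frozenset({1}): 0.1, frozenset({0, 1, 2}): 0.3}
m_G = {frozenset({0}): 0.5, frozenset({2}): 0.2, frozenset({0, 1, 2}): 0.3}

m, K = dempster_combine(m_R, m_G)
print(f"conflict K = {K:.3f}")
for S, v in sorted(m.items(), key=lambda kv: -kv[1]):
    print(sorted(S), round(v, 3))
```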
662 Estimation of Time-Varying Linear Regression with Unknown Time-Volatility via Continuous Generalization of the Akaike Information Criterion
Authors: Elena Ezhova, Vadim Mottl, Olga Krasotkina
Abstract:
The problem of estimating time-varying regression is inevitably concerned with the necessity to choose the appropriate level of model volatility, ranging from the full stationarity of instant regression models to their absolute independence of each other. In the stationary case the number of regression coefficients to be estimated equals the number of regressors, whereas the absence of any smoothness assumption augments the dimension of the unknown vector by a factor equal to the time-series length. The Akaike Information Criterion is a commonly adopted means of adjusting a model to a given data set within a succession of nested parametric model classes, but its crucial restriction is that the classes are rigidly defined by the growing integer-valued dimension of the unknown vector. To make the Kullback information maximization principle underlying the classical AIC applicable to the problem of time-varying regression estimation, we extend it to a wider class of data models in which the dimension of the parameter is fixed, but the freedom of its values is softly constrained by a family of continuously nested a priori probability distributions.
Keywords: Time-varying regression, time-volatility of regression coefficients, Akaike Information Criterion (AIC), Kullback information maximization principle.
661 Analysis of Different Combining Schemes of Two Amplify-Forward Relay Branches with Individual Links Experiencing Nakagami Fading
Authors: Babu Sena Paul, Ratnajit Bhattacharjee
Abstract:
Relay-based communication has gained considerable importance in recent years. In this paper we find the end-to-end statistics of a two-hop non-regenerative relay branch, each hop being Nakagami-m faded. Closed-form expressions for the probability density functions of the signal envelope at the output of a selection combiner and a maximal ratio combiner at the destination node are derived, and the analytical formulations are verified through computer simulation. These density functions are useful in evaluating the system performance in terms of bit error rate and outage probability.
Keywords: co-operative diversity, diversity combining, maximal ratio combining, selection combining.
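A Monte Carlo sketch of the simulation check mentioned above, not the paper's closed-form derivation; the Nakagami parameter m, mean power, average SNR and outage threshold are all assumptions (Nakagami-m envelopes give Gamma-distributed power gains):

```python
import numpy as np

rng = np.random.default_rng(11)
N, m, omega = 500_000, 2.0, 1.0     # samples, Nakagami m, mean power gain

def branch_snr(gbar=10.0):
    """End-to-end SNR of one two-hop amplify-and-forward branch, each hop
    Nakagami-m faded; uses the standard non-regenerative relay form."""
    g1 = rng.gamma(m, omega / m, N) * gbar
    g2 = rng.gamma(m, omega / m, N) * gbar
    return g1 * g2 / (g1 + g2 + 1.0)

s1, s2 = branch_snr(), branch_snr()
gamma_sc = np.maximum(s1, s2)       # selection combining of the two branches
gamma_mrc = s1 + s2                 # maximal ratio combining

gth = 5.0                           # outage threshold SNR (assumed)
print(f"P_out SC  ~ {np.mean(gamma_sc < gth):.4f}")
print(f"P_out MRC ~ {np.mean(gamma_mrc < gth):.4f}")
```

Histogramming `np.sqrt(gamma_sc)` or `np.sqrt(gamma_mrc)` gives the empirical envelope densities against which the closed-form expressions would be checked.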
660 Base Change for Fisher Metrics: Case of the q−Gaussian Inverse Distribution
Authors: Gabriel I. Loaiza O., Carlos A. Cadavid M., Juan C. Arango P.
Abstract:
It is known that the Riemannian manifold determined by the family of inverse Gaussian distributions endowed with the Fisher metric has negative constant curvature κ = −1/2, as does the family of usual Gaussian distributions. In the present paper, we first arrive at this result by following a different path, much simpler than the previous ones: we put the family in exponential form, thus endowing it with a new set of parameters, or coordinates, θ1, θ2; we then determine the matrix of the Fisher metric in terms of these parameters; and finally we compute this matrix in the original parameters. Secondly, we define the inverse q−Gaussian distribution family (q < 3) as the family obtained by replacing the usual exponential function with the Tsallis q−exponential function in the expression for the inverse Gaussian distribution, and observe that it supports two possible geometries: the Fisher geometry and the q−Fisher geometry. Finally, we apply our strategy to obtain results on the Fisher and q−Fisher geometry of the inverse q−Gaussian distribution family, similar to those obtained for the inverse Gaussian distribution family.
Keywords: Base of Changes, Information Geometry, Inverse Gaussian distribution, Inverse q-Gaussian distribution, Statistical Manifolds.
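As a companion note (a standard computation in the original parameters, not the authors' θ1, θ2 derivation), the Fisher metric of the inverse Gaussian family IG(μ, λ) can be written out directly:

```latex
% Log-density of IG(\mu,\lambda), up to terms free of the parameters:
%   \ell(\mu,\lambda;x) = \tfrac12\log\lambda
%       - \frac{\lambda(x-\mu)^2}{2\mu^2 x} + \text{const}.
% The scores are
%   \partial_\mu \ell = \frac{\lambda(x-\mu)}{\mu^3}, \qquad
%   \partial_\lambda \ell = \frac{1}{2\lambda} - \frac{(x-\mu)^2}{2\mu^2 x},
% and, using E[X] = \mu and Var(X) = \mu^3/\lambda, the Fisher metric is
g(\mu,\lambda) =
\begin{pmatrix}
\lambda/\mu^{3} & 0 \\
0 & 1/(2\lambda^{2})
\end{pmatrix}.
```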
659 Health Risk Assessment of Heavy Metals Adsorbed in Particulates
Authors: Sadovska V.
Abstract:
The trends in the concentrations of selected heavy metals were assessed at chosen localities in the Moravia region, Czech Republic, from 2007 to 2009. The metals were observed in localities with various zone types and characteristics. Pb, Ni, As and Cd were emphasized as a result of their toxicity and potential adverse health effects on the exposed population. The trends in metal concentrations and their health effects in the most polluted localities were examined. According to the results, the air pollution limit values were not exceeded. Based on the health risk assessment, the probability of developing tumorous diseases is acceptable, except for an increased cancer risk from long-term exposure to As.
Keywords: Air pollution, heavy metals, health risk assessment, individual lifetime cancer risk
658 Data-driven Multiscale Tsallis Complexity: Application to EEG Analysis
Authors: Young-Seok Choi
Abstract:
This work proposes data-driven multiscale quantitative measures to reveal the underlying complexity of the electroencephalogram (EEG), applied to a rodent model of hypoxic-ischemic brain injury and recovery. Motivated by the fact that real EEG recordings are nonlinear and non-stationary over different frequencies or scales, an approach more suitable than conventional single-scale tools is needed for analyzing EEG data. Here, we present a new framework of complexity measures that considers changing dynamics over multiple oscillatory scales. The proposed multiscale complexity is obtained by calculating entropies of the probability distributions of the intrinsic mode functions extracted by empirical mode decomposition (EMD) of the EEG. To quantify EEG recordings of a rat model of hypoxic-ischemic brain injury following cardiac arrest, the multiscale version of Tsallis entropy is examined. To validate the proposed complexity measure, actual EEG recordings from rats (n=9) experiencing 7 minutes of cardiac arrest followed by resuscitation were analyzed. Experimental results demonstrate that the use of multiscale Tsallis entropy leads to better discrimination of injury levels and improved correlations with the neurological deficit evaluation 72 hours after cardiac arrest, suggesting an effective metric as a prognostic tool.
Keywords: Electroencephalogram (EEG), multiscale complexity, empirical mode decomposition, Tsallis entropy.
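A minimal sketch of a multiscale Tsallis entropy: a histogram-based Tsallis entropy computed over several scales. For simplicity, coarse-graining substitutes here for the EMD scale separation the paper uses; the toy signal, the index q and the bin count are assumptions:

```python
import numpy as np

def tsallis_entropy(x, q=2.0, bins=64):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1) of a signal's
    amplitude distribution, estimated from a histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def multiscale_tsallis(x, scales=(1, 2, 4, 8, 16), q=2.0):
    """Coarse-graining (non-overlapping averaging) stands in here for the
    EMD-based scale separation used in the paper."""
    out = []
    for s in scales:
        n = len(x) // s
        xs = x[: n * s].reshape(n, s).mean(axis=1)
        out.append(tsallis_entropy(xs, q))
    return out

rng = np.random.default_rng(5)
eeg_like = (np.cumsum(rng.standard_normal(10_000)) * 0.01
            + rng.standard_normal(10_000))   # toy 1/f-like test signal
print([round(v, 3) for v in multiscale_tsallis(eeg_like)])
```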
657 Adaptive Radio Resource Allocation for Multiple Traffic OFDMA Broadband Wireless Access System
Authors: Lu Yanhui, Zhang Lizhi, Yin Changchuan, Yue Guangxin
Abstract:
In this paper, an adaptive radio resource allocation (RRA) algorithm for a multiple-traffic OFDMA system is proposed, which distributes sub-carriers and loading bits among users according to their different QoS requirements and traffic classes. By classifying and prioritizing the users based on their traffic characteristics and reserving resources for higher priority users, the scheme tremendously decreases the outage probability of users requiring real-time transmission without impact on the spectrum efficiency of the system, while the outage probability of data users is not increased compared with previously published RRA methods.
Keywords: OFDMA, adaptive radio resource allocation, QoS.
656 Complex Condition Monitoring System of Aircraft Gas Turbine Engine
Authors: A. M. Pashayev, D. D. Askerov, C. Ardil, R. A. Sadiqov, P. S. Abdullayev
Abstract:
Research shows that the application of probability-statistical methods is unfounded at the early stages of diagnosing the technical condition of an aviation Gas Turbine Engine (GTE), when the flight information is fuzzy, limited and uncertain. Hence the efficiency of applying the new Soft Computing technology at these diagnosing stages, using Fuzzy Logic and Neural Network methods, is considered. For this purpose, fuzzy multiple linear and non-linear models (fuzzy regression equations), obtained on the basis of statistical fuzzy data, are trained with high accuracy. To make the GTE technical condition model more adequate, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Analyses of the changes in skewness and kurtosis coefficient values characterize the distributions of the GTE work and output parameters of the multiple linear and non-linear generalised models in the presence of measurement noise, estimated with a new recursive Least Squares Method (LSM). The developed GTE condition monitoring system provides stage-by-stage estimation of engine technical conditions. As an application of the given technique, the technical condition of a newly operating aviation engine was estimated.
Keywords: aviation gas turbine engine, technical condition, fuzzy logic, neural networks, fuzzy statistics.
655 Computing Transition Intensity Using Time-Homogeneous Markov Jump Process: Case of South African HIV/AIDS Disposition
Authors: A. Bayaga
Abstract:
This research provides a technical account of estimating transition intensities using a time-homogeneous Markov jump process, applied to South African HIV/AIDS data from Statistics South Africa. It employs a Maximum Likelihood Estimator (MLE) model to explore the possible influence of transition probabilities on mortality cases, with the data based on actual Statistics South Africa records. This was conducted via an integrated demographic and epidemiological model of the South African HIV/AIDS epidemic. The model was fitted to age-specific HIV prevalence data and recorded death data using the MLE model. Though previous model results suggest that HIV prevalence and AIDS mortality rates in South Africa declined over 2002-2013, our results differ evidently from the generally accepted HIV models (Spectrum/EPP and ASSA2008) in South Africa. However, supplementary research is needed to enhance the demographic parameters in the model, and to apply it to each of the nine provinces of South Africa.
Keywords: AIDS mortality rates, Epidemiological model, Time-homogeneous Markov Jump Process, Transition Probability, Statistics South Africa.
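A sketch of the core estimator for a time-homogeneous Markov jump process: intensities are estimated as observed transition counts divided by exposure time, and transition probabilities then follow from the matrix exponential. The three-state structure and all counts below are hypothetical, not Statistics South Africa figures:

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical sufficient statistics for a 3-state progression model
# (susceptible -> infected -> dead): N[i, j] counts observed i -> j jumps,
# T[i] is total time spent in state i (person-years).
N = np.array([[0, 40, 5],
              [0, 0, 25],
              [0, 0, 0]], dtype=float)
T = np.array([800.0, 300.0, 1.0])   # T[2] is a placeholder: no exits
                                    # are observed from the absorbing state

# MLE of the transition intensities: q_ij = N_ij / T_i for i != j,
# with the diagonal set so that each row sums to zero.
Q = N / T[:, None]
np.fill_diagonal(Q, -Q.sum(axis=1))

# Transition probability matrix over a horizon t: P(t) = expm(Q t).
for t in (1.0, 5.0):
    P = expm(Q * t)
    print(f"t = {t}: P[0] = {np.round(P[0], 3)}")
```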
654 Three Dimensional Numerical Simulation of a Full Scale CANDU Reactor Moderator to Study Temperature Fluctuations
Authors: A. Sarchami, N. Ashgriz, M. Kwee
Abstract:
Three-dimensional numerical simulations are conducted on a full scale CANDU moderator, and transient variations of the temperature and velocity distributions inside the tank are determined. The results show that the flow and temperature distributions inside the moderator tank are three dimensional and no symmetry plane can be identified. Competition between the upward-moving buoyancy-driven flows and the downward-moving momentum-driven flows results in the formation of circulation zones. The moderator tank operates in the buoyancy-driven mode, and any small disturbance in the flow or temperature makes the system unstable and asymmetric. Different types of temperature fluctuations are noted inside the tank: (i) large-amplitude fluctuations at the boundaries between the hot and cold regions; (ii) low-amplitude fluctuations in the core of the tank; (iii) high-frequency fluctuations in the regions with high velocities; and (iv) low-frequency fluctuations in the regions with lower velocities.
Keywords: Bruce, Fluctuations, Numerical, Temperature, Thermal hydraulics
653 Life Experiences are Important Factors of Making Stronger SOC (Sense of Coherence) on the Workers in Tsukuba Research Park City (TRPC)
Authors: Shinichiro Sasahara, Yusuke Tomotsune, Yuichi Ohi, Shun Suzuki, Akihiro Seki, Junko Sakano, Yoshihiko Yamazaki, Ichiyo Matsuzaki
Abstract:
Via a large-scale cross-sectional study among Japanese white-collar workers, the authors aimed to elucidate: (1) the distributions of Sense of Coherence (SOC), which reflects stress coping ability; (2) the distributions of life experiences; and (3) the association between SOC and life experiences. Anonymous self-administered questionnaires were sent to 15,891 employees in 2001 and 21,922 employees in 2011 at educational and research institutions in Tsukuba Research Park City. A total of 5,868 (36.9%) and 9,528 (43.5%) workers, respectively, completed and returned the questionnaire; 5,715 and 9,515 workers, respectively, without missing data were analyzed. SOC scale scores differed by gender, age, and other demographic features in both study years. Among the life experiences, workers who had been through parenting or a management position had higher SOC scale scores adjusted by gender and age. Life experiences that workers have come through could thus develop a stronger SOC over their life course.
Keywords: field study, life experience, mental health, SOC (sense of coherence)
652 Reliability Based Performance Evaluation of Stone Column Improved Soft Ground
Authors: A. GuhaRay, C. V. S. P. Kiranmayi, S. Rudraraju
Abstract:
The present study considers the effect of variation of different geotechnical random variables in the design of stone column-foundation systems for assessing the bearing capacity and consolidation settlement of highly compressible soil. The soil and stone column properties, and the spacing, diameter and arrangement of the stone columns, are considered as the random variables. The probability of failure (Pf) is computed for a target degree of consolidation and a target safe load by Monte Carlo Simulation (MCS). The study shows that the variations in the coefficient of radial consolidation (cr) and the cohesion of the soil (cs) are the two most important factors influencing Pf. If the coefficient of variation (COV) of cr exceeds 20%, Pf exceeds 0.001, which is unsafe according to the guidelines of the US Army Corps of Engineers. The probability of failure in bearing capacity also exceeds its safe value for COV of cs > 30%. It is also observed that as the spacing between the stone columns increases, the probability of reaching a target degree of consolidation decreases. Accordingly, design guidelines considering both the consolidation and the bearing capacity of the improved ground are proposed for different spacings and diameters of stone columns and different geotechnical random variables.
Keywords: Bearing capacity, consolidation, geotechnical random variables, probability of failure, stone columns.
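A hedged Monte Carlo sketch of how such a probability of failure is estimated from geotechnical random variables. The capacity and consolidation expressions below are crude stand-ins for the paper's stone-column models, and every mean and COV is an assumption:

```python
import numpy as np

rng = np.random.default_rng(21)
M = 200_000                        # Monte Carlo realizations

def lognorm(mean, cov, size):
    """Lognormal samples with the given mean and coefficient of variation."""
    s = np.sqrt(np.log(1 + cov ** 2))
    return rng.lognormal(np.log(mean) - 0.5 * s ** 2, s, size)

c_s = lognorm(25.0, 0.30, M)       # soil cohesion (kPa), assumed stats
c_r = lognorm(2.0, 0.20, M)        # coeff. of radial consolidation (m^2/yr)

# Crude stand-in models: undrained capacity ~ N_c * c, and an exponential
# consolidation law with time and drain geometry folded into the constant.
q_ult = 9.0 * c_s                  # bearing capacity (kPa)
U = 1.0 - np.exp(-8.0 * c_r / 4.0) # degree of consolidation at design time

q_applied, U_target = 120.0, 0.90
pf = np.mean((q_ult < q_applied) | (U < U_target))
print(f"P_f ~ {pf:.4f} (compare with the 1e-3 guideline cited above)")
```

Sweeping the assumed COVs through the ranges in the abstract is what produces the kind of design guidance the paper proposes.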
651 Approximate Confidence Interval for Effect Size Based on Bootstrap Resampling Method
Authors: S. Phanyaem
Abstract:
This paper presents confidence intervals for the effect size based on the bootstrap resampling method. A meta-analytic confidence interval for the effect size that is easy to compute is proposed. A Monte Carlo simulation study was conducted to compare the performance of the proposed confidence intervals with the existing confidence intervals. The best confidence interval method will have a coverage probability close to 0.95. Simulation results show that our proposed confidence intervals perform well in terms of coverage probability and expected length.
Keywords: Effect size, confidence interval, Bootstrap Method.
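A minimal percentile-bootstrap sketch for an effect size (Cohen's d here, as an assumed choice of effect size measure; the paper's meta-analytic interval is not reproduced):

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.normal(0.5, 1.0, 40)      # treatment group (hypothetical data)
y = rng.normal(0.0, 1.0, 40)      # control group

def cohens_d(a, b):
    """Standardized mean difference with a pooled standard deviation."""
    sp = np.sqrt(((len(a) - 1) * a.var(ddof=1) + (len(b) - 1) * b.var(ddof=1))
                 / (len(a) + len(b) - 2))
    return (a.mean() - b.mean()) / sp

B = 5000                           # bootstrap resamples (with replacement)
boot = np.array([cohens_d(rng.choice(x, len(x)), rng.choice(y, len(y)))
                 for _ in range(B)])
lo, hi = np.percentile(boot, [2.5, 97.5])   # percentile bootstrap 95% CI
print(f"d = {cohens_d(x, y):.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

Repeating this over many simulated data sets and counting how often the interval covers the true effect size is the coverage study the abstract describes.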
650 Convective Heat Transfer Enhancement in an Enclosure with Fin Utilizing Nano Fluids
Authors: S. H. Anilkumar, Ghulam Jilani
Abstract:
The objective of the present work is to conduct investigations leading to a more complete explanation of single-phase natural convective heat transfer in an enclosure with a fin utilizing nanofluids. The nanofluid used, which is composed of aluminum oxide nanoparticles in suspension in ethylene glycol, is provided at various volume fractions. The study is carried out numerically for a range of Rayleigh numbers, fin heights and aspect ratios. The flow and temperature distributions are taken to be two-dimensional. Regions with the same velocity and temperature distributions are identified as symmetric sections. One half of such a rectangular region is chosen as the computational domain, taking into account the symmetry about the fin. Transport equations are modeled by a stream function-vorticity formulation and are solved numerically by finite-difference schemes. Comparisons with previously published work are made on the basis of special cases. Results are presented in the form of streamline, vector and isotherm plots, as well as the variation of the local Nusselt number along the fin under different conditions.
Keywords: Fin height, Nano fluid, natural convection, Rayleigh number.