Search results for: maximum likelihood estimator (MLE)
4608 X̄ and S Control Charts Based on Weighted Standard Deviation Method
Authors: Derya Karagöz
Abstract:
A Shewhart chart based on the normality assumption is not appropriate for skewed distributions, since its Type-I error rate is inflated. This study presents X̄ and S control charts for monitoring process variability under skewed distributions. We propose weighted standard deviation (WSD) X̄ and S control charts. The WSD estimator is used to estimate the process standard deviation for these charts because it is simple and easy to compute. Unlike the Shewhart control chart, the proposed charts construct asymmetric upper and lower limits in accordance with the direction and degree of skewness. The performances of the proposed charts are compared with other heuristic charts for skewed distributions in a simulation study. The simulation results show that the proposed control charts have good properties for skewed distributions and large sample sizes.
Keywords: weighted standard deviation, MAD, skewed distributions, S control charts
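To make the WSD construction concrete: the proportion of observations falling below the mean splits the usual 3-sigma band into asymmetric halves. A minimal sketch, assuming the common WSD form UCL = X̄ + 3·2p̂·σ̂ and LCL = X̄ − 3·2(1−p̂)·σ̂, and ignoring subgroup structure and chart constants:

```python
import numpy as np

def wsd_xbar_limits(x, L=3.0):
    """Asymmetric X-bar limits via the weighted standard deviation (WSD) method.

    The overall standard deviation is split into an upper and a lower part
    according to p = P(X <= mean), so the limits follow the skew of the data.
    """
    xbar = x.mean()
    s = x.std(ddof=1)
    p = np.mean(x <= xbar)              # estimated P(X <= mean)
    ucl = xbar + L * 2.0 * p * s        # widened when the right tail is heavy
    lcl = xbar - L * 2.0 * (1 - p) * s
    return lcl, xbar, ucl

rng = np.random.default_rng(1)
data = rng.weibull(1.5, size=500)       # right-skewed process data
print(wsd_xbar_limits(data))
```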
Procedia PDF Downloads 399

4607 Classification of Health Risk Factors to Predict the Risk of Falling in Older Adults
Authors: L. Lindsay, S. A. Coleman, D. Kerr, B. J. Taylor, A. Moorhead
Abstract:
Cognitive decline and frailty are apparent in older adults, leading to an increased risk of falling. Currently, health care professionals have to make professional decisions regarding such risks, and hence make difficult decisions regarding the future welfare of the ageing population. This study uses health data from The Irish Longitudinal Study on Ageing (TILDA), focusing on adults over the age of 50 years, in order to analyse health risk factors and predict the likelihood of falls. This prediction is based on machine learning algorithms whereby health risk factors are used as inputs to predict the likelihood of falling. Initial results show that health risk factors such as long-term health issues contribute to the number of falls. The identification of such health risk factors has the potential to inform health and social care professionals, older people and their family members in order to mitigate daily living risks.
Keywords: classification, falls, health risk factors, machine learning, older adults
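The abstract does not name the algorithms used; as a hedged illustration of the setup (health risk factors in, fall/no-fall out), a cross-validated random forest on synthetic, TILDA-like features might look as follows. All feature names and coefficients are hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 1000
# Hypothetical risk factors: age, number of long-term conditions, grip strength
X = np.column_stack([
    rng.integers(50, 90, n),          # age in years
    rng.poisson(1.5, n),              # long-term health issues
    rng.normal(30, 8, n),             # grip strength (kg)
])
# Synthetic outcome: fall probability rises with age and chronic conditions
logit = -8 + 0.08 * X[:, 0] + 0.5 * X[:, 1] - 0.05 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())
```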
Procedia PDF Downloads 150

4606 New Segmentation of Piecewise Linear Regression Models Using Reversible Jump MCMC Algorithm
Authors: Suparman
Abstract:
Piecewise linear regression models are very flexible models for modeling data. When a piecewise linear regression model is fitted to data, its parameters are generally unknown. This paper studies the problem of parameter estimation for piecewise linear regression models using the Bayesian method. However, the Bayes estimator cannot be found analytically. To overcome this problem, the reversible jump MCMC algorithm is proposed. The reversible jump MCMC algorithm generates a Markov chain that converges to the posterior distribution of the parameters of the piecewise linear regression model. The resulting Markov chain is used to calculate the Bayes estimator for those parameters.
Keywords: regression, piecewise, Bayesian, reversible jump MCMC
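A full reversible jump sampler is beyond a snippet, but its fixed-dimension ingredient, the segment-wise Gaussian log-likelihood given candidate knots plus a Metropolis move on a knot location, can be sketched as follows (the birth/death moves that change the number of knots are omitted):

```python
import numpy as np

def seg_loglik(x, y, knots, sigma2=1.0):
    """Gaussian log-likelihood of a piecewise linear fit with given knots."""
    edges = np.concatenate(([x.min() - 1e-9], np.sort(knots), [x.max()]))
    ll = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (x > lo) & (x <= hi)
        if m.sum() < 2:
            return -np.inf                    # segment too small to fit a line
        A = np.column_stack([np.ones(m.sum()), x[m]])
        beta, *_ = np.linalg.lstsq(A, y[m], rcond=None)
        r = y[m] - A @ beta
        ll += -0.5 * np.sum(r**2) / sigma2 - 0.5 * m.sum() * np.log(2 * np.pi * sigma2)
    return ll

# Metropolis move on one knot (the fixed-dimension part of the sampler)
rng = np.random.default_rng(2)
x = np.sort(rng.uniform(0, 10, 200))
y = np.where(x < 4, 1 + 2 * x, 9 - 0.5 * (x - 4)) + rng.normal(0, 1, 200)
knot = np.array([5.0])
for _ in range(2000):
    prop = knot + rng.normal(0, 0.3, 1)
    if seg_loglik(x, y, prop) - seg_loglik(x, y, knot) > np.log(rng.random()):
        knot = prop
print(knot)  # should concentrate near the true changepoint at 4
```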
Procedia PDF Downloads 521

4605 Bayesian Parameter Inference for Continuous Time Markov Chains with Intractable Likelihood
Authors: Randa Alharbi, Vladislav Vyshemirsky
Abstract:
Systems biology is an important field in science which focuses on studying the behaviour of biological systems. Modelling is required to produce a detailed description of the elements of a biological system, their function, and their interactions. A well-designed model requires selecting a suitable mechanism which can capture the main features of the system, define its essential components, and represent an appropriate law that can define the interactions between those components. Complex biological systems exhibit stochastic behaviour, so probabilistic models are suitable for describing and analysing them. The Continuous-Time Markov Chain (CTMC) is one such probabilistic model; it describes the system as a set of discrete states with continuous-time transitions between them. The system is then characterised by a set of probability distributions that describe the transition from one state to another at a given time. The evolution of these probabilities through time can be obtained from the chemical master equation, which is analytically intractable but can be simulated. Uncertain parameters of such a model can be inferred using methods of Bayesian inference. Yet, inference in such a complex system is challenging, as it requires the evaluation of the likelihood, which is intractable in most cases. There are different statistical methods that allow simulating from the model despite the intractability of the likelihood. Approximate Bayesian computation is a common approach for tackling inference which relies on simulation of the model to approximate the intractable likelihood. Particle Markov chain Monte Carlo (PMCMC) is another approach, based on using sequential Monte Carlo to estimate the intractable likelihood. However, both methods are computationally expensive. In this paper we discuss the efficiency and possible practical issues of each method, taking their computational time into account. We demonstrate likelihood-free inference by analysing a model of the repressilator using both methods. A detailed investigation is performed to quantify the difference between these methods in terms of efficiency and computational cost.
Keywords: approximate Bayesian computation (ABC), continuous-time Markov chains, sequential Monte Carlo, particle Markov chain Monte Carlo (PMCMC)
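The repressilator itself needs several species and reactions; a minimal sketch of the likelihood-free idea, ABC rejection around a Gillespie-simulated birth-death CTMC with a single unknown rate, is shown below. The tolerance, summary statistic, and prior are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)

def gillespie_birth_death(birth, death, x0=10, t_end=5.0):
    """Simulate a birth-death CTMC with Gillespie's algorithm; return x(t_end)."""
    t, x = 0.0, x0
    while True:
        rates = np.array([birth, death * x])
        total = rates.sum()
        if total == 0:
            return x
        t += rng.exponential(1 / total)
        if t > t_end:
            return x
        x += 1 if rng.random() < rates[0] / total else -1

# "Observed" data from the true parameter, then ABC rejection on the birth rate
true_birth, death = 4.0, 0.2
obs = np.array([gillespie_birth_death(true_birth, death) for _ in range(30)])

accepted = []
while len(accepted) < 100:
    theta = rng.uniform(0, 10)                 # prior on the birth rate
    sim = np.array([gillespie_birth_death(theta, death) for _ in range(30)])
    if abs(sim.mean() - obs.mean()) < 1.5:     # distance on a summary statistic
        accepted.append(theta)
print(np.mean(accepted))                        # posterior mean, near 4
```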
Procedia PDF Downloads 205

4604 Semiparametric Regression of Truncated Spline Biresponse on Farmer Loyalty and Attachment Modeling
Authors: Adji Achmad Rinaldo Fernandes
Abstract:
Regression analysis is a statistical method that can describe and predict causal relationships between variables. Not every relationship has a known curve shape, and a single cause can affect more than one response, so the responses themselves may also be closely related. Such relationships can be modeled with biresponse semiparametric regression using truncated splines. The purpose of this study is to examine the function estimator and determine the best biresponse semiparametric truncated spline regression model. The results on secondary data showed that the best model is of quadratic order with a maximum of two knots, with a goodness of fit (adjusted R²) of 88.5%.
Keywords: biresponse, farmer attachment, farmer loyalty, truncated spline
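A hedged single-response sketch of the ingredient the abstract relies on, a quadratic truncated power spline basis with two knots fitted by least squares (the paper's biresponse estimation is not reproduced):

```python
import numpy as np

def truncated_spline_design(x, knots, degree=2):
    """Design matrix for a truncated power spline: 1, x, ..., x^p, (x-k)_+^p."""
    cols = [x**d for d in range(degree + 1)]
    cols += [np.clip(x - k, 0, None) ** degree for k in knots]
    return np.column_stack(cols)

rng = np.random.default_rng(4)
x = np.sort(rng.uniform(0, 10, 150))
y = np.sin(x) + 0.1 * x**2 + rng.normal(0, 0.3, 150)

X = truncated_spline_design(x, knots=[3.3, 6.6], degree=2)  # quadratic, two knots
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ beta
r2 = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
```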
Procedia PDF Downloads 41

4603 A Framework for Consumer Selection on Travel Destinations
Authors: J. Rhodes, V. Cheng, P. Lok
Abstract:
The aim of this study is to develop a parsimonious model that explains the effect of different stimuli on a tourist's intention to visit a new destination. The model uses destination trust and interest as the mediating variables. The model was tested using two different types of stimulus, and both studies empirically supported it. Furthermore, the first study revealed that advertising has a stronger effect than positive online reviews. The second study found that, in this context, the peripheral route of the elaboration likelihood model has a stronger influence than the central route.
Keywords: advertising, electronic word-of-mouth, elaboration likelihood model, intention to visit, trust
Procedia PDF Downloads 459

4602 Flood Hazard Assessment and Land Cover Dynamics of the Orai Khola Watershed, Bardiya, Nepal
Authors: Loonibha Manandhar, Rajendra Bhandari, Kumud Raj Kafle
Abstract:
Nepal’s Terai region is a part of the Ganges river basin, one of the most disaster-prone areas of the world, with recurrent monsoon flooding causing millions in damage and the death and displacement of hundreds of people and households every year. The vulnerability of human settlements to natural disasters such as floods is increasing, and mapping changes in land use practices and hydro-geological parameters is essential in developing resilient communities and strong disaster management policies. The objective of this study was to develop a flood hazard zonation map of the Orai Khola watershed and map the decadal land use/land cover dynamics of the watershed. The watershed area was delineated using the SRTM DEM, and LANDSAT images were classified into five land use classes (forest, grassland, sediment and bare land, settlement area and cropland, and water body) using pixel-based semi-automated supervised maximum likelihood classification. Decadal changes in each class were then quantified using spatial modelling. Flood hazard mapping was performed by assigning weights to the factors slope, rainfall distribution, distance from the river, and land use/land cover on the basis of their estimated influence on flood hazard, and performing weighted overlay analysis to identify highly vulnerable areas. The forest and grassland coverage increased by 11.53 km² (3.8%) and 1.43 km² (0.47%) from 1996 to 2016. The sediment and bare land areas decreased by 12.45 km² (4.12%) over the same period, whereas settlement and cropland areas showed a consistent increase of 14.22 km² (4.7%). Waterbody coverage also increased by 0.3 km² (0.09%) from 1996 to 2016. Of the total watershed area, 1.27% (3.65 km²) was categorized as a very low hazard zone, 20.94% (60.31 km²) as a low hazard zone, 37.59% (108.3 km²) as a moderate hazard zone, 29.25% (84.27 km²) as a high hazard zone, and 10.95% (31.55 km²), comprising 31 villages, as a very high hazard zone.
Keywords: flood hazard, land use/land cover, Orai river, supervised maximum likelihood classification, weighted overlay analysis
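A minimal sketch of the weighted overlay step on synthetic factor rasters; the weights and class breaks below are illustrative, not the ones estimated in the study:

```python
import numpy as np

# Hypothetical factor rasters rescaled to a common 1-5 hazard score
rng = np.random.default_rng(5)
slope   = rng.integers(1, 6, (100, 100))   # flatter terrain -> higher flood score
rain    = rng.integers(1, 6, (100, 100))
dist    = rng.integers(1, 6, (100, 100))   # closer to the river -> higher score
landuse = rng.integers(1, 6, (100, 100))

# Weighted overlay: weights reflect each factor's assumed influence (sum to 1)
weights = {"slope": 0.25, "rain": 0.30, "dist": 0.30, "landuse": 0.15}
hazard = (weights["slope"] * slope + weights["rain"] * rain
          + weights["dist"] * dist + weights["landuse"] * landuse)

# Classify the continuous score into five hazard zones
zones = np.digitize(hazard, bins=[1.8, 2.6, 3.4, 4.2])  # 0=very low .. 4=very high
print(np.bincount(zones.ravel(), minlength=5) / zones.size)
```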
Procedia PDF Downloads 354

4601 A Semi-Markov Chain-Based Model for the Prediction of Deterioration of Concrete Bridges in Quebec
Authors: Eslam Mohammed Abdelkader, Mohamed Marzouk, Tarek Zayed
Abstract:
Infrastructure systems are crucial to every aspect of life on Earth. Existing infrastructure is subject to degradation, while the demands for a better infrastructure system grow in response to high standards of safety, health, population growth, and environmental protection. Bridges play a crucial role in urban transportation networks. Moreover, they are subject to a high level of deterioration because of variable traffic loading, extreme weather conditions, cycles of freeze and thaw, etc. The development of Bridge Management Systems (BMSs) has become a fundamental imperative nowadays, especially in large transportation networks, due to the huge variance between the need for maintenance actions and the available funds to perform them. Deterioration models are a very important aspect of the effective use of BMSs. This paper presents a probabilistic time-based model that is capable of predicting the condition ratings of concrete bridge decks along their service life. The deterioration process of the concrete bridge decks is modeled as a semi-Markov process. One of the main challenges of the Markov Chain Decision Process (MCDP) is the construction of the transition probability matrix; the proposed model overcomes this issue by modeling the sojourn times with probability density functions. The sojourn times of each condition state are fitted to probability density functions based on goodness-of-fit tests such as the Kolmogorov-Smirnov, Anderson-Darling, and chi-squared tests. The parameters of the probability density functions are obtained using maximum likelihood estimation (MLE). The condition ratings obtained from the Ministry of Transportation in Quebec (MTQ) are utilized as a database to construct the deterioration model. Finally, a comparison is conducted between the Markov chain and the semi-Markov chain to select the most feasible prediction model.
Keywords: bridge management system, bridge decks, deterioration model, semi-Markov chain, sojourn times, maximum likelihood estimation
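The sojourn-time fitting step can be sketched with scipy: fit a candidate density by maximum likelihood and screen it with a goodness-of-fit test. The Weibull choice and the data below are illustrative (and note the KS p-value is optimistic when the parameters are estimated from the same data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
# Hypothetical sojourn times (years a deck spends in one condition state)
sojourn = rng.weibull(1.8, 200) * 7.0

# Maximum likelihood fit of a candidate density (location fixed at zero)
shape, loc, scale = stats.weibull_min.fit(sojourn, floc=0)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution
ks = stats.kstest(sojourn, "weibull_min", args=(shape, loc, scale))
print(f"shape={shape:.2f}, scale={scale:.2f}, KS p-value={ks.pvalue:.3f}")
```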
Procedia PDF Downloads 215

4600 Segmentation of Piecewise Polynomial Regression Model by Using Reversible Jump MCMC Algorithm
Authors: Suparman
Abstract:
The piecewise polynomial regression model is a very flexible model for modeling data. When a piecewise polynomial regression model is fitted to data, its parameters are generally unknown. This paper studies the parameter estimation problem for piecewise polynomial regression models using the Bayesian method. Unfortunately, the Bayes estimator cannot be found analytically, and the reversible jump MCMC algorithm is proposed to solve this problem. The reversible jump MCMC algorithm generates a Markov chain that converges to the posterior distribution of the piecewise polynomial regression model parameters. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of the piecewise polynomial regression model.
Keywords: piecewise regression, Bayesian, reversible jump MCMC, segmentation
Procedia PDF Downloads 373

4599 Evidence on the Impact of Corporate Governance on Bank Performance from Deposit Money Banks in Sub-Saharan Africa
Authors: Ayotunde Qudus Saka, Xin Zhang
Abstract:
Purpose: The purpose of this study is to investigate how corporate governance traits affect the financial performance of banks in the sub-Saharan African region from 2008 to 2022. Methodology/Design/Approach: The performance of selected banks in Sub-Saharan Africa is examined in relation to corporate governance using static panel regression analysis. The following variables represent corporate governance in the study: board size (BDS), board gender diversity (BGD), board independence (BDI), number of audit committee meetings (NAM), and number of foreign members on the board (SFM). Return on assets (ROA) was employed as the dependent variable. Fixed effect (FE), random effect (RE), and common effect (CE) estimators were used with the static panel data, and the estimation procedure is based on a 'Log-Lin' specification. The estimation includes eleven (11) models: ten relate to the individual countries and one captures all the SSA countries used in this study. Findings: Overall, the RE estimator seems to be more efficient than the FE estimator. Therefore, the random effect model is adopted for the overall-country investigation of the connection between financial performance and corporate governance, while for most individual countries the fixed effect estimator is selected, except for Malawi and Zambia, where the common effect model worked well, suggesting that the banks in those countries have similar organisational cultures and management philosophies. Consequently, the selected estimator for each country was used to evaluate the connection between financial performance and corporate governance. Originality/Value: Corporate governance and bank performance topics are well grounded in the literature with evidence from developed countries; however, there is a dearth of evidence from developing countries, particularly the sub-Saharan African region. This study presents multi-country empirical evidence within the SSA, which gives the study a larger sample; it makes use of balanced data from 2008 to 2022, the latest data coverage from SSA; and no prior research has examined the impact of corporate governance mechanisms on bank performance in the SSA region using multi-country samples.
Keywords: bank performance, corporate governance, sub-Saharan Africa (SSA), gender diversity, foreign members of the board, multi-country
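As a hedged sketch of the FE/RE machinery on a synthetic balanced panel (the variable names follow the abstract, but the data, the dummy-variable FE form, and the mixed-model RE approximation are illustrative):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
# Hypothetical balanced panel: 40 banks x 15 years (2008-2022)
n_banks, n_years = 40, 15
n = n_banks * n_years
df = pd.DataFrame({
    "bank": np.repeat(np.arange(n_banks), n_years),
    "bds":  rng.integers(6, 15, n).astype(float),   # board size
    "bgd":  rng.uniform(0, 0.5, n),                 # board gender diversity
})
bank_eff = np.repeat(rng.normal(0, 0.3, n_banks), n_years)
df["log_roa"] = (0.5 + 0.02 * df["bds"] + 0.8 * df["bgd"]
                 + bank_eff + rng.normal(0, 0.2, n))

# Fixed effects via bank dummies ('Log-Lin' form: log outcome, level regressors)
fe = smf.ols("log_roa ~ bds + bgd + C(bank)", data=df).fit()
# Random effects approximated by a random intercept per bank
re = smf.mixedlm("log_roa ~ bds + bgd", data=df, groups=df["bank"]).fit()
print(fe.params[["bds", "bgd"]])
print(re.params[["bds", "bgd"]])
```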
Procedia PDF Downloads 2

4598 Reduced Complexity of ML Detection Combined with DFE
Authors: Jae-Hyun Ro, Yong-Jun Kim, Chang-Bin Ha, Hyoung-Kyu Song
Abstract:
In multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) systems, many detection schemes have been developed to improve the error performance and to reduce the complexity. Maximum likelihood (ML) detection has optimal error performance, but it has very high complexity. Thus, this paper proposes a reduced-complexity ML detection combined with a decision feedback equalizer (DFE). The error performance of the proposed detection scheme is better than that of the conventional DFE, while its complexity is lower than that of conventional ML detection.
Keywords: detection, DFE, MIMO-OFDM, ML
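The complexity the proposed scheme attacks comes from exhaustive ML search, whose cost grows as |constellation|^(transmit antennas). A minimal brute-force ML detector for a 2x2 QPSK system (the DFE stage that prunes this search is omitted):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(8)
qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

nt = 2                                       # 2x2 MIMO, QPSK
H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
s = rng.choice(qpsk, nt)                     # transmitted symbols
y = H @ s + 0.1 * (rng.normal(size=2) + 1j * rng.normal(size=2))

# Exhaustive ML detection: minimize ||y - H s||^2 over all candidate vectors
best, best_metric = None, np.inf
for cand in product(qpsk, repeat=nt):        # |constellation|^nt candidates
    cand = np.array(cand)
    metric = np.linalg.norm(y - H @ cand) ** 2
    if metric < best_metric:
        best, best_metric = cand, metric
print(np.allclose(best, s))                  # True at this SNR (usually)
```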
Procedia PDF Downloads 610

4597 Probabilistic Modeling of Post-Liquefaction Ground Deformation
Authors: Javad Sadoghi Yazdi, Robb Eric S. Moss
Abstract:
This paper utilizes a probabilistic liquefaction triggering method for modeling post-liquefaction ground deformation. This cone penetration test (CPT)-based liquefaction triggering is employed to estimate the factor of safety against liquefaction (FSL) and compute the maximum cyclic shear strain (γmax). The study identifies a maximum PL value of 90% across various relative densities, which challenges the accepted decrease from 90% to 70% as relative density decreases. It reveals that PL ranges from 5% to 50% for volumetric strain (εvol) less than 1%, while for εvol values between 1% and 3.2%, PL spans from 50% to 90%. CPT-based simplified liquefaction triggering procedures have been employed in previous research to estimate liquefaction ground-failure indices, such as the Liquefaction Potential Index (LPI) and Liquefaction Severity Number (LSN). However, several studies have highlighted the variability in liquefaction probability calculations, suggesting that a more accurate depiction of liquefaction likelihood is needed; consequently, these simplified methods may not offer practical efficiency. This paper further investigates the efficacy of various established liquefaction vulnerability parameters, including LPI and LSN, in explaining the observed liquefaction-induced damage within residential zones of Christchurch, New Zealand, using results from a CPT database.
Keywords: cone penetration test (CPT), liquefaction, post-liquefaction, ground failure
Procedia PDF Downloads 72

4596 A Learning-Based EM Mixture Regression Algorithm
Authors: Yi-Cheng Tian, Miin-Shen Yang
Abstract:
The mixture likelihood approach to clustering is a popular clustering method, and the expectation-maximization (EM) algorithm is the most used mixture likelihood method. In the literature, the EM algorithm has been used for mixture regression models. However, these EM mixture regression algorithms are sensitive to initial values and require an a priori number of clusters. In this paper, to resolve these drawbacks, we construct a learning-based scheme for the EM mixture regression algorithm such that it is free of initializations and can automatically obtain an approximately optimal number of clusters. Numerical examples and comparisons demonstrate the superiority and usefulness of the proposed learning-based EM mixture regression algorithm.
Keywords: clustering, EM algorithm, Gaussian mixture model, mixture regression model
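The classical EM mixture-regression iteration that the proposed learning-based scheme builds on can be sketched as follows (fixed K and random initialization here, which is exactly the sensitivity the paper aims to remove):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(9)
# Two regression lines mixed together
n = 400
x = rng.uniform(0, 10, n)
z = rng.random(n) < 0.5
y = np.where(z, 1 + 2 * x, 8 - 1 * x) + rng.normal(0, 0.7, n)
X = np.column_stack([np.ones(n), x])

# EM for a two-component Gaussian mixture of regressions
K = 2
beta = rng.normal(size=(K, 2))
sigma = np.full(K, y.std())        # wide start avoids underflow in the E-step
pi = np.full(K, 1 / K)
for _ in range(100):
    # E-step: responsibilities of each component for each point
    dens = np.stack([pi[k] * norm.pdf(y, X @ beta[k], sigma[k]) for k in range(K)])
    dens = np.clip(dens, 1e-300, None)
    r = dens / dens.sum(axis=0)
    # M-step: weighted least squares per component
    for k in range(K):
        W = r[k]
        XtW = X.T * W
        beta[k] = np.linalg.solve(XtW @ X, XtW @ y)
        resid = y - X @ beta[k]
        sigma[k] = max(np.sqrt((W * resid**2).sum() / W.sum()), 1e-3)
        pi[k] = W.mean()
# Rows should approach (1, 2) and (8, -1) in some order; a poor random start
# can stall, which is the sensitivity the paper addresses
print(beta)
```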
Procedia PDF Downloads 510

4595 On Parameter Estimation of Simultaneous Linear Functional Relationship Model for Circular Variables
Authors: N. A. Mokhtar, A. G. Hussin, Y. Z. Zubairi
Abstract:
This paper proposes a new simultaneous simple linear functional relationship model assuming equal error variances. We derive the maximum likelihood estimates of the parameters in the simultaneous model and their covariance. A simulation study shows small bias values of the parameter estimates, suggesting the suitability of the estimation method. As an illustration, the proposed simultaneous model is applied to real data on wind direction and wave direction measured by two different instruments.
Keywords: simultaneous linear functional relationship model, Fisher information matrix, parameter estimation, circular variables
Procedia PDF Downloads 367

4594 A Nonlinear Dynamical System with Application
Authors: Abdullah Eqal Al Mazrooei
Abstract:
In this paper, a nonlinear dynamical system of the bilinear class is presented. Bilinear systems are a very important kind of nonlinear system because they have many applications in real life. They are used in biology, chemistry, manufacturing, engineering, and economics, where linear models are ineffective or inadequate, and they have also recently been used to analyze and forecast weather conditions. Bilinear systems have three advantages: first, they describe many problems of great applied importance; second, they provide approximations to nonlinear systems; and third, they have rich geometric and algebraic structures, which promise to be a fruitful field of research for scientists and applications. The type of nonlinearity treated and analyzed here consists of a bilinear interaction between the state vector and the system input. By using some properties of the tensor product, such systems can be transformed to linear systems. Here, however, we discuss the nonlinearity that arises when the state vector is multiplied by itself, so the model is able to handle evolutions according to the Lotka-Volterra models or the Lorenz weather models, enabling a wider and more flexible application of such models. As an application, an estimator is used to estimate temperatures. The results prove the efficiency of the proposed system.
Keywords: Lorenz models, nonlinear systems, nonlinear estimator, state-space model
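A toy simulation of the state-times-state nonlinearity the abstract singles out, in Lotka-Volterra form (Euler integration; parameters illustrative):

```python
import numpy as np

# Euler simulation of a system with state-times-state nonlinearity
# (Lotka-Volterra form, which the abstract cites as a target application)
a, b, c, d = 1.0, 0.1, 1.5, 0.075
x = np.array([10.0, 5.0])                # prey, predator
dt = 0.001
for _ in range(int(20 / dt)):
    dx = np.array([a * x[0] - b * x[0] * x[1],      # quadratic x1*x2 terms
                   -c * x[1] + d * x[0] * x[1]])
    x = x + dt * dx
print(x)   # the orbit cycles around the equilibrium (c/d, a/b) = (20, 10)
```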
Procedia PDF Downloads 254

4593 A Novel Software Model for Enhancement of System Performance and Security through an Optimal Placement of PMU and FACTS
Authors: R. Kiran, B. R. Lakshmikantha, R. V. Parimala
Abstract:
Secure operation of power systems requires monitoring of the system operating conditions. Phasor measurement units (PMUs) are devices that use synchronized signals from GPS satellites and provide the phasors of voltages and currents at a given substation. The optimal locations for the PMUs must be determined in order to avoid redundant use of PMUs. The objective of this paper is to make the system observable using a minimum number of PMUs, and to implement stability software in a 220 kV grid for on-line estimation of the power system transfer capability, based on voltage and thermal limitations, and for security monitoring. This software utilizes State Estimator (SE) and synchrophasor PMU data sets to determine the power system operational margin under normal and contingency conditions. It improves the security of the transmission system by continuously monitoring the operational margin, expressed in MW or in bus voltage angles, and alarms the operator if the margin violates a pre-defined threshold.
Keywords: state estimator (SE), flexible AC transmission systems (FACTS), optimal location, phasor measurement units (PMU)
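The placement side of the problem is often posed as a covering problem: a PMU at a bus observes that bus and its neighbors. A greedy heuristic sketch on a toy network (the paper's exact placement method is not specified in the abstract):

```python
# Bus adjacency of a small test network (a PMU observes its bus + neighbors)
edges = [(0, 1), (1, 2), (1, 3), (3, 4), (4, 5), (2, 5), (5, 6)]
n_bus = 7
cover = [{i} for i in range(n_bus)]
for a, b in edges:
    cover[a].add(b)
    cover[b].add(a)

# Greedy set cover: repeatedly place a PMU where it observes the most
# not-yet-observed buses (a heuristic; the optimum needs integer programming)
observed, pmus = set(), []
while len(observed) < n_bus:
    best = max(range(n_bus), key=lambda i: len(cover[i] - observed))
    pmus.append(best)
    observed |= cover[best]
print(pmus)   # e.g. [1, 5] makes this 7-bus system fully observable
```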
Procedia PDF Downloads 411

4592 High Altitude Glacier Surface Mapping in Dhauliganga Basin of Himalayan Environment Using Remote Sensing Technique
Authors: Aayushi Pandey, Manoj Kumar Pandey, Ashutosh Tiwari, Kireet Kumar
Abstract:
Glaciers play an important role in climate change and are sensitive indicators of the global climate change scenario. Glaciers in the Himalayas are unique, as they are predominantly valley-type glaciers located in tropical, high-altitude regions. These glaciers are often covered with debris, which greatly affects the ablation rate and works as a sensitive indicator of glacier health. The aim of this study is to map a high-altitude glacier surface, with a focus on glacial lake and debris estimation, using different techniques on the Nagling glacier of the Dhauliganga basin in the Himalayan region. Different image classification techniques, i.e., thresholding on different band ratios and supervised classification using the maximum likelihood classifier (MLC), were applied to high-resolution Sentinel-2A Level-1C satellite imagery of 14 October 2017. The near-infrared (NIR)/shortwave-infrared (SWIR) ratio image was used to extract the glaciated classes (snow, ice, and ice-mixed debris (IMD)) from the non-glaciated terrain classes. The SWIR/BLUE ratio image was used to map valley rock and debris, while the GREEN/NIR ratio image was found most suitable for mapping glacial lakes. Accuracy assessment was performed using high-resolution (3 m) PlanetScope imagery at 60 stratified random points. The overall accuracy of the MLC was 85%, while that of the band ratios was 96.66%. According to the band ratio technique, the total areal extent of the glaciated classes (snow, ice, IMD) in the Nagling glacier was 10.70 km², nearly 38.07% of the study area, comprising 30.87% snow-covered area, 3.93% ice, and 3.27% IMD-covered area. Non-glaciated classes (vegetation, glacial lake, debris, and valley rock) covered 61.93% of the total area, of which valley rock is dominant with 33.83% coverage, followed by debris covering 27.7% of the area. Glacial lake and debris were accurately mapped using the band ratio technique. Hence, the band ratio approach appears to be useful for mapping debris-covered glaciers in the Himalayan region.
Keywords: band ratio, Dhauliganga basin, glacier mapping, Himalayan region, maximum likelihood classifier (MLC), Sentinel-2 satellite image
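A hedged sketch of the band-ratio thresholding on synthetic reflectance arrays; the ratio thresholds below are placeholders, since in practice they are tuned per scene:

```python
import numpy as np

rng = np.random.default_rng(10)
# Hypothetical reflectance bands of a Sentinel-2 style scene (values 0-1)
nir, swir = rng.uniform(0.05, 0.9, (2, 200, 200))
green, blue = rng.uniform(0.05, 0.9, (2, 200, 200))

eps = 1e-6
glacier = (nir / (swir + eps)) > 2.0       # NIR/SWIR: snow and ice
rock    = (swir / (blue + eps)) > 1.5      # SWIR/BLUE: valley rock and debris
lake    = (green / (nir + eps)) > 1.8      # GREEN/NIR: glacial lakes

# Report the fraction of the scene falling in each (illustrative) class
for name, mask in [("glaciated", glacier), ("rock/debris", rock), ("lake", lake)]:
    print(name, f"{mask.mean():.1%}")
```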
Procedia PDF Downloads 230

4591 Weighted Rank Regression with Adaptive Penalty Function
Authors: Kang-Mo Jung
Abstract:
The use of regularization in statistical methods has become popular, and the least absolute shrinkage and selection operator (LASSO) framework has become the standard tool for sparse regression. However, it is well known that the LASSO is sensitive to outliers and leverage points. We consider a new robust estimator composed of a weighted loss function of the pairwise differences of residuals and an adaptive penalty function regulating the tuning parameter for each variable. Rank regression is resistant to regression outliers, but not to leverage points. By adopting a weighted loss function, the proposed method is robust to leverage points in the predictor variables. Furthermore, the adaptive penalty function yields good statistical properties in variable selection, such as the oracle property and consistency. We develop an efficient algorithm to compute the proposed estimator using basic functions in R, with an optimal tuning parameter chosen by the Bayesian information criterion (BIC). Numerical simulation shows that the proposed estimator is effective for analyzing real and contaminated data sets.
Keywords: adaptive penalty function, robust penalized regression, variable selection, weighted rank regression
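A sketch of the estimator's two ingredients, a leverage-weighted pairwise-difference (rank-type) loss plus an L1 penalty, minimized numerically. A single fixed λ stands in for the paper's adaptive per-variable penalty, and the leverage weights are illustrative:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)
n, p = 80, 3
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, 0.0, -1.0])
y = X @ beta_true + rng.standard_t(3, n)     # heavy-tailed errors
X[0] *= 8          # bad leverage point: X inflated after y was generated

w = 1.0 / (1.0 + np.sum(X**2, axis=1) / p)   # downweight high-leverage rows
lam = 0.5

def objective(beta):
    r = y - X @ beta
    # weighted pairwise-difference (rank-type) loss + L1 penalty
    i, j = np.triu_indices(n, k=1)
    loss = np.sum(w[i] * w[j] * np.abs(r[i] - r[j])) / n
    return loss + lam * np.sum(np.abs(beta))

fit = minimize(objective, np.zeros(p), method="Nelder-Mead")
print(fit.x)   # close to (2, 0, -1), with the zero coefficient shrunk
```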
Procedia PDF Downloads 477

4590 A Stochastic Diffusion Process Based on the Two-Parameter Weibull Density Function
Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos
Abstract:
Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present; its purpose is to estimate the probability of outcomes within a forecast, i.e., to be able to predict what conditions or decisions might occur under different situations. In the present study, we present a model of a stochastic diffusion process based on the two-parameter (bi-Weibull) distribution function: its trend is proportional to the bi-Weibull probability density function. In general, the Weibull distribution has the ability to assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who have considered it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, namely the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process, as shown in Ricciardi's theorem. Then, we develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyse, with simulated data, the computational problems associated with the parameters, an issue of great importance in the application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects a pragmatic decision on the part of the modeler: given the available data and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
Keywords: diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion process, trend functions, two-parameter Weibull density function
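A toy Euler-Maruyama simulation of a diffusion whose drift is proportional to a Weibull density, in the spirit of the model (the exact drift/volatility specification of the paper is not reproduced):

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(12)
# Toy lognormal-type diffusion whose drift follows a Weibull density:
#   dX_t = X_t * h(t) dt + sigma * X_t dW_t,  h(t) = a * weibull_pdf(t)
a, sigma = 2.0, 0.15
shape, scale = 1.8, 3.0
dt, T, n_paths = 0.01, 10.0, 500

t = np.arange(0, T, dt)
X = np.ones((n_paths, len(t)))
for k in range(1, len(t)):
    h = a * weibull_min.pdf(t[k], shape, scale=scale)
    dW = rng.normal(0, np.sqrt(dt), n_paths)
    X[:, k] = X[:, k - 1] * (1 + h * dt + sigma * dW)   # Euler-Maruyama step
print(X[:, -1].mean())   # sample trend; compare with exp(a * Weibull CDF at T)
```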
Procedia PDF Downloads 309

4589 Median-Based Nonparametric Estimation of Returns in Mean-Downside Risk Portfolio Frontier
Authors: H. Ben Salah, A. Gannoun, C. de Peretti, A. Trabelsi
Abstract:
The Downside Risk (DSR) model for portfolio optimisation overcomes the drawbacks of the classical mean-variance model concerning the asymmetry of returns and the risk perception of investors. This optimization deals with a positive definite matrix that is endogenous with respect to the portfolio weights, which makes the problem far more difficult to handle. For this purpose, Athayde (2001) developed a new recursive minimization procedure that ensures convergence to the solution. However, when only a finite number of observations is available, the portfolio frontier is not very smooth. To overcome this, Athayde (2003) proposed a kernel mean estimation of the returns, so as to create a smoother portfolio frontier; this technique provides an effect similar to having continuous observations. In this paper, taking advantage of the robustness of the median, we replace the mean estimator in Athayde's model by a nonparametric median estimator of the returns, and we give a new version of the former algorithm (of Athayde (2001, 2003)). We then analyse the properties of this improved portfolio frontier and apply the new method to real examples.
Keywords: downside risk, kernel method, median, nonparametric estimation, semivariance
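The median-based smoothing step can be sketched as a kernel-weighted running median of returns, the robust counterpart of the kernel mean smoothing it replaces (bandwidth and data illustrative):

```python
import numpy as np

def weighted_median(values, weights):
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cw = np.cumsum(w) / w.sum()
    return v[np.searchsorted(cw, 0.5)]

def local_median_smoother(t, r, grid, h):
    """Kernel-weighted running median of returns (robust analogue of the
    kernel mean smoothing used to smooth the portfolio frontier)."""
    out = np.empty(len(grid))
    for i, t0 in enumerate(grid):
        w = np.exp(-0.5 * ((t - t0) / h) ** 2)   # Gaussian kernel weights
        out[i] = weighted_median(r, w)
    return out

rng = np.random.default_rng(13)
t = np.arange(250.0)
r = 0.0005 * np.sin(t / 40) + rng.standard_t(3, 250) * 0.01   # fat-tailed returns
grid = np.linspace(0, 249, 50)
print(local_median_smoother(t, r, grid, h=10.0)[:5])
```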
Procedia PDF Downloads 493

4588 Environmentally Adaptive Acoustic Echo Suppression for Barge-in Speech Recognition
Authors: Jong Han Joo, Jung Hoon Lee, Young Sun Kim, Jae Young Kang, Seung Ho Choi
Abstract:
In this study, we propose a novel technique for acoustic echo suppression (AES) during speech recognition under barge-in conditions. Conventional AES methods based on spectral subtraction apply fixed weights to the echo path transfer function (EPTF) estimated at the current signal segment and to the EPTF estimated up to the previous time interval. We propose a new approach that adaptively updates the weight parameters in response to abrupt changes in the acoustic environment due to background noise or double-talk. Furthermore, we devised a voice activity detector and an initial time-delay estimator for barge-in speech recognition in communication networks. The initial time delay is estimated using a log-spectral distance measure as well as cross-correlation coefficients. The experimental results show that the developed techniques can be successfully applied in barge-in speech recognition systems.
Keywords: acoustic echo suppression, barge-in, speech recognition, echo path transfer function, initial delay estimator, voice activity detector
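A much-simplified sketch of spectral subtraction with a blended EPTF estimate; the paper's adaptive weight update, VAD, and delay estimator are omitted, and the per-frame EPTF below is a crude magnitude ratio:

```python
import numpy as np

def aes_frame(mic_fft, far_fft, eptf_prev, alpha):
    """One spectral-subtraction AES step with a blended EPTF estimate.

    alpha mixes the EPTF estimated from the current frame with the estimate
    accumulated so far; the paper's idea is to adapt alpha, e.g. trusting the
    past more during double-talk and less after abrupt echo-path changes.
    """
    eps = 1e-12
    eptf_now = np.abs(mic_fft) / (np.abs(far_fft) + eps)      # crude per-frame EPTF
    eptf = alpha * eptf_prev + (1 - alpha) * eptf_now         # blended update
    echo_mag = eptf * np.abs(far_fft)
    clean_mag = np.maximum(np.abs(mic_fft) - echo_mag, 0.0)   # spectral subtraction
    clean = clean_mag * np.exp(1j * np.angle(mic_fft))        # keep mic phase
    return clean, eptf

rng = np.random.default_rng(14)
far = rng.normal(size=512)
mic = 0.3 * np.roll(far, 8) + 0.05 * rng.normal(size=512)     # echo + noise
F_mic, F_far = np.fft.rfft(mic), np.fft.rfft(far)
clean, eptf = aes_frame(F_mic, F_far, eptf_prev=np.zeros(len(F_mic)), alpha=0.0)
print(np.abs(np.fft.irfft(clean)).mean())
```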
Procedia PDF Downloads 373

4587 The Normal-Generalized Hyperbolic Secant Distribution: Properties and Applications
Authors: Hazem M. Al-Mofleh
Abstract:
In this paper, a new four-parameter univariate continuous distribution called the Normal-Generalized Hyperbolic Secant (NGHS) distribution is defined and studied. Some general and structural distributional properties are investigated and discussed, including central and non-central n-th moments and incomplete moments, quantile and generating functions, the hazard function, Rényi and Shannon entropies, shapes (skewed right, skewed left, and symmetric), modality regions (unimodal and bimodal), and maximum likelihood estimators (MLEs) of the parameters. Finally, two real data sets are used to demonstrate empirically its flexibility and prove the strength of the new distribution.
Keywords: bimodality, estimation, hazard function, moments, Shannon's entropy
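The NGHS density is not available in scipy; as a stand-in that shows the same MLE workflow, one can fit its hyperbolic-secant parent, either through the built-in fitter or by minimizing the negative log-likelihood directly (the route one would take for a custom density such as the NGHS):

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(15)
data = stats.hypsecant.rvs(loc=2.0, scale=1.5, size=500, random_state=rng)

# Built-in MLE of location and scale
loc, scale = stats.hypsecant.fit(data)
print(f"loc={loc:.2f}, scale={scale:.2f}")

# Generic route for custom densities: minimize the negative log-likelihood
# (scale parametrized as exp(p[1]) to keep it positive)
nll = lambda p: -np.sum(stats.hypsecant.logpdf(data, p[0], np.exp(p[1])))
res = minimize(nll, x0=[0.0, 0.0])
print(res.x[0], np.exp(res.x[1]))   # matches the built-in fit above
```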
Procedia PDF Downloads 351

4586 Power Efficiency Characteristics of Magnetohydrodynamic Thermodynamic Gas Cycle
Authors: Mahmoud Huleihil
Abstract:
In this study, the performance of a thermodynamic gas cycle for magnetohydrodynamic (MHD) power generation is considered and presented in terms of power-efficiency curves. The dissipation mechanisms considered include fluid friction, modeled by means of the isentropic efficiency of the compressor; heat leakage directly from the hot reservoir to the cold reservoir; and constant velocity of the MHD generator. The study demonstrates that power and efficiency vanish at the extremes of both slow and fast operating conditions. These points are demonstrated on power-efficiency curves, along with the locus of efficiency at maximum power and the locus of maximum efficiency. Qualitatively, the considered loss mechanisms have a similar effect on efficiency at maximum power operation and on maximum efficiency operation: both efficiencies are reduced, even for small values of the loss mechanisms.
Keywords: magnetohydrodynamic generator, electrical efficiency, maximum power, maximum efficiency, heat engine
Procedia PDF Downloads 247

4585 Credit Risk Prediction Based on Bayesian Estimation of Logistic Regression Model with Random Effects
Authors: Sami Mestiri, Abdeljelil Farhat
Abstract:
The aim of this paper is to predict the credit risk of banks in Tunisia over the period 2000-2005. For this purpose, two methods for the estimation of the logistic regression model with random effects are applied: the penalized quasi-likelihood (PQL) method and the Gibbs sampler algorithm. Using information on a sample of 528 Tunisian firms and 26 financial ratios, we show that the Bayesian approach improves the quality of model predictions in terms of correct classification as well as the ROC curve.
Keywords: forecasting, credit risk, penalized quasi-likelihood, Gibbs sampler, logistic regression with random effects, ROC curve
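PQL and the exact Gibbs sampler are more involved than a snippet allows; a compact (and slowly mixing) random-walk Metropolis sketch for a random-intercept logit conveys the model structure on synthetic firm data:

```python
import numpy as np

rng = np.random.default_rng(16)
# Synthetic firm-level data: one financial ratio, firm random intercepts
n_firms, n_obs = 50, 6
g = np.repeat(np.arange(n_firms), n_obs)
x = rng.normal(size=n_firms * n_obs)
u_true = rng.normal(0, 0.8, n_firms)
p = 1 / (1 + np.exp(-(-0.5 + 1.2 * x + u_true[g])))
y = rng.random(len(x)) < p

def loglik(beta0, beta1, u):
    eta = beta0 + beta1 * x + u[g]
    return np.sum(y * eta - np.log1p(np.exp(eta)))

# Random-walk Metropolis over (beta0, beta1, u) with normal priors
beta0, beta1, u, tau = 0.0, 0.0, np.zeros(n_firms), 0.8
draws = []
for it in range(4000):
    prop = (beta0 + rng.normal(0, 0.1), beta1 + rng.normal(0, 0.1),
            u + rng.normal(0, 0.1, n_firms))
    lp_new = loglik(*prop) - 0.5 * (prop[0]**2 + prop[1]**2) / 10 \
             - 0.5 * np.sum(prop[2]**2) / tau**2
    lp_old = loglik(beta0, beta1, u) - 0.5 * (beta0**2 + beta1**2) / 10 \
             - 0.5 * np.sum(u**2) / tau**2
    if lp_new - lp_old > np.log(rng.random()):
        beta0, beta1, u = prop
    draws.append((beta0, beta1))
# Should drift toward (-0.5, 1.2); a real Gibbs/PQL fit would mix far better
print(np.mean(draws[2000:], axis=0))
```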
Procedia PDF Downloads 542

4584 Home Range and Spatial Interaction Modelling of Black Bears
Authors: Fekadu L. Bayisa, Elvan Ceyhan, Todd D. Steury
Abstract:
Interaction between individuals within the same species is an important component of population dynamics. An interaction can be either static (based on spatial overlap) or dynamic (based on movement interactions). Using GPS collar data, we can quantify both static and dynamic interactions between black bears. The goal of this work is to determine the level of black bear interaction using the 95% and 50% home ranges, and to model black bear spatial interactions, which could be attraction, avoidance/repulsion, or no interaction at all, to gain new insights and improve our understanding of ecological processes. Recent methodological developments in home range estimation, inhomogeneous multitype/cross-type summary statistics, and envelope testing methods are explored to study the nature of black bear interactions. Our findings, in general, indicate that black bears of one type in our data set tend to cluster around another type.
Keywords: autocorrelated kernel density estimator, cross-type summary function, inhomogeneous multitype Poisson process, kernel density estimator, minimum convex polygon, pointwise and global envelope tests
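A hedged sketch of the static side, plain KDE home ranges from GPS fixes with 95%/50% isopleths (the paper's autocorrelated KDE and envelope tests are beyond this snippet):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(17)
# Hypothetical GPS fixes for one bear (projected coordinates, km)
pts = rng.multivariate_normal([0, 0], [[4, 1], [1, 2]], size=300).T  # (2, n)

kde = gaussian_kde(pts)
dens_at_fixes = kde(pts)

# 95% and 50% home ranges: the density isopleths enclosing that share of fixes
iso95 = np.percentile(dens_at_fixes, 5)     # 95% of fixes lie above this level
iso50 = np.percentile(dens_at_fixes, 50)

# Static interaction between two bears = overlap of their home-range masks,
# evaluated on a grid (the dynamic, movement-based part is omitted here)
xx, yy = np.mgrid[-8:8:100j, -8:8:100j]
grid = np.vstack([xx.ravel(), yy.ravel()])
in_95 = kde(grid) >= iso95
print(f"95% home range ~ {in_95.mean() * 16 * 16:.1f} km^2")
```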
Procedia PDF Downloads 82

4583 Phylogenetic Relationships of Common Reef Fish Species in Vietnam
Authors: Dang Thuy Binh, Truong Thi Oanh, Le Phan Khanh Hung, Luong thi Tuong Vy
Abstract:
One of the greatest environmental challenges facing Asia is the management and conservation of marine biodiversity threatened by fisheries overexploitation, pollution, habitat destruction, and climate change. To date, few molecular taxonomic studies have been conducted on the marine fauna of Vietnam. The purpose of this study was to clarify the phylogeny of economically and ecologically important reef fish species in Vietnam. Reef fish species covering the Labridae, Scaridae, Nemipteridae, Serranidae, Acanthuridae, Lutjanidae, Lethrinidae, Mullidae, Balistidae, Pseudochromidae, Pinguipedidae, Fistulariidae, Holocentridae, Synodontidae, and Pomacentridae, representing 28 genera, were collected from southern and central Vietnam. Combined with GenBank sequences, a phylogenetic tree was constructed based on the 16S gene of mitochondrial DNA using maximum parsimony, maximum likelihood, and Bayesian inference approaches. The phylogram showed well-resolved clades at the genus and family levels. Perciformes is the major order of reef fish species in Vietnam, but its monophyly is not strongly supported, as it clustered in the same clade with Tetraodontiformes, Syngnathiformes, and Beryciformes. Continued sampling of commercial fish species, together with classification based on morphology and genetics to build DNA barcoding of fish species in Vietnam, is necessary.
Keywords: reef fish, 16S rDNA, Vietnam, phylogeny
Procedia PDF Downloads 439

4582 Optimization Analysis of Controlled Cooling Process for H-Shape Steel Beams
Authors: Jiin-Yuh Jang, Yu-Feng Gan
Abstract:
In order to improve the comprehensive mechanical properties of steel, the cooling rate and the temperature distribution must be controlled in the cooling process. A three-dimensional numerical model for the prediction of the heat transfer coefficient distribution of an H-beam in the controlled cooling process was developed in order to obtain a uniform temperature distribution and minimize the maximum stress and the maximum deformation after controlled cooling. An algorithm based on a simplified conjugate-gradient method was used as an optimizer to optimize the heat transfer coefficient distribution. The numerical results showed that, for the case of air cooling for 5 seconds followed by water cooling for 6 seconds with a uniform heat transfer coefficient, the cooling rate is 15.5 °C/s, the maximum temperature difference is 85 °C, the maximum stress is 125 MPa, and the maximum deformation is 1.280 mm. After optimizing the heat transfer coefficient distribution in the controlled cooling process with the same cooling time, the cooling rate increases to 20.5 °C/s, the maximum temperature difference decreases to 52 °C, the maximum stress decreases to 82 MPa, and the maximum deformation decreases to 1.167 mm.
Keywords: controlled cooling, H-beam, optimization, thermal stress
Procedia PDF Downloads 371

4581 The Trade Flow of Small Association Agreements When Rules of Origin Are Relaxed
Authors: Esmat Kamel
Abstract:
This paper aims to shed light on the extent to which the Agadir Association Agreement has fostered interregional trade between the E.U_26 and the Agadir_4 countries, once we control for the evolution of the Agadir agreement's exports to the rest of the world. The next question concerns any remarkable variation in the spatial/sectoral structure of exports, and to what extent it has been induced by the Agadir agreement itself, precisely after the adoption of rules of origin and the PANEURO diagonal cumulation scheme. The paper's empirical dataset, covering the timeframe 2000-2009, was designed to account for sector-specific export and intermediate flows; a bilateral structured gravity model was custom-tailored to capture sector- and regime-specific rules of origin; and the Poisson pseudo-maximum-likelihood estimator was used to estimate the gravity equation. The methodological approach of this work is threefold. It starts by conducting a hierarchical cluster analysis to classify final export flows showing a degree of linkage between each other; the analysis resulted in three main sectoral clusters of exports between Agadir_4 and E.U_26: cluster 1 for petrochemical-related sectors, cluster 2 for durable goods, and cluster 3 for heavy-duty machinery and spare-parts sectors. In the second step, export flows from the three clusters are subjected to treatment with diagonal rules of origin through a double-differences approach, against an equally comparable untreated control group. The third step verifies the results through a robustness check based on propensity score matching, validating that the same sectoral final export and intermediate flows increased when rules of origin were relaxed. Throughout this analysis, the interaction term combining the treatment effect and time turned out to be at least partially significant for 13 of the 17 covered sectors, and it further asserted that treatment with diagonal rules of origin contributed to increasing Agadir_4 final and intermediate exports to the E.U_26 by 335% on average, and to changing the structure and composition of Agadir_4 exports to the E.U_26 countries.
Keywords: Agadir Association Agreement, structured gravity model, hierarchical cluster analysis, double-differences estimation, propensity score matching, diagonal and relaxed rules of origin
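The PPML estimation step can be sketched with a Poisson GLM on synthetic bilateral flows; the treatment-times-post interaction plays the role of the double-differences term (covariates and coefficients are illustrative):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(18)
n = 600
# Hypothetical bilateral flows: log GDPs, log distance, treatment x post dummies
df = pd.DataFrame({
    "lgdp_o": rng.normal(10, 1, n),
    "lgdp_d": rng.normal(10, 1, n),
    "ldist":  rng.normal(7, 0.5, n),
    "treat":  rng.integers(0, 2, n),        # diagonal rules-of-origin regime
    "post":   rng.integers(0, 2, n),
})
mu = np.exp(1 + 0.8 * (df.lgdp_o - 10) + 0.7 * (df.lgdp_d - 10)
            - 1.1 * (df.ldist - 7) + 0.3 * df.treat * df.post)
df["exports"] = rng.poisson(mu)             # zeros allowed, as in real trade data

X = sm.add_constant(pd.DataFrame({
    "lgdp_o": df.lgdp_o, "lgdp_d": df.lgdp_d, "ldist": df.ldist,
    "treat_post": df.treat * df.post,       # double-difference interaction
}))
ppml = sm.GLM(df["exports"], X, family=sm.families.Poisson()).fit(cov_type="HC0")
print(ppml.params["treat_post"])            # recovers the 0.3 treatment effect
```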
Procedia PDF Downloads 319

4580 Carbohydrate Intake Estimation in Type I Diabetic Patients Described by UVA/Padova Model
Authors: David A. Padilla, Rodolfo Villamizar
Abstract:
In recent years, closed-loop control strategies have been developed in order to establish a healthy glucose profile in type 1 diabetes mellitus (T1DM) patients. However, the controller itself is unable to define a suitable reference trajectory for glucose. In this paper, a control strategy is proposed where the shape of the reference trajectory is generated based on the amount of carbohydrates present during the digestive process. Since no sensor exists to measure the amount of carbohydrates consumed, an estimator is proposed. Thus, this paper presents the entire process of designing a carbohydrate estimator, which allows the disturbance to be estimated for a model predictive controller (MPC) in a T1DM patient; the estimate is used to establish a reference profile and improve the response of the controller by providing it with the estimated information on ingested carbohydrates. The dynamics of the diabetic model used are given by the equations of the UVA/Padova model in the T1DMS simulator; the system was developed and simulated in Simulink, taking into account the noise and limitations of the glucose control system actuators.
Keywords: estimation, glucose control, predictive controller, MPC, UVA/Padova
Procedia PDF Downloads 262

4579 A Comparative Study of Additive and Nonparametric Regression Estimators and Variable Selection Procedures
Authors: Adriano Z. Zambom, Preethi Ravikumar
Abstract:
One of the biggest challenges in nonparametric regression is the curse of dimensionality. Additive models are known to overcome this problem by estimating only the individual additive effect of each covariate. However, if the model is misspecified, the accuracy of the estimator compared to the fully nonparametric one is unknown. In this work, the efficiency of completely nonparametric regression estimators, such as loess, is compared to estimators that assume additivity in several situations, including additive and non-additive regression scenarios. The comparison is done by computing the oracle mean squared error of the estimators with regard to the true nonparametric regression function. Then, a backward elimination selection procedure based on the Akaike Information Criterion (AIC) is proposed, computed from either the additive or the nonparametric model. Simulations show that if the additive model is misspecified, the percentage of time it fails to select important variables can be higher than that of the fully nonparametric approach. A dimension reduction step is included when the nonparametric estimator cannot be computed due to the curse of dimensionality. Finally, the Boston housing dataset is analyzed using the proposed backward elimination procedure, and the selected variables are identified.
Keywords: additive model, nonparametric regression, variable selection, Akaike Information Criterion
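The selection mechanics can be sketched with ordinary least squares standing in for the additive/nonparametric fits the paper actually scores (the AIC-driven backward loop is the same):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(19)
n, p = 200, 6
X = rng.normal(size=(n, p))
y = 2 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(size=n)   # only vars 0 and 2 matter

def backward_elimination_aic(X, y):
    """Drop, one at a time, the variable whose removal lowers AIC the most."""
    keep = list(range(X.shape[1]))
    best_aic = sm.OLS(y, sm.add_constant(X[:, keep])).fit().aic
    while len(keep) > 1:
        trials = [(sm.OLS(y, sm.add_constant(X[:, [v for v in keep if v != j]]))
                   .fit().aic, j) for j in keep]
        aic, j = min(trials)
        if aic >= best_aic:
            break
        best_aic, keep = aic, [v for v in keep if v != j]
    return keep

print(backward_elimination_aic(X, y))   # expect [0, 2]
```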
Procedia PDF Downloads 266