Search results for: elaboration likelihood model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16738

16648 A Spatial Approach to Model Mortality Rates

Authors: Yin-Yee Leong, Jack C. Yue, Hsin-Chung Wang

Abstract:

Human longevity has been experiencing its largest increase since the end of World War II, and modeling mortality rates is therefore the focus of many studies. Among mortality models, the Lee–Carter model is the most popular approach, since it is fairly easy to use and has good accuracy in predicting mortality rates (e.g., for Japan and the USA). However, empirical studies from several countries have shown that the age parameters of the Lee–Carter model are not constant in time. Many modifications of the Lee–Carter model have been proposed to deal with this problem, including adding an extra cohort effect or another period effect. In this study, we propose a spatial modification and use clusters to explain why the age parameters of the Lee–Carter model are not constant. In spatial analysis, clusters are areas with unusually higher or lower mortality rates than their neighbors, where the "location" of a mortality rate is measured by age and time, that is, a 2-dimensional coordinate. We use a popular cluster detection method, the spatial scan statistic, a local test based on the likelihood ratio, to evaluate whether there are locations with mortality rates that cannot be described well by the Lee–Carter model. We first use computer simulation to demonstrate that a cluster effect is a possible source of the non-constant age parameters, and then show that adding the cluster effect solves the problem. We also apply the proposed approach to mortality data from Japan, France, the USA, and Taiwan. The empirical results show that our approach yields better fits and smaller mean absolute percentage errors than the Lee–Carter model.
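For orientation, the baseline the authors modify is the standard Lee–Carter fit, log m(x,t) = a_x + b_x k_t, usually estimated via a singular value decomposition of the centered log-rate matrix. The following is a minimal sketch of that baseline on synthetic data (not the authors' code; the spatial-scan step on the residuals is not reproduced):

```python
# Minimal Lee-Carter fit via SVD on a synthetic age-by-year log-rate matrix.
import numpy as np

rng = np.random.default_rng(0)
ages, years = 20, 30
log_m = (-8.0 + 0.09 * np.arange(ages)[:, None]
         - 0.02 * np.arange(years)[None, :]
         + 0.05 * rng.standard_normal((ages, years)))

a = log_m.mean(axis=1)                       # a_x: average age profile
U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
b = U[:, 0] / U[:, 0].sum()                  # b_x: normalised so sum(b) = 1
k = s[0] * Vt[0] * U[:, 0].sum()             # k_t: period index, sum(k) ~ 0

fitted = a[:, None] + np.outer(b, k)
residual = log_m - fitted                    # residuals to scan for clusters
print("MAPE on log rates: %.4f" % np.mean(np.abs(residual / log_m)))
```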

Keywords: mortality improvement, Lee–Carter model, spatial statistics, cluster detection

Procedia PDF Downloads 142
16647 Governance and Public Policy: The Perception of Civil Society Participation in Brazil and South Africa

Authors: Paulino V. Tavares, Ana L. Romao

Abstract:

Public governance is essential for qualifying and shaping the government's decision-making process with respect to managing resources and providing public services, with transparency and the active participation of citizens, fostering a more democratic environment and stimulating social control and empowerment for the development of the collectivity. In this context, the participation of society in the elaboration, execution, and control of public policies is prominent in strengthening public governance itself. Using a multidimensional approach, two questionnaires were administered to twenty Counselors of the Courts of Auditors (Brazil), twenty Brazilian public administration professionals, twenty Government/Provincial Counselors (South Africa), and twenty South African public administration professionals. The preliminary results indicate that, in both countries, civil society participation in the elaboration, execution, and control of public policies is very low. At the same time, about 70% of the answers obtained indicate, on average, three possible paths to increase the participation of civil society. It follows that new horizons must be developed to strengthen both public policies and social participation; for this, governments and civil society in their respective countries must become aware of the real importance of this interaction.

Keywords: Brazil, civil society, participation, South Africa

Procedia PDF Downloads 111
16646 Elaboration and Characterization of Self-Compacting Mortar Based on Biopolymer

Authors: I. Djefour, M. Saidi, I. Tlemsani, S. Toubal

Abstract:

Lignin is a molecule derived from wood and also generated as waste by the paper industry. With a view to its valorization and to the protection of the environment, we are interested in its use as a superplasticizer-type admixture in mortars and concretes to improve their mechanical strength. Concrete additives have a very strong influence on the properties of fresh and/or hardened concrete. This study examines the development and use of industrial waste and of lignin extracted from a renewable natural source (wood) in cementitious materials; the use of such resources is currently enjoying a definite resurgence of interest in the development of building materials. The physico-mechanical characteristics of the mortars are determined by optimizing the quantity of the natural superplasticizer. The results show that the mechanical strength of mortars based on the natural admixture improved by 20% (64 MPa) for a W/C ratio of 0.4, and the required amount of natural admixture, in dry extract, is 40 times smaller than for a commercial admixture. The study's impact is scientific (improving the performance of the mortar, with an increase in compactness and a reduction in the quantity of water), ecological (use of the lignin waste generated by the paper industry), and economic (reduction of the cost of producing self-compacting mortars and concretes).

Keywords: biopolymer (lignin), industrial waste, mechanical resistances, self compacting mortars (SCM)

Procedia PDF Downloads 134
16645 Modelling Structural Breaks in Stock Price Time Series Using Stochastic Differential Equations

Authors: Daniil Karzanov

Abstract:

This paper studies the effect of quarterly earnings reports on stock prices. The profitability of the stock is modeled by geometric Brownian motion and by the Constant Elasticity of Variance model. We fit several variations of stochastic differential equations to the pre- and post-report periods using maximum likelihood estimation and a grid search over parameters. By examining the change in the model parameters after a report's publication, the study finds sufficient evidence that the reports constitute structural breakpoints, meaning that none of the forecast models considered remain applicable for forecasting and all should be refitted shortly after each report.
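As a hedged sketch of the simplest variant, under geometric Brownian motion the daily log returns are i.i.d. normal, so the MLE reduces to the sample mean and variance; comparing the pre- and post-report fits exposes a break. Synthetic data only, and the CEV case is omitted:

```python
# MLE of GBM drift/volatility on log returns before and after a report date.
import numpy as np

def gbm_mle(log_returns, dt=1 / 252):
    var = log_returns.var()                    # MLE of sigma^2 * dt
    sigma = np.sqrt(var / dt)
    mu = log_returns.mean() / dt + 0.5 * sigma**2
    return mu, sigma

rng = np.random.default_rng(1)
pre = rng.normal(0.0004, 0.010, size=120)      # returns before the report
post = rng.normal(0.0001, 0.022, size=120)     # higher volatility after

for label, r in [("pre-report", pre), ("post-report", post)]:
    mu, sigma = gbm_mle(r)
    print(f"{label}: mu = {mu:+.3f}, sigma = {sigma:.3f} (annualised)")
# A large shift in (mu, sigma) across the report date is evidence of a
# structural break, after which the pre-report model should be refitted.
```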

Keywords: stock market, earnings reports, financial time series, structural breaks, stochastic differential equations

Procedia PDF Downloads 161
16644 The Architectural Shaping of Multifunctional Medical Centers

Authors: Griaznova Svetlana, Umedov Mekhroz

Abstract:

The current trend in healthcare facilities is the creation of large multidisciplinary centers that provide the maximum possible range of services in one place, minimizing the number of stops along a patient's treatment path. Multifunctional medical centers are mainly designed within urban infrastructure for good accessibility. However, the many functions and connections that define the building's shape often make it inharmonious, which greatly mars the city's appearance. The purpose of the research is to substantiate scientifically the factors influencing the shaping of such buildings and to formulate principles and recommendations for the design of multifunctional medical centers. The result of the research is a set of principles for architectural and planning solutions and a determination of the factors affecting the shaping of multifunctional healthcare facilities. Research methods: study and generalization of international experience in scientific research, literature, standards, teaching aids, and design materials on the research topic; an integrated approach to the study of existing international experience of multidisciplinary medical centers; elaboration of graphical analyses and diagrams based on systematic analysis of the processed information; and identification of methods and principles of functional zoning of nuclear medicine centers.

Keywords: health care, multifunctionality, form, medical center, hospital, PET, CT scan

Procedia PDF Downloads 70
16643 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring

Authors: Daniel Fundi Murithi

Abstract:

Data from economic, social, clinical, and industrial studies are often incomplete or inexact due to censoring, and such data may have adverse effects if used naively in estimation problems. We propose the use of maximum likelihood estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are obtained under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and Newton-Raphson (NR) algorithms. These algorithms are compared because both iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that in most simulation cases, the estimates obtained using the EM algorithm had smaller biases, smaller variances, narrower confidence intervals, and smaller RMSEs than those generated via the NR algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the EM algorithm performs better than the NR algorithm in all cases under the progressive type-II censoring scheme.
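To illustrate the Newton-Raphson half of the comparison, here is a simplified sketch for the Rayleigh scale parameter, assuming a complete (uncensored) sample with the location fixed at zero; the progressive type-II censoring and EM machinery of the paper are omitted:

```python
# Newton-Raphson iteration on the Rayleigh scale s; closed-form MLE as check.
import numpy as np

rng = np.random.default_rng(2)
x = rng.rayleigh(scale=3.0, size=500)

def newton_raphson_rayleigh(x, s=1.0, tol=1e-10, max_iter=50):
    n, sx2 = len(x), np.sum(x**2)
    for _ in range(max_iter):
        score = -2 * n / s + sx2 / s**3          # dlogL/ds
        hess = 2 * n / s**2 - 3 * sx2 / s**4     # d2logL/ds2
        step = score / hess
        s -= step
        if abs(step) < tol:
            break
    return s

s_nr = newton_raphson_rayleigh(x)
s_closed = np.sqrt(np.sum(x**2) / (2 * len(x)))  # closed-form MLE
print(s_nr, s_closed)                            # should agree closely
```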

Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring

Procedia PDF Downloads 126
16642 Predictors of the Self-Reported Likelihood of Seeking Social Worker Help among People with Physical Disabilities

Authors: Maya Kagan, Michal Itzick, Patricia Tal-Katz

Abstract:

Social workers hold a variety of roles, one of which involves the care, treatment, and rehabilitation of disabled people. The current study assesses the associations of demographic factors, attitudes towards social workers, the stigma attached to seeking social worker help, perceived social support, and psychological distress with the self-reported likelihood of seeking social worker help among people with physical disabilities (PWPD) in Israel. Data were collected with structured questionnaires administered to a sample of 435 PWPD, and statistical analyses were done using SPSS. The findings suggest that women, older respondents, people with more positive attitudes towards social workers, those with higher levels of psychological distress and of social support, and those with a lower level of stigma reported a greater likelihood of seeking social worker help. The study concludes that certain avoidance factors might discourage PWPD from seeking professional social worker help. It is therefore important that social workers identify these factors and develop interventions aimed at encouraging PWPD to seek professional help in case of need, as well as practices adjusted to PWPD's unique needs.

Keywords: attitudes towards social workers, people with physical disabilities, perceived social support, psychological distress, seeking help, stigma

Procedia PDF Downloads 305
16641 Parametric Modeling for Survival Data with Competing Risks Using the Generalized Gompertz Distribution

Authors: Noora Al-Shanfari, M. Mazharul Islam

Abstract:

The cumulative incidence function (CIF) is a fundamental approach for analyzing survival data in the presence of competing risks, as it estimates the marginal probability of each competing event. Parametric modeling of the CIF has the advantage of fitting various CIF shapes and estimating the impact of covariates with maximum efficiency. To quantify covariate effects on the total CIF using a parametric model, it is essential to parametrize the baseline of the CIF. As the CIF is by nature an improper function, an improper distribution must be used in parametric models. The Gompertz distribution, which is improper, is limited in its applicability, as it only accounts for monotone hazard shapes. The generalized Gompertz distribution, however, can adapt to a wider range of hazard shapes, including unimodal, bathtub, and monotonically increasing or decreasing hazards. In this paper, the generalized Gompertz distribution is used to parametrize the baseline of the CIF, and the parameters of the proposed model are estimated using the maximum likelihood approach. The proposed model is compared with the existing Gompertz model using the Akaike information criterion, and appropriate statistical tests and model-fitting criteria are used to assess the adequacy of the model. Both models are applied to the 'colon' dataset available in the "biostat3" package in R.
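A sketch of the maximum-likelihood step, using the common El-Gohary et al. parameterization of the generalized Gompertz (an assumption; the paper may use another): f(x) = θλe^{cx} exp(-(λ/c)(e^{cx}-1)) (1-exp(-(λ/c)(e^{cx}-1)))^{θ-1}, which reduces to Gompertz at θ = 1, enabling the AIC comparison described above:

```python
# ML fit of the generalized Gompertz by direct minimization of the NLL.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, x):
    lam, c, theta = np.exp(params)          # log-parameterization keeps all > 0
    g = (lam / c) * np.expm1(c * x)         # (lam/c)(e^{cx} - 1)
    logf = (np.log(theta) + np.log(lam) + c * x - g
            + (theta - 1) * np.log1p(-np.exp(-g)))
    return -np.sum(logf)

rng = np.random.default_rng(3)
u = rng.uniform(size=400)                   # inverse-CDF sampling: lam=0.1, c=0.5, theta=2
x = np.log1p(-(0.5 / 0.1) * np.log1p(-u**(1 / 2.0))) / 0.5

fit = minimize(neg_loglik, x0=np.log([0.2, 0.2, 1.0]), args=(x,), method="Nelder-Mead")
aic = 2 * 3 + 2 * fit.fun                   # compare against the theta = 1 (Gompertz) fit
print(np.exp(fit.x), aic)
```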

Keywords: competing risks, cumulative incidence function, improper distribution, parametric modeling, survival analysis

Procedia PDF Downloads 53
16640 ML-Based Blind Frequency Offset Estimation Schemes for OFDM Systems in Non-Gaussian Noise Environments

Authors: Keunhong Chae, Seokho Yoon

Abstract:

This paper proposes frequency offset (FO) estimation schemes robust to the non-Gaussian noise for orthogonal frequency division multiplexing (OFDM) systems. A maximum-likelihood (ML) scheme and a low-complexity estimation scheme are proposed by applying the probability density function of the cyclic prefix of OFDM symbols to the ML criterion. From simulation results, it is confirmed that the proposed schemes offer a significant FO estimation performance improvement over the conventional estimation scheme in non-Gaussian noise environments.
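The paper's non-Gaussian ML criterion is not reproduced here, but the classical Gaussian-noise baseline it builds on is the cyclic-prefix correlation estimator (van de Beek et al.): the CP is a delayed copy of the symbol tail, so the phase of their correlation accumulates 2πε over N samples. A sketch on synthetic data:

```python
# CP-based ML frequency-offset estimation for OFDM (Gaussian-noise baseline).
import numpy as np

N, Ncp, eps_true = 64, 16, 0.21                     # offset in subcarrier spacings
rng = np.random.default_rng(4)

X = rng.choice([1, -1, 1j, -1j], size=N)            # QPSK data on N subcarriers
s = np.fft.ifft(X) * np.sqrt(N)
tx = np.concatenate([s[-Ncp:], s])                  # prepend cyclic prefix
n = np.arange(N + Ncp)
rx = tx * np.exp(2j * np.pi * eps_true * n / N)     # apply carrier frequency offset
rx += 0.05 * (rng.standard_normal(len(n)) + 1j * rng.standard_normal(len(n)))

# CP sample k is a copy of sample k + N; correlate them and read off the phase.
corr = np.sum(np.conj(rx[:Ncp]) * rx[N:N + Ncp])
eps_hat = np.angle(corr) / (2 * np.pi)              # valid for |eps| < 0.5
print(eps_true, round(float(eps_hat), 4))
```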

Keywords: frequency offset, cyclic prefix, maximum-likelihood, non-Gaussian noise, OFDM

Procedia PDF Downloads 440
16639 Extreme Value Modelling of Ghana Stock Exchange Indices

Authors: Kwabena Asare, Ezekiel N. N. Nortey, Felix O. Mettle

Abstract:

Modelling of extreme events has always been of interest in fields such as hydrology and meteorology. After the recent global financial crises, however, appropriate models for the rare events leading to such crises have become essential in finance and risk management as well. This paper models the extreme values of the Ghana Stock Exchange All-Share Index (2000-2010) by applying Extreme Value Theory (EVT) to fit a model to the tails of the daily stock returns. A conditional approach to EVT was preferred, so an ARMA-GARCH model was first fitted to the data to correct for autocorrelation and conditional heteroscedasticity in the returns series before the EVT method was applied. The Peak Over Threshold (POT) approach, which fits a Generalized Pareto Distribution (GPD) to excesses above a selected threshold, was employed. Maximum likelihood estimates of the model parameters were obtained, and the model's goodness of fit was assessed graphically using Q-Q, P-P, and density plots. The findings indicate that the GPD provides an adequate fit to the excesses. The sizes of extreme daily Ghanaian stock market movements were then computed using the Value at Risk (VaR) and Expected Shortfall (ES) risk measures at high quantiles, based on the fitted GPD model.
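A sketch of the POT step on synthetic heavy-tailed losses: fit a GPD to exceedances above a threshold and compute the standard POT formulas VaR_q = u + (β/ξ)[((n/N_u)(1-q))^{-ξ} - 1] and ES_q = VaR_q/(1-ξ) + (β-ξu)/(1-ξ). The ARMA-GARCH pre-filtering used in the paper is intentionally omitted:

```python
# Peak-over-threshold GPD fit with VaR/ES at the 99% level.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
losses = stats.t.rvs(df=4, size=5000, random_state=rng)  # heavy-tailed losses

u = np.quantile(losses, 0.95)                   # threshold at the 95th percentile
exc = losses[losses > u] - u
xi, _, beta = stats.genpareto.fit(exc, floc=0)  # ML fit of shape xi, scale beta

q = 0.99
n, n_u = len(losses), len(exc)
var_q = u + (beta / xi) * (((1 - q) * n / n_u) ** (-xi) - 1)
es_q = var_q / (1 - xi) + (beta - xi * u) / (1 - xi)     # requires xi < 1
print(f"threshold={u:.3f}  xi={xi:.3f}  VaR99={var_q:.3f}  ES99={es_q:.3f}")
```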

Keywords: extreme value theory, expected shortfall, generalized pareto distribution, peak over threshold, value at risk

Procedia PDF Downloads 512
16638 On Musical Information Geometry with Applications to Sonified Image Analysis

Authors: Shannon Steinmetz, Ellen Gethner

Abstract:

In this paper, a theoretical foundation is developed for patterned segmentation of audio using the geometry of music and statistical manifolds, and image content clustering using conic-space sonification is demonstrated. The algorithm takes a geodesic curve as a model estimator of the three-parameter gamma distribution, with the random variable parameterized by musical centricity and centric velocity. The model parameters predict audio segmentation, in the form of duration and frame count, based on the likelihood of a musical geometry transition. We provide an example using a database of randomly selected images, resulting in statistically significant clusters of similar image content.

Keywords: sonification, musical information geometry, image, content extraction, automated quantification, audio segmentation, pattern recognition

Procedia PDF Downloads 178
16637 An Empirical Study of the Best Fitting Probability Distributions for Stock Returns Modeling

Authors: Jayanta Pokharel, Gokarna Aryal, Netra Kanaal, Chris Tsokos

Abstract:

Investment in stocks and shares aims to seek potential gains while weighing the risk of future needs, such as retirement or children's education. Analyzing the behavior of stock market returns and making predictions is important for investors seeking to mitigate investment risk. Historically, normal-variance models have been used to describe the behavior of stock market returns. However, the returns of financial assets are actually skewed, with higher kurtosis, heavier tails, and a higher center than the normal distribution, so the Laplace distribution and its family are natural candidates for modeling stock returns. The Variance-Gamma (VG) distribution is among the most sought-after distributions for modeling asset returns and has been extensively discussed in the financial literature. In this paper, we explore other members of the Laplace family, such as the Asymmetric Laplace, Skewed Laplace, and Kumaraswamy Laplace (KS) distributions, together with the Variance-Gamma, to model the weekly returns of the S&P 500 Index and its eleven business sector indices. The method of maximum likelihood is employed to estimate the parameters of the distributions. Our empirical inquiry shows that the Kumaraswamy Laplace distribution performs much better for stock returns modeling than the other distributions used in this study; in practice, KS can serve as a strong alternative to the VG distribution.
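A sketch of the comparison workflow: fit each candidate distribution by maximum likelihood and rank by AIC. SciPy ships the normal, Laplace, and asymmetric Laplace; the Kumaraswamy Laplace and Variance-Gamma used in the paper have no SciPy implementation and would need custom log-densities, so they are not shown:

```python
# Fit candidate return distributions by MLE and compare AICs.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
returns = stats.laplace.rvs(loc=0.001, scale=0.02, size=1000, random_state=rng)

candidates = {
    "normal": stats.norm,
    "laplace": stats.laplace,
    "asymmetric laplace": stats.laplace_asymmetric,
}
for name, dist in candidates.items():
    params = dist.fit(returns)                 # numerical MLE via dist.fit
    loglik = np.sum(dist.logpdf(returns, *params))
    aic = 2 * len(params) - 2 * loglik
    print(f"{name:20s} AIC = {aic:.1f}")
```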

Keywords: stock returns, variance-gamma, kumaraswamy laplace, maximum likelihood

Procedia PDF Downloads 35
16636 Identifying Protein-Coding and Non-Coding Regions in Transcriptomes

Authors: Angela U. Makolo

Abstract:

Protein-coding and non-coding regions determine the biology of a sequenced transcriptome. Research advances have shown that non-coding regions are important in disease progression and clinical diagnosis, yet existing bioinformatics tools have targeted protein-coding regions alone, which creates challenges in gaining biological insights from transcriptome sequence data. These tools are also limited to computationally intensive sequence alignment, which is inadequate and less accurate for identifying both region types; alignment-free techniques can overcome this limitation. This study was therefore designed to develop an efficient, alignment-free model for identifying both protein-coding and non-coding regions in sequenced transcriptomes. Feature grouping and randomization procedures were applied to the input transcriptomes (37,503 data points). Successive iterations were carried out to compute the gradient vector that converged the developed Protein-coding and Non-coding Region Identifier (PNRI) model to the approximate coefficient vector. The logistic regression algorithm was used with a sigmoid activation function, and a parameter vector was estimated for every sample in the 37,503 data points to reduce the generalization error and cost. Maximum likelihood estimation (MLE) was used for parameter estimation by taking the log-likelihood of six features and combining them into a summation function. Dynamic thresholding was used to classify the protein-coding and non-coding regions, and the receiver operating characteristic (ROC) curve was determined. The generalization performance of PNRI was measured in terms of F1 score, accuracy, sensitivity, and specificity, and its average generalization performance was determined using a benchmark of multi-species organisms. The generalization error for identifying protein-coding and non-coding regions decreased from 0.514 to 0.508 and then to 0.378 over three iterations; the cost (the difference between the predicted and actual outcomes) likewise decreased from 1.446 to 0.842 and then to 0.718. The iterations terminated at the 390th epoch with an error of 0.036 and a cost of 0.316. The computed elements of the parameter vector that maximized the objective function were 0.043, 0.519, 0.715, 0.878, 1.157, and 2.575. The PNRI gave an ROC area of 0.97, indicating improved predictive ability, and identified protein-coding and non-coding regions with an F1 score of 0.970, accuracy of 0.969, sensitivity of 0.966, and specificity of 0.973. On 13 non-human multi-species model organisms, the average generalization performance of the traditional method was 74.4%, while that of the developed model was 85.2%, making the developed model better at identifying protein-coding and non-coding regions in transcriptomes. The developed model efficiently identified both region types and could be used in genome annotation and in the analysis of transcriptomes.
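The core machinery described, logistic regression trained by gradient steps on the log-likelihood with a dynamically chosen decision threshold, can be sketched as follows. Six synthetic features stand in for the sequence-derived features, and the "true" weights merely echo the parameter vector reported above for illustration:

```python
# Logistic regression by gradient ascent, plus ROC-based dynamic thresholding.
import numpy as np

rng = np.random.default_rng(7)
n, d = 2000, 6
X = rng.standard_normal((n, d))
true_w = np.array([0.04, 0.5, 0.7, 0.9, 1.2, 2.6])   # illustrative only
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

w, lr = np.zeros(d), 0.1
for epoch in range(400):
    p = sigmoid(X @ w)
    grad = X.T @ (y - p) / n          # gradient of the mean log-likelihood
    w += lr * grad                    # ascent step
    if np.linalg.norm(grad) < 1e-4:
        break

# Dynamic thresholding: choose t maximizing Youden's J = TPR - FPR.
p = sigmoid(X @ w)
def j_stat(t):
    pred = p >= t
    tpr = (pred & (y == 1)).sum() / (y == 1).sum()
    fpr = (pred & (y == 0)).sum() / (y == 0).sum()
    return tpr - fpr
best_t = max(np.linspace(0.01, 0.99, 99), key=j_stat)
print("weights:", w.round(2), "threshold:", round(float(best_t), 2))
```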

Keywords: sequence alignment-free model, dynamic thresholding classification, input randomization, genome annotation

Procedia PDF Downloads 23
16635 Application of Constructivist-Based (5E’s) Instructional Approach on Pupils’ Retention: A Case Study in Primary Mathematics in Enugu State

Authors: Ezeamagu M.U, Madu B.C

Abstract:

This study investigated the efficacy of the 5Es constructivist-based instructional model on students' retention in primary mathematics; 5Es stands for Engagement, Exploration, Explanation, Elaboration, and Evaluation. The study adopted a pretest-posttest non-equivalent control group quasi-experimental design. The sample was one hundred and thirty-four (134) pupils, seventy-six (76) male and fifty-eight (58) female, from two primary schools in the Nsukka education zone. Two intact classes, one in each of the sampled schools and comprising all the primary-four pupils, were used, and each school was randomly assigned to either the experimental or the control group. The experimental group was taught using the 5Es model, while the control group was taught using the conventional method. Two research questions guided the study, and three hypotheses were tested at p ≤ 0.05. A ten-item Fraction Achievement Test (FAT) was used to obtain data on pupils' retention. Research questions were answered using means and standard deviations, while hypotheses were tested using analysis of covariance (ANCOVA). The results revealed that the 5Es model was more effective than the conventional teaching method in enhancing pupils' performance and retention in mathematics, and that there was no significant difference in the mean retention scores of male and female students taught using the 5Es model. Based on the findings, it was recommended, among other things, that the 5Es instructional model be adopted in the teaching of mathematics at the primary level of the educational system, and that seminars, workshops, and conferences on the use of the 5Es model be mounted by professional bodies and the federal and state ministries of education, so that mathematics educators, serving teachers, students, and others can benefit from the approach.

Keywords: constructivist, education, mathematics, primary, retention

Procedia PDF Downloads 418
16634 Improvement of Piezoresistive Pressure Sensor Accuracy by Means of Current Loop Circuit Using Optimal Digital Signal Processing

Authors: Peter A. L’vov, Roman S. Konovalov, Alexey A. L’vov

Abstract:

The paper presents an advanced digital modification of the conventional current loop circuit for piezoresistive pressure transducers. Optimal DSP algorithms, which process the current loop responses by the maximum likelihood method, are applied to reduce measurement errors. The loop circuit has additional advantages, such as the ability to operate with any type of resistance or reactance sensor and a considerable increase in the accuracy and quality of measurements compared with AC bridges. The results obtained support replacing high-accuracy, expensive measuring bridges with current loop circuits.

Keywords: current loop, maximum likelihood method, optimal digital signal processing, precise pressure measurement

Procedia PDF Downloads 498
16633 Elaboration and Characterization of Silver Nanoparticles for Therapeutic and Environmental Applications

Authors: Manel Bouloudenine, Karima Djeddou, Hadjer Ben Manser, Hana Soualah Alila, Mohmed Bououdina

Abstract:

This research involves the elaboration and characterization of silver nanoparticles for therapeutic and environmental applications. The silver nanoparticles (Ag NPs) were synthesized by reducing AgNO3 under microwave irradiation. The nanoparticles were characterized using transmission electron microscopy (TEM), energy-dispersive spectroscopy (EDS), selected-area electron diffraction (SAED), UV-visible spectroscopy, and dynamic light scattering (DLS). TEM and electron diffraction confirmed the nanoscale size, shape, and crystalline quality of the as-synthesized silver nanoparticles. Elemental analysis proved the purity of the Ag NPs and the presence of the surface plasmon resonance (SPR) phenomenon. A strong absorption was observed in the visible range of the UV-visible spectrum of the as-synthesized Ag NPs, indicating the presence of metallic silver, while the strong absorption in the ultraviolet range revealed the presence of ionic Ag species and Ag aggregates. The autocorrelation function measured by DLS showed a strongly monodisperse character of the Ag NPs, indicated by the presence of a single size population with minimum and maximum lying between 40 and 111 nm. Consistent with other research, our results confirm the performance of the as-synthesized Ag NPs, which suits them to many technological applications, including therapeutic and environmental ones.

Keywords: silvers nanoparticles, microwaves, EDS, TEM

Procedia PDF Downloads 119
16632 Predictors of Behavior Modification Prior to Bariatric Surgery

Authors: Rosemarie Basile, Maria Loizos, John Pallarino, Karen Gibbs

Abstract:

Given that complications can be significant following bariatric surgery, and with rates of long-term success (measured in excess weight loss) varying as low as 33% after five years, understanding the psychological factors that influence outcomes is critical to better screening and support prior to surgery. An internally oriented locus of control (LOC) has been identified as a predictor of success in obesity therapy but has not been investigated within the context of bariatric surgery. It is hypothesized that making behavioral changes prior to surgery that mirror those required post-surgery may ultimately predict long-term success. 122 subjects participated in a clinical interview and completed self-report measures including the Multidimensional Health Locus of Control Scale, the Overeating Questionnaire (OQ), and the Lifestyle Questionnaire (LQ). Pearson correlations were computed between locus of control orientation and the likelihood of making behavior changes prior to surgery. As hypothesized, the analysis revealed a significant positive correlation between internal locus of control and the likelihood of making behavior changes (r = 0.23, p < .05): participants with a higher internal LOC believe that they are able to make decisions about their own health. Future research will focus on whether this correlation predicts long-term bariatric surgery success.
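For readers unfamiliar with the analysis, a minimal illustration of the reported test on synthetic data calibrated to give roughly r = 0.23 (illustrative only, not the study's data):

```python
# Pearson correlation between internal LOC and a behavior-change score.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n = 122
loc_internal = rng.normal(size=n)
behavior_change = 0.23 * loc_internal + np.sqrt(1 - 0.23**2) * rng.normal(size=n)

r, p = stats.pearsonr(loc_internal, behavior_change)
print(f"r = {r:.2f}, p = {p:.3f}")    # significant if p < .05
```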

Keywords: bariatric surgery, behavior modification, health locus of control, overeating questionnaire

Procedia PDF Downloads 277
16631 Psychodidactic Strategies to Facilitate Flow of Logical Thinking in Preparation of Academic Documents

Authors: Deni Stincer Gomez, Zuraya Monroy Nasr, Luis Pérez Alvarez

Abstract:

The preparation of academic documents such as theses, articles, and research projects is one of the requirements of higher education. These documents demand logical, argumentative thinking, which students experience and execute with difficulty. To mitigate these difficulties, this study designed a thesis seminar, with which the authors have seven years of experience; it is taught in a graduate program in Psychology at the National Autonomous University of Mexico. In this study the authors use the Toulmin model as a mental heuristic and apply a set of psychodidactic strategies that facilitate the development and completion of the thesis. The rate of degree completion in the groups exposed to the seminar has risen to 94%, compared with the 10% observed in cohorts that were not exposed to it. In this article the authors emphasize the psychodidactic strategies used: the Toulmin model alone does not guarantee the success achieved; a set of actions of a psychological (almost psychotherapeutic) and didactic nature on the teacher's part also seems to contribute. These are actions that derive from an understanding of the psychological, epistemological, and ontogenetic obstacles, and of the most frequent errors into which thought tends to fall when a logical course is demanded of it. The authors group the strategies into three sets: 1) strategies to facilitate logical thinking, 2) strategies to strengthen the scientific self, and 3) strategies to facilitate the act of writing the text. In this work the authors delve into each of them.

Keywords: psychodidactic strategies, logical thinking, academic documents, Toulmin model

Procedia PDF Downloads 152
16630 A Partially Accelerated Life Test Planning with Competing Risks and Linear Degradation Path under Tampered Failure Rate Model

Authors: Fariba Azizi, Firoozeh Haghighi, Viliam Makis

Abstract:

In this paper, we propose a method to model the relationship between failure time and degradation for a simple step-stress test in which the underlying degradation path is linear and different causes of failure are possible. It is assumed that the intensity function depends only on the degradation value; no assumptions are made about the distribution of the failure times. A simple step-stress test is used to shorten the failure time of products, and a tampered failure rate (TFR) model is proposed to describe the effect of the changing stress on the intensities. We assume that some of the products that fail during the test have a cause of failure that is known only to belong to a certain subset of all possible failures; this situation is known as masking. In the presence of masking, the maximum likelihood estimates (MLEs) of the model parameters are obtained through an expectation-maximization (EM) algorithm by treating the causes of failure as missing values. The effect of incomplete information on parameter estimation is studied through a Monte Carlo simulation, and a real example is analyzed to illustrate the application of the proposed methods.
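A stripped-down sketch of EM with masked causes, using two competing exponential risks in place of the paper's degradation-driven intensities (an intentional simplification; only the treatment of the cause as missing data is illustrated):

```python
# EM for two competing exponential risks with 40% of causes masked.
import numpy as np

rng = np.random.default_rng(9)
n, lam = 1000, np.array([0.02, 0.05])
t1 = rng.exponential(1 / lam[0], n)
t2 = rng.exponential(1 / lam[1], n)
t = np.minimum(t1, t2)                         # observed failure time
cause = (t2 < t1).astype(int)                  # true cause: 0 or 1
masked = rng.uniform(size=n) < 0.4             # cause unobserved for these units

est = np.array([0.01, 0.01])                   # initial guesses for (lam1, lam2)
for _ in range(200):
    # E-step: for exponential risks, P(cause 0 | masked failure) = lam1/(lam1+lam2).
    w0 = np.where(masked, est[0] / est.sum(), (cause == 0).astype(float))
    # M-step: rate_j = expected cause-j failure count / total time at risk.
    new = np.array([w0.sum(), (1 - w0).sum()]) / t.sum()
    if np.allclose(new, est, rtol=1e-10):
        break
    est = new
print(est)                                     # close to (0.02, 0.05)
```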

Keywords: cause of failure, linear degradation path, reliability function, expectation-maximization algorithm, intensity, masked data

Procedia PDF Downloads 300
16629 An Extension of the Generalized Extreme Value Distribution

Authors: Serge Provost, Abdous Saboor

Abstract:

A q-analogue of the generalized extreme value distribution which includes the Gumbel distribution is introduced. The additional parameter q allows for increased modeling flexibility. The resulting distribution can have a finite, semi-infinite or infinite support. It can also produce several types of hazard rate functions. The model parameters are determined by making use of the method of maximum likelihood. It will be shown that it compares favourably to three related distributions in connection with the modeling of a certain hydrological data set.
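The q-analogue itself is not publicly available, so only the baseline comparison implied by the abstract is sketched here: fit the generalized extreme value (GEV) distribution and its Gumbel special case (shape = 0) to block maxima by maximum likelihood and compare AICs:

```python
# ML fits of GEV and Gumbel to annual maxima, compared by AIC.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
maxima = rng.gumbel(loc=30.0, scale=5.0, size=80)   # e.g. annual peak flows

c, loc, scale = stats.genextreme.fit(maxima)        # GEV: 3 parameters
ll_gev = np.sum(stats.genextreme.logpdf(maxima, c, loc, scale))
loc_g, scale_g = stats.gumbel_r.fit(maxima)         # Gumbel: 2 parameters
ll_gum = np.sum(stats.gumbel_r.logpdf(maxima, loc_g, scale_g))

print("AIC GEV   :", 2 * 3 - 2 * ll_gev)
print("AIC Gumbel:", 2 * 2 - 2 * ll_gum)
```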

Keywords: extreme value theory, generalized extreme value distribution, goodness-of-fit statistics, Gumbel distribution

Procedia PDF Downloads 305
16628 Discovering Semantic Links Between Synonyms, Hyponyms and Hypernyms

Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo

Abstract:

This proposal aims at semantic enrichment across glossaries using the Simple Knowledge Organization System (SKOS) vocabulary to discover synonyms, hyponyms, and hypernyms semi-automatically, in Brazilian Portuguese, generating new semantic relationships based on WordNet. To evaluate the quality of the proposed model, experiments were performed on two sets containing new relations, one generated automatically and the other mapped manually by a domain expert. The evaluation metrics applied were precision, recall, F-score, and confidence interval. The results obtained demonstrate that the method is effective in the field of oil production and extraction (E&P), which suggests that it can be used to improve the quality of terminological mappings. The procedure, although complex to set up, can be reproduced in other domains.

Keywords: ontology matching, mapping enrichment, semantic web, linked data, SKOS

Procedia PDF Downloads 174
16627 Data Driven Infrastructure Planning for Offshore Wind Farms

Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree

Abstract:

The calculations done at the beginning of a wind farm's life are rarely reliable, which makes it important to study the failure and repair rates of wind turbines under various conditions. This unreliability arises because current models make the simplifying assumption that the failure/repair rate remains constant over time, which implies an exponential reliability function. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data, which are then converted to time-to-repair and time-to-failure time series. Several mathematical functions are fitted to the time-to-failure and time-to-repair data of the wind turbine components, using maximum likelihood estimation and, for Bayesian parameter estimation, the posterior expectation method. Initial results indicate that the two-parameter Weibull and the exponential function produce almost identical results. Further analysis is being done at the complex-system level, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease turbine downtime.
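A sketch of the distribution-fitting step on synthetic failure times: fit two-parameter Weibull and exponential models by maximum likelihood and compare log-likelihoods. With a Weibull shape near 1 the two fits are nearly identical, matching the initial finding above:

```python
# ML fits of Weibull vs. exponential to times-to-failure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
ttf = rng.weibull(1.05, size=300) * 180.0          # synthetic times to failure, days

k, _, lam = stats.weibull_min.fit(ttf, floc=0)     # shape k, scale lam
ll_weib = np.sum(stats.weibull_min.logpdf(ttf, k, 0, lam))
_, mean = stats.expon.fit(ttf, floc=0)
ll_exp = np.sum(stats.expon.logpdf(ttf, 0, mean))

print(f"Weibull: shape={k:.2f}, scale={lam:.1f}, logL={ll_weib:.1f}")
print(f"Exponential: mean={mean:.1f}, logL={ll_exp:.1f}")
```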

Keywords: reliability, bayesian parameter inference, maximum likelihood estimation, weibull function, SCADA data

Procedia PDF Downloads 26
16626 Improving Flash Flood Forecasting with a Bayesian Probabilistic Approach: A Case Study on the Posina Basin in Italy

Authors: Zviad Ghadua, Biswa Bhattacharya

Abstract:

The Flash Flood Guidance (FFG) gives the rainfall amount of a given duration necessary to cause flooding. The approach is based on the development of rainfall-runoff curves, which help determine the rainfall amount that would cause flooding. An alternative approach, mostly tested on Italian Alpine catchments, is based on determining threshold discharges from past events and on checking whether an oncoming flood will exceed the critical discharge thresholds found beforehand. Both approaches suffer from large uncertainties in forecasting flash floods, as, owing to the simplistic approach followed, the same rainfall amount may or may not cause flooding. This uncertainty raises the question of whether a probabilistic model is preferable to a deterministic one in forecasting flash floods. We propose the use of a Bayesian probabilistic approach to flash flood forecasting. A prior probability of flooding is derived from historical data. Additional information, such as the antecedent moisture condition (AMC) and rainfall amounts over given thresholds, is used to compute the likelihood of observing these conditions given that a flash flood has occurred. Finally, the posterior probability of flooding is computed from the prior probability and the likelihood. The variation of the computed posterior probability with rainfall amount and AMC demonstrates the suitability of the approach for decision making in an uncertain environment. The methodology has been applied to the Posina basin in Italy, and from the promising results obtained we conclude that the Bayesian approach provides more realistic flash flood forecasts than the FFG.
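A small numerical sketch of the Bayesian update described above, with purely illustrative probabilities (assumptions, not the paper's values):

```python
# Posterior probability of a flash flood via Bayes' rule.
def posterior_flood(prior, lik_given_flood, lik_given_no_flood):
    """P(flood | rainfall exceedance, wet AMC)."""
    num = lik_given_flood * prior
    return num / (num + lik_given_no_flood * (1 - prior))

prior = 0.05            # historical frequency of flash floods for such events
p_obs_flood = 0.80      # P(rainfall > threshold, wet AMC | flood)
p_obs_no_flood = 0.10   # P(rainfall > threshold, wet AMC | no flood)

print(posterior_flood(prior, p_obs_flood, p_obs_no_flood))   # ~ 0.30
```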

Keywords: flash flood, Bayesian, flash flood guidance, FFG, forecasting, Posina

Procedia PDF Downloads 102
16625 Brand Management Model in Professional Football League

Authors: Vajiheh Javani

Abstract:

The study examines brand image in Iran's professional football league (2014-2015). The study was a descriptive survey: a sample of Iranian professional football league fans (N=911) responded to a four-item questionnaire. A structural equation model (SEM) with maximum likelihood estimation was used to test the relationships among the research variables. The analyses showed that three dimensions of brand image influenced fans' brand loyalty, of which attitude was the most important; benefits and attributes ranked second and third, respectively. According to the results, brand image plays a pivotal role in Iranian fans' brand loyalty: creating an attractive and desirable brand image in fans' minds increases brand loyalty and, moreover, increases revenue and profits through ticket sales and club merchandise while attracting more sponsors.

Keywords: brand management, sport industry, brand image, fans

Procedia PDF Downloads 308
16624 Frequency Offset Estimation Schemes Based on ML for OFDM Systems in Non-Gaussian Noise Environments

Authors: Keunhong Chae, Seokho Yoon

Abstract:

In this paper, frequency offset (FO) estimation schemes robust to the non-Gaussian noise environments are proposed for orthogonal frequency division multiplexing (OFDM) systems. First, a maximum-likelihood (ML) estimation scheme in non-Gaussian noise environments is proposed, and then, the complexity of the ML estimation scheme is reduced by employing a reduced set of candidate values. In numerical results, it is demonstrated that the proposed schemes provide a significant performance improvement over the conventional estimation scheme in non-Gaussian noise environments while maintaining the performance similar to the estimation performance in Gaussian noise environments.

Keywords: frequency offset estimation, maximum-likelihood, non-Gaussian noise environment, OFDM, training symbol

Procedia PDF Downloads 321
16623 Tenants Use Less Input on Rented Plots: Evidence from Northern Ethiopia

Authors: Desta Brhanu Gebrehiwot

Abstract:

The study investigates the impact of land tenure arrangements on fertilizer use per hectare in Northern Ethiopia, using household- and plot-level data. Land tenure contracts such as sharecropping and fixed-rent arrangements are endogenous, since different unobservable characteristics may affect renting-out decisions. The appropriate method of analysis was therefore instrumental variable estimation, and the family of instrumental variable estimators, two-stage least squares (2SLS), the generalized method of moments (GMM), limited information maximum likelihood (LIML), and instrumental variable Tobit (IV-Tobit), was used. In addition, a two-step method for handling a binary endogenous variable was applied, in which the first-step probit model includes the instruments and the second step uses maximum likelihood estimation (the "etregress" command in Stata 14). Fertilizer use per hectare was lower on sharecropped and fixed-rented plots than on owner-operated plots, a result that supports the Marshallian inefficiency principle for sharecropping. The difference in fertilizer use could be explained by a lack of incentive-compatible contract terms: most sharecropping arrangements share output equally between tenant and landlord, whereas giving a larger share of the output to the tenant would motivate more fertilizer use on rented plots to maximize production.
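A hedged sketch of the 2SLS step using the `linearmodels` Python package (an assumed tool choice; the paper used Stata), with a hypothetical instrument: any variable that shifts the renting-out decision without directly affecting fertilizer use would play this role:

```python
# 2SLS on synthetic plot data: rented tenure is endogenous via quality u.
import numpy as np
import pandas as pd
from linearmodels.iv import IV2SLS

rng = np.random.default_rng(12)
n = 800
z = rng.normal(size=n)                           # instrument (hypothetical)
u = rng.normal(size=n)                           # unobserved plot/household quality
rented = (0.8 * z + u + rng.normal(size=n) > 0).astype(float)    # endogenous
fert = 120 - 25 * rented + 10 * u + rng.normal(scale=5, size=n)  # kg/ha

df = pd.DataFrame({"fert": fert, "rented": rented, "z": z, "const": 1.0})
res = IV2SLS(df["fert"], df[["const"]], df["rented"], df[["z"]]).fit()
print(res.params)     # 2SLS recovers ~ -25; OLS would be biased upward by u
```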

Keywords: tenure-contracts, endogeneity, plot-level data, Ethiopia, fertilizer

Procedia PDF Downloads 49
16622 Time of Week Intensity Estimation from Interval Censored Data with Application to Police Patrol Planning

Authors: Jiahao Tian, Michael D. Porter

Abstract:

Law enforcement agencies are tasked with crime prevention and crime reduction under limited resources, and an accurate temporal estimate of the crime rate would be valuable in achieving these goals. However, estimation is usually complicated by the interval-censored nature of crime data: the event time is known only to lie within an interval rather than being observed exactly. Such data are prevalent in criminology because of the absence of victims for certain types of crime, and despite its importance, research on the temporal analysis of crime has lagged behind the spatial component. Anticipating where and when crimes might occur is a key element of successful policing strategies, so, inspired by the success of statistical approaches to crime-related problems, we propose a statistical model for the temporal intensity estimation of crime with censored data. We cast the problem as a Poisson regression, with two special penalty terms added to the likelihood that provide smoothness over the time of day and the day of the week, and we derive an EM algorithm to obtain maximum likelihood estimates. The approach provides accurate intensity estimates, can uncover day-of-week clusters that share the same intensity patterns, and shows superior performance to the competing model. Our research is in line with the Smart Policing Initiative (SPI) of the Bureau of Justice Assistance (BJA), an effort to support law enforcement agencies in building evidence-based, data-driven law enforcement tactics that are effective in crime prevention and reduction. In our case, agencies can deploy their resources for a relatively short period to achieve the maximum level of crime reduction: by analyzing a particular area within a city where data are available, the proposed approach provides not only an accurate intensity estimate for the time unit considered but also a time-varying crime incidence pattern. Both help in allocating limited resources, whether by improving the existing patrol plan in light of the discovered day-of-week clusters or by directing extra resources where available.
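A sketch of the E-step/M-step for interval-censored event times: each event is known only to lie in a window of hours, so its single count is spread over the window in proportion to the current intensity estimate, and the intensity is then re-estimated from the fractional counts. The paper's smoothness penalties are omitted here:

```python
# EM for an hour-of-week Poisson intensity with interval-censored events.
import numpy as np

H = 168                                        # hours in a week
rng = np.random.default_rng(13)
true_rate = 1.0 + 0.8 * np.sin(2 * np.pi * np.arange(H) / 24)

counts = rng.poisson(true_rate * 52)           # one year of weekly exposure
events = np.repeat(np.arange(H), counts)       # true (unobserved) hours
starts = np.maximum(events - rng.integers(0, 8, size=len(events)), 0)
ends = np.minimum(starts + 8, H)               # each event known in [start, end)

lam = np.ones(H)
for _ in range(100):
    expected = np.zeros(H)
    for s, e in zip(starts, ends):
        w = lam[s:e] / lam[s:e].sum()          # E-step: allocate the event
        expected[s:e] += w
    new = expected / 52                        # M-step: counts per week of exposure
    if np.max(np.abs(new - lam)) < 1e-6:
        break
    lam = new
print(np.round(lam[:24], 2))                   # recovered daily pattern
```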

Keywords: cluster detection, EM algorithm, interval censoring, intensity estimation

Procedia PDF Downloads 38
16621 Financial Market Reaction to Non-Financial Reports

Authors: Petra Dilling

Abstract:

This study examines the market reaction to the publication of integrated reports for a sample of 316 global companies for the 2018 reporting year. Applying event study methodology, we find significant cumulative average abnormal returns (CAARs) after the publication date. To ensure robust estimation results, the Fama-French three-factor model is used, as well as a market-adjusted model, a CAPM, and a Fama-French model taking GARCH effects into account. We find a significant positive CAAR after the publication day of the integrated report. Our results suggest that investors react to the information provided in the integrated report, and that they react to it differently than to the annual financial report. Furthermore, our cross-sectional analysis confirms that companies with a significant positive cumulative average abnormal return share certain characteristics; for example, European companies are more likely to experience a stronger significant positive market reaction to their integrated report publication.
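A sketch of the event-study core: estimate a factor model over a pre-event window, compute abnormal returns around the publication date, and cumulate their cross-sectional average into a CAAR. One market factor and three synthetic firms stand in for the paper's multi-factor and GARCH variants:

```python
# Market-model event study: abnormal returns and CAAR over days -5..+5.
import numpy as np

rng = np.random.default_rng(14)
n_firms, est_win, evt_win = 3, 200, 11
mkt = rng.normal(0.0003, 0.01, est_win + evt_win)
beta_true = np.array([0.8, 1.0, 1.3])
rets = beta_true[:, None] * mkt + rng.normal(0, 0.012, (n_firms, est_win + evt_win))
rets[:, est_win + 5] += 0.01                    # "reaction" on the event day (day 0)

ars = np.empty((n_firms, evt_win))
for i in range(n_firms):
    X = np.column_stack([np.ones(est_win), mkt[:est_win]])
    alpha, beta = np.linalg.lstsq(X, rets[i, :est_win], rcond=None)[0]
    expected = alpha + beta * mkt[est_win:]
    ars[i] = rets[i, est_win:] - expected       # abnormal returns, days -5..+5

caar = ars.mean(axis=0).cumsum()                # cumulative average abnormal return
print(np.round(caar, 4))
```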

Keywords: integrated report, event methodology, cumulative abnormal return, sustainability, CAPM

Procedia PDF Downloads 113
16620 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluating goodness-of-fit and comparing several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, data simulated under the fitted model are compared with the actual data. Predictive accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction that penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive CV variant, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to exact LOO-CV; they utilise existing MCMC results and so avoid expensive computation. The reciprocals of the predictive densities, calculated over posterior draws for each observation, are treated as the raw importance weights, which are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO the raw weights are used directly; in TIS-LOO and PSIS-LOO the larger weights are replaced by modified, truncated weights. Although information criteria and LOO-CV cannot reflect goodness-of-fit in an absolute sense, their differences can be used to measure the relative performance of the models of interest; however, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data: four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has practical limitations, and even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of exact LOO-CV, the study observed some drastic deviations in the results. There are, however, some interesting relationships among the logarithms of the pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods: the estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model, and parallel log-likelihood profiles were observed for the models conditional on equal posterior variances in lppds. This study illustrates the limitations of the information criteria in practical model comparison problems, discusses the relationships among the LOO-CV approximation methods and WAIC together with their limitations, and provides useful recommendations that may help in practical model comparisons with these methods.
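A sketch of how WAIC and raw IS-LOO are computed from a matrix of pointwise log-likelihoods log p(y_i | θ_s) over S posterior draws; a toy normal-mean model stands in for the stutter-ratio regressions (the truncation and Pareto smoothing of TIS/PSIS are not shown):

```python
# WAIC and importance-sampling LOO from a pointwise log-likelihood matrix.
import numpy as np
from scipy.special import logsumexp

rng = np.random.default_rng(15)
n, S = 50, 4000
y = rng.normal(1.0, 0.5, size=n)
mu_draws = rng.normal(y.mean(), 0.5 / np.sqrt(n), size=S)   # stand-in posterior
loglik = (-0.5 * np.log(2 * np.pi * 0.25)
          - (y[None, :] - mu_draws[:, None])**2 / (2 * 0.25))   # shape (S, n)

# WAIC: lppd minus the effective-parameter penalty p_waic.
lppd = np.sum(logsumexp(loglik, axis=0) - np.log(S))
p_waic = np.sum(np.var(loglik, axis=0, ddof=1))
waic = -2 * (lppd - p_waic)

# IS-LOO: raw importance weights are the reciprocals of the pointwise densities,
# giving the harmonic-mean estimate of each leave-one-out predictive density.
loo_i = -(logsumexp(-loglik, axis=0) - np.log(S))
is_loo = -2 * np.sum(loo_i)
print(f"WAIC = {waic:.1f}   IS-LOO deviance = {is_loo:.1f}")
```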

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 360
16619 Modeling of Traffic Turning Movement

Authors: Michael Tilahun Mulugeta

Abstract:

Pedestrians are the most vulnerable road users, as they are the most exposed to the risk of collision. Pedestrian safety at road intersections remains a vital and unsolved issue in Addis Ababa, Ethiopia, and one of its critical points is the conflict between turning vehicles and pedestrians at unsignalized intersections. A better understanding of the factors that affect the likelihood of such conflicts would help direct countermeasures aimed at reducing the number of crashes. This paper explores a model describing the relation between traffic conflicts and their influencing factors using multiple linear regression. The main focus is the interaction of turning (left and right) vehicles with pedestrians at unsignalized intersections; the specific objectives are to determine the factors that affect the number of potential conflicts and to develop a model of potential conflict.
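A minimal sketch of the regression described above, with hypothetical predictors (turning volume, pedestrian volume, approach speed); the paper's actual variable list may differ:

```python
# Multiple linear regression of conflict counts on intersection predictors.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(16)
n = 60                                          # observed intersection-periods
turn_vol = rng.uniform(50, 400, n)              # turning vehicles per hour
ped_vol = rng.uniform(20, 300, n)               # pedestrians per hour
speed = rng.uniform(15, 45, n)                  # approach speed, km/h
conflicts = (0.02 * turn_vol + 0.03 * ped_vol + 0.15 * speed
             + rng.normal(0, 3, n))

X = sm.add_constant(np.column_stack([turn_vol, ped_vol, speed]))
model = sm.OLS(conflicts, X).fit()
print(model.params)                             # intercept and three slopes
```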

Keywords: potential, regression analysis, pedestrian, conflicts

Procedia PDF Downloads 18