Search results for: elaboration likelihood model theory
20585 New Estimation in Autoregressive Models with Exponential White Noise by Using Reversible Jump MCMC Algorithm
Authors: Suparman Suparman
Abstract:
White noise in an autoregressive (AR) model is often assumed to be normally distributed. In applications, however, the white noise often does not follow a normal distribution. This paper aims to estimate the parameters of an AR model with exponential white noise. A Bayesian method is adopted: a prior distribution for the parameters of the AR model is selected and combined with the likelihood function of the data to obtain a posterior distribution. Based on this posterior distribution, a Bayesian estimator for the parameters of the AR model is derived. Because the order of the AR model is itself treated as a parameter, this Bayesian estimator cannot be calculated explicitly. To resolve this problem, a reversible jump Markov chain Monte Carlo (MCMC) method is adopted. As a result, the order and the parameters of the AR model can be estimated simultaneously.
Keywords: autoregressive (AR) model, exponential white noise, Bayesian, reversible jump Markov chain Monte Carlo (MCMC)
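The core constraint can be sketched numerically. The following is a minimal illustration (not the paper's reversible-jump sampler, and with a fixed order rather than an estimated one): for an AR(1) process with exponential white noise, all residuals must be nonnegative, so the maximum likelihood estimate of the AR coefficient is the smallest observed ratio y_t / y_{t-1}, and the exponential rate follows from the mean residual. The seed, sample size, and parameter values are illustrative.

```python
import random

# Simulate an AR(1) process y_t = a*y_{t-1} + e_t with Exp(lam) white noise.
random.seed(42)
a_true, lam_true, n = 0.5, 1.0, 2000
y = [1.0]
for _ in range(n):
    y.append(a_true * y[-1] + random.expovariate(lam_true))

# Under exponential noise every residual y_t - a*y_{t-1} must be >= 0,
# so the MLE of the AR coefficient is the smallest observed ratio.
a_hat = min(y[t] / y[t - 1] for t in range(1, n + 1))

# The MLE of the exponential rate is the reciprocal of the mean residual.
resid = [y[t] - a_hat * y[t - 1] for t in range(1, n + 1)]
lam_hat = n / sum(resid)

print(a_hat, lam_hat)
```

With this seed the coefficient estimate sits just above the true value, as expected for a minimum-ratio estimator.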
Procedia PDF Downloads 356
20584 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models
Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah
Abstract:
In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices from developed and emergent stock markets as the basis, we argue that incorporating stochastic volatility into the time-varying parameter estimation significantly improves forecasting performance via maximum likelihood estimation. Using this technique, we observe the long-memory behavior of the data sets and obtain one-step-ahead predicted log-volatility with ±2 standard errors despite the observed noise varying from a normal mixture distribution, because the financial data studied are not fully Gaussian. Moreover, the Ornstein-Uhlenbeck process used in this work simulates the financial time series well, and the estimation algorithm scales to large data sets owing to its good convergence properties.
Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model
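As a minimal illustration of the estimation idea (a sketch under simplified assumptions, not the authors' algorithm), an Ornstein-Uhlenbeck process observed at a fixed time step is an AR(1) process in disguise, so its mean-reversion rate can be recovered by least squares on successive observations. All parameter values below are illustrative.

```python
import math
import random

# Simulate an OU process dX = theta*(mu - X) dt + sigma dW via its exact
# discretization: X_{t+dt} = mu + (X_t - mu)*b + eps, with b = exp(-theta*dt).
random.seed(7)
theta, mu, sigma, dt, n = 2.0, 0.0, 1.0, 0.1, 5000
b = math.exp(-theta * dt)
s = sigma * math.sqrt((1 - b * b) / (2 * theta))  # step-noise std dev
x = [0.0]
for _ in range(n):
    x.append(mu + (x[-1] - mu) * b + random.gauss(0.0, s))

# OLS slope of X_{t+1} on X_t estimates b, hence theta = -ln(b)/dt.
xs, ys = x[:-1], x[1:]
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((u - mx) * (v - my) for u, v in zip(xs, ys))
var = sum((u - mx) ** 2 for u in xs)
b_hat = cov / var
theta_hat = -math.log(b_hat) / dt
print(theta_hat)
```

The recovered mean-reversion rate should be close to the true theta = 2 for a sample of this length.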
Procedia PDF Downloads 242
20583 A Systemic Maturity Model
Authors: Emir H. Pernet, Jeimy J. Cano
Abstract:
Maturity models, used descriptively to explain changes in reality or normatively to guide managers in making interventions that render organizations more effective and efficient, are based on the principles of statistical quality control promulgated by Shewhart in the 1930s and on the PDCA (Plan, Do, Check, Act) continuous-improvement principles developed by Deming and Juran. Frameworks built on the concept of maturity models include COBIT, CMM, and ITIL. This paper presents some limitations of traditional maturity models, most of them drawn from points of reflection and analysis raised by other authors. Almost all of these limitations relate to the mechanistic and reductionist principles on which those models are built. As systems theory aids the understanding of the dynamics of organizations and organizational change, developing a systemic maturity model can help overcome some of those limitations. This document proposes a systemic maturity model based on a systemic conceptualization of organizations, focused on the study of the functioning of the parts, the relationships among them, and their behavior as a whole. Maturity, from the systems-theory perspective, is conceptually defined as an emergent property of the organization that arises from the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of an organization, and finally validated by measuring maturity in organizations. For its operationalization and validation, the model was applied to measure the maturity of organizational Governance, Risk and Compliance (GRC) processes.
Keywords: GRC, maturity model, systems theory, viable system model
Procedia PDF Downloads 312
20582 Recovery of Petroleum Reservoir by Waterflooding Technique
Authors: Zabihullah Mahdi, Khwaja Naweed Seddiqi, Shigeo Honma
Abstract:
Research and practical studies have established that the average oil recovery factor of a petroleum reservoir is about 30 to 35%. This study focuses on enhanced oil recovery through laboratory experiments and graphical investigation based on the Buckley-Leverett theory. Horizontal displacement of oil by water in a petroleum reservoir is analyzed under the Buckley-Leverett frontal displacement theory, pursued with a focus on the key factors that control displacement. The theory is applicable to the waterflooding method, which is generally employed in petroleum reservoirs to sustain oil recovery, and techniques for evaluating the average water saturation behind the water front and the oil recovery factor in the reservoir are presented. In this paper, the Buckley-Leverett theory is applied to an experimental model, and the amount of recoverable oil is found to exceed 35%. The irreducible water saturation, i.e., the connate water saturation, in the reservoir is also a significant influence on the recovery.
Keywords: Buckley-Leverett theory, waterflooding technique, petroleum engineering, immiscible displacement
Procedia PDF Downloads 259
20581 A Unified Theory of the Primary Psychological and Social Sciences
Authors: George McMillan
Abstract:
This paper introduces the methodology to create a baseline equation for the philosophical and social sciences in the behavioral-political-economic-demographic sequence. The two major ideological political-economic philosophies (Hume-Smith and Marx-Engels) are systematized into competing integrated three-dimensional behavioral-political-economic models. The paper argues that Hume-Smith's empathy-sympathy behavioral assumptions are a sufficient starting point for creating the integrated causal model sought by Tooby and Cosmides. The author then shows that the prerequisite advances in psychology and demographic studies now exist to generate the universal economic theory sought by von Neumann and Morgenstern and the integrated behavioral-economic method of Gintis: a psychological (i.e., behavioral) socio-economic model. By updating Hume-Smith's work with a modern understanding of psychology, as presented by Fromm and others, a new integrated societal model as postulated by Harsanyi can be created that intertwines the social and psychological sciences. The author argues that this fundamentally psychology-based model can also serve as a baseline equation for all social sciences, as desired by Kant and Mach, as well as the ahistorical (psychological) philosophic model noted by Husserl, Heidegger, Tillich, and Strauss. The author concludes with a discussion of the next steps necessary to generate a detailed model that fuses these disciplines.
Keywords: unified social theory
Procedia PDF Downloads 378
20580 Classification of Health Risk Factors to Predict the Risk of Falling in Older Adults
Authors: L. Lindsay, S. A. Coleman, D. Kerr, B. J. Taylor, A. Moorhead
Abstract:
Cognitive decline and frailty are apparent in older adults, leading to an increased risk of falling. Currently, health care professionals must make professional judgments about such risks, and hence difficult decisions regarding the future welfare of the ageing population. This study uses health data from The Irish Longitudinal Study on Ageing (TILDA), focusing on adults over 50 years of age, to analyse health risk factors and predict the likelihood of falls. The prediction is based on machine learning algorithms in which health risk factors are used as inputs to predict the likelihood of falling. Initial results show that health risk factors such as long-term health issues contribute to the number of falls. Identifying such health risk factors has the potential to inform health and social care professionals, older people, and their family members so that daily living risks can be mitigated.
Keywords: classification, falls, health risk factors, machine learning, older adults
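A toy version of such a classifier can be sketched as follows. The risk factors, weights, and data below are synthetic stand-ins (the TILDA variables are not reproduced here), and a from-scratch logistic regression is only one of many machine learning algorithms that could be used.

```python
import math
import random

# Synthetic data: three binary risk factors (long-term illness, age > 75,
# prior fall) with a known underlying effect on the fall probability.
random.seed(1)
TRUE_W, TRUE_B = [1.5, 1.0, 2.0], -2.0
X, y = [], []
for _ in range(1000):
    x = [float(random.random() < 0.4),
         float(random.random() < 0.3),
         float(random.random() < 0.2)]
    p = 1 / (1 + math.exp(-(sum(w * v for w, v in zip(TRUE_W, x)) + TRUE_B)))
    X.append(x)
    y.append(1.0 if random.random() < p else 0.0)

# Plain full-batch gradient-descent logistic regression.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.1
for _ in range(300):
    gw, gb = [0.0] * 3, 0.0
    for xi, yi in zip(X, y):
        p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
        for j in range(3):
            gw[j] += (p - yi) * xi[j]
        gb += p - yi
    w = [wj - lr * gj / len(X) for wj, gj in zip(w, gw)]
    b -= lr * gb / len(X)

pred = [1.0 if 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b))) > 0.5
        else 0.0 for xi in X]
acc = sum(p == t for p, t in zip(pred, y)) / len(y)
print(acc)
```

The fitted weights recover the direction of each risk factor's contribution; in practice, a library implementation and a held-out test set would replace this sketch.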
Procedia PDF Downloads 150
20579 Selection of Appropriate Classification Technique for Lithological Mapping of Gali Jagir Area, Pakistan
Authors: Khunsa Fatima, Umar K. Khattak, Allah Bakhsh Kausar
Abstract:
Satellite image interpretation and analysis assist geologists by providing valuable information about the geology and minerals of an area to be surveyed. A test site in Fatejang, district Attock, was studied using Landsat ETM+ and ASTER satellite images for lithological mapping. Five supervised image classification techniques, namely maximum likelihood, parallelepiped, minimum distance to mean, Mahalanobis distance, and spectral angle mapper, were applied to both satellite images to determine the most suitable classification technique for lithological mapping in the study area. The results of these five techniques were compared with the geological map produced by the Geological Survey of Pakistan. The maximum likelihood classification applied to the ASTER image has the highest correlation with the geological map, at 0.66. Field observations and XRD spectra of field samples also verified the results. A lithological map was then prepared based on the maximum likelihood classification of the ASTER image.
Keywords: ASTER, Landsat-ETM+, satellite, image classification
Procedia PDF Downloads 396
20578 A Game-Theory-Based Price-Optimization Algorithm for the Simulation of Markets Using Agent-Based Modelling
Authors: Juan Manuel Sanchez-Cartas, Gonzalo Leon
Abstract:
A price competition algorithm for agent-based models (ABMs), based on game theory principles, is proposed for simulating theoretical market models. The algorithm is applied to the classical Hotelling model and to a two-sided market model to show that it leads to the optimal behavior predicted by the theoretical models. When the theoretical models fail to predict the equilibrium, however, the algorithm is still capable of reaching a feasible outcome. The results highlight that the algorithm can be implemented in other simulation models to guarantee rational users and endogenous optimal behaviors. It can also be applied as a verification tool, given that it is theoretically grounded.
Keywords: agent-based models, algorithmic game theory, multi-sided markets, price optimization
Procedia PDF Downloads 458
20577 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of an optimization, and this optimization is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred from a statistical analysis of the fluctuations over time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches have emerged; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using both real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of applying QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; perhaps this model will reveal further insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
Procedia PDF Downloads 469
20576 Applying the Quad Model to Estimate the Implicit Self-Esteem of Patients with Depressive Disorders: Comparing the Psychometric Properties with the Implicit Association Test Effect
Authors: Yi-Tung Lin
Abstract:
Researchers commonly assess implicit self-esteem with the Implicit Association Test (IAT). The IAT's measure, often referred to as the IAT effect, indicates the strength of automatic preference for the self relative to others and is often considered an index of implicit self-esteem. However, according to dual-process theory, the IAT does not rely entirely on automatic processing; it is also influenced by controlled processing. The present study therefore analyzed IAT data with the Quad model, separating four processes underlying IAT performance: the likelihood that an automatic association is activated by the stimulus in a trial (AC); that a correct response is discriminated in the trial (D); that the automatic bias is overcome in favor of a deliberate response (OB); and that, when the association is not activated and the individual fails to discriminate a correct answer, a guessing or response bias drives the response (G). The AC and G processes are automatic, while the D and OB processes are controlled. The AC parameter is taken as the strength of the association activated by the stimulus, which reflects what implicit measures of social cognition aim to assess: the stronger the automatic association between the self and positive valence, the more likely it is to be activated by a relevant stimulus. Therefore, the AC parameter was used as the index of implicit self-esteem in the present study. Meanwhile, the relationship between implicit self-esteem and depression has not been fully investigated. The cognitive theory of depression assumes that a negative self-schema is crucial in depression; from this point of view, implicit self-esteem should be negatively associated with depression. However, the results of empirical studies are inconsistent.
The aims of the present study were to examine the psychometric properties of the AC parameter (i.e., its test-retest reliability and its correlations with explicit self-esteem and depression) and to compare them with those of the IAT effect. In the present study, 105 patients with depressive disorders completed the Rosenberg Self-Esteem Scale, the Beck Depression Inventory-II, and the IAT at pretest. After at least three weeks, the participants completed the second IAT. The data were analyzed with a latent-trait multinomial processing tree model (latent-trait MPT) using the TreeBUGS package in R. The latent-trait MPT showed a satisfactory model fit. The test-retest reliability of the AC parameter was of medium effect size (r = .43, p < .0001), and that of the IAT effect was small (r = .29, p < .01). Only the AC parameter showed a significant correlation with explicit self-esteem (r = .19, p < .05). Neither index was correlated with depression. Collectively, the AC parameter was a more satisfactory index of implicit self-esteem than the IAT effect. The present study also supports the finding that implicit self-esteem is not correlated with depression.
Keywords: cognitive modeling, implicit association test, implicit self-esteem, quad model
Procedia PDF Downloads 128
20575 A Study of Mode Choice Model Improvement Considering Age Grouping
Authors: Young-Hyun Seo, Hyunwoo Park, Dong-Kyu Kim, Seung-Young Kho
Abstract:
The purpose of this study is to provide an improved mode choice model with parameters that account for the age grouping of prime-aged and older travelers. Data from the 2010 Household Travel Survey were used, and improper samples were removed through analysis. The chosen alternative, date of birth, mode, origin code, destination code, departure time, and arrival time were taken from the survey. By preprocessing the data, travel time, travel cost, mode, and the ratios of people aged 45 to 55 years, 55 to 65 years, and over 65 years were calculated. After this manipulation, the mode choice model was constructed in LIMDEP by maximum likelihood estimation. A significance test was conducted for nine parameters: three age groups for each of three modes. The test was then repeated for the mode choice model with the significant parameters, the travel cost variable, and the travel time variable. The model estimation shows that as age increases, the preference for the car decreases and the preference for the bus increases. This study is meaningful in that individual and household characteristics are applied to the aggregate model.
Keywords: age grouping, aging, mode choice model, multinomial logit model
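The multinomial logit structure behind such a mode choice model can be sketched as follows. The utility coefficients, alternative-specific constants, and attribute values are purely illustrative, not the study's LIMDEP estimates.

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit choice probabilities: softmax of systematic utilities."""
    exps = [math.exp(u) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative systematic utilities V = asc + b_time*time + b_cost*cost
# for car, bus, and metro (all numbers made up for the sketch).
b_time, b_cost = -0.05, -0.002
modes = {
    "car":   1.0 + b_time * 30 + b_cost * 500,
    "bus":   0.0 + b_time * 45 + b_cost * 120,
    "metro": 0.2 + b_time * 40 + b_cost * 150,
}
probs = dict(zip(modes, mnl_probabilities(list(modes.values()))))
print(probs)
```

In an estimated model, the coefficients (including age-group interaction terms) would come from maximum likelihood estimation rather than being set by hand.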
Procedia PDF Downloads 322
20574 An Empirical Investigation of Mobile Banking Services Adoption in Pakistan
Authors: Aijaz A. Shaikh, Richard Glavee-Geo, Heikki Karjaluoto
Abstract:
The adoption of information systems (IS) is receiving increasing attention, and its implications have been closely monitored and studied by the IS management community, industry, and professional gatekeepers. Building on previous research on technology adoption, this paper develops and validates an integrated model of the adoption of mobile banking. The model originates from the Technology Acceptance Model (TAM) and the Theory of Planned Behaviour (TPB). The paper offers a preliminary scrutiny of the antecedents of the adoption of mobile banking services in the context of a developing country, with data collected from Pakistan. The findings show that the integrated TAM and TPB model largely explains the intention to adopt mobile banking, and that perceived behavioural control and its antecedents play a significant role in predicting adoption. Theoretical and managerial implications of the findings are presented and discussed.
Keywords: developing country, mobile banking service adoption, technology acceptance model, theory of planned behavior
Procedia PDF Downloads 420
20573 Innovation and Economic Growth Model of East Asian Countries: The Adaptability of the Model in Ethiopia
Authors: Khalid Yousuf Ahmed
Abstract:
During their growth period, the East Asian countries achieved impressive economic growth for decades. They transformed from agricultural economies toward industrialization and underwent dynamic structural transformation. These achievements were driven by government-led development policies that implemented effective innovation policy to boost the technological capability of local firms. Recently, most Sub-Saharan African countries have been showing sustainable growth; exceptionally, Ethiopia has recorded double-digit growth for a decade and has claimed to follow in the footsteps of the East Asian development model. This study examines whether Ethiopia can replicate the innovation and economic growth model of East Asia, using Japan, Taiwan, South Korea, and China as cases to illustrate that model of growth. The research is based on empirical data gathering and an extended theory of national innovation systems and economic growth. The methodology draws on the Knowledge Assessment Methodology (KAM) and employs cross-country regression analysis. The results show a significant relationship between innovation indicators and economic growth in the East Asian countries, while no such relationship exists for Ethiopia despite its implementing similar policies and achieving a similar growth trend. Therefore, Ethiopia needs to introduce inclusive policies that prioritize improving human capital and invest in a knowledge-based economy in order to replicate the East Asian model.
Keywords: economic growth, FDI, endogenous growth theory, East Asia model
Procedia PDF Downloads 275
20572 A Periodogram-Based Spectral Method Approach: The Relationship between Tourism and Economic Growth in Turkey
Authors: Mesut BALIBEY, Serpil TÜRKYILMAZ
Abstract:
A popular topic in econometrics and time series analysis is the cointegrating relationships among the components of a nonstationary time series. Engle and Granger's least squares method and Johansen's conditional maximum likelihood method are the most widely used methods for determining the relationships among variables. Furthermore, a method proposed for testing a unit root based on the periodogram ordinates has certain advantages over conventional tests: periodograms can be calculated without any model specification, and the exact distribution under the assumption of a unit root is obtained. For higher-order processes the distribution remains the same asymptotically. In this study, to demonstrate these advantages of periodograms over conventional tests, we examine a possible relationship between tourism and economic growth in Turkey during the period 1999:01-2010:12 by using the periodogram method, Johansen's conditional maximum likelihood method, and Engle and Granger's ordinary least squares method.
Keywords: cointegration, economic growth, periodogram ordinate, tourism
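Since the test rests on periodogram ordinates, which need no model specification, the basic computation can be sketched directly from the definition; the sinusoidal test series below is illustrative only.

```python
import cmath
import math

def periodogram(x):
    """Periodogram ordinates I(w_j) = |sum_t x_t e^{-i w_j t}|^2 / (2 pi n)
    at the Fourier frequencies w_j = 2 pi j / n, for j = 1..n//2."""
    n = len(x)
    ords = []
    for j in range(1, n // 2 + 1):
        w = 2 * math.pi * j / n
        s = sum(x[t] * cmath.exp(-1j * w * t) for t in range(n))
        ords.append(abs(s) ** 2 / (2 * math.pi * n))
    return ords

# A pure sinusoid with 8 cycles over n = 128 points peaks at ordinate j = 8.
n = 128
x = [math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
I = periodogram(x)
peak_j = I.index(max(I)) + 1
print(peak_j)
```

A unit-root test built on these ordinates then examines the behavior of I(w_j) at the lowest frequencies, where a nonstationary series concentrates its power.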
Procedia PDF Downloads 270
20571 Clinical Prediction Score for Ruptured Appendicitis In ED
Authors: Thidathit Prachanukool, Chaiyaporn Yuksen, Welawat Tienpratarn, Sorravit Savatmongkorngul, Panvilai Tangkulpanich, Chetsadakon Jenpanitpong, Yuranan Phootothum, Malivan Phontabtim, Promphet Nuanprom
Abstract:
Background: Ruptured appendicitis has high morbidity and mortality and requires immediate surgery. The Alvarado Score is used as a tool to predict the risk of acute appendicitis, but there is no such score for predicting rupture. This study aimed to develop a prediction score to determine the likelihood of ruptured appendicitis in an Asian population. Methods: This was a diagnostic, retrospective, cross-sectional, exploratory-model study conducted at the Emergency Medicine Department of Ramathibodi Hospital between March 2016 and March 2018. The inclusion criteria were age >15 years and an available pathology report after appendectomy. Clinical factors included gender, age >60 years, right lower quadrant pain, migratory pain, nausea and/or vomiting, diarrhea, anorexia, fever >37.3°C, rebound tenderness, guarding, white blood cell count, polymorphonuclear white blood cells (PMN) >75%, and the duration of pain before presentation. The predictive model and prediction score for ruptured appendicitis were developed by multivariable logistic regression analysis. Results: During the study period, 480 patients met the inclusion criteria; of these, 77 (16%) had ruptured appendicitis. Five factors were independently predictive of rupture: age >60 years, fever >37.3°C, guarding, PMN >75%, and duration of pain >24 hours before presentation. A score >6 increased the likelihood ratio of ruptured appendicitis by 3.88 times. Conclusion: Using the Ramathibodi Welawat Ruptured Appendicitis Score (RAMA WeRA Score) developed in this study, a score >6 was associated with ruptured appendicitis.
Keywords: predictive model, risk score, ruptured appendicitis, emergency room
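The scoring logic can be sketched as follows. The abstract gives the five predictors and the cut-off but not the per-factor points, so the weights in this sketch are hypothetical placeholders; only the factor list and the score > 6 rule come from the text.

```python
# Hypothetical point weights for the five predictors; the real RAMA WeRA
# weights would come from the logistic regression coefficients, not shown here.
WEIGHTS = {
    "age_over_60": 2,
    "fever_over_37_3": 2,
    "guarding": 2,
    "pmn_over_75": 2,
    "pain_over_24h": 2,
}

def rama_wera_score(findings):
    """Sum the points for the positive findings (a dict of factor -> bool)."""
    return sum(WEIGHTS[f] for f, present in findings.items() if present)

def high_risk_of_rupture(findings, cutoff=6):
    """Per the abstract, a score > 6 raises the likelihood ratio ~3.88x."""
    return rama_wera_score(findings) > cutoff

patient = {"age_over_60": True, "fever_over_37_3": True, "guarding": True,
           "pmn_over_75": True, "pain_over_24h": False}
print(rama_wera_score(patient), high_risk_of_rupture(patient))
```

With four of five hypothetical two-point factors present, this patient scores 8 and crosses the cut-off.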
Procedia PDF Downloads 166
20570 Classical and Bayesian Inference of the Generalized Log-Logistic Distribution with Applications to Survival Data
Authors: Abdisalam Hassan Muse, Samuel Mwalili, Oscar Ngesa
Abstract:
A generalized log-logistic distribution with variable hazard rate shapes is introduced and studied, extending the log-logistic distribution by adding an extra parameter to the classical distribution and thereby allowing greater flexibility in analysing and modeling various data types. The proposed distribution contains a large number of well-known lifetime distributions as special sub-models, such as the Weibull, log-logistic, exponential, and Burr XII distributions. Its basic mathematical and statistical properties are derived. The method of maximum likelihood is adopted for estimating the unknown parameters of the proposed distribution, and a Monte Carlo simulation study is carried out to assess the behavior of the estimators. The importance of this distribution lies in its ability to model both monotone (increasing and decreasing) and non-monotone (unimodal and bathtub-shaped, or reversed bathtub-shaped) hazard rate functions, which are quite common in survival and reliability data analysis. Furthermore, the flexibility and usefulness of the proposed distribution are illustrated on a real-life data set and compared to its sub-models (the Weibull, log-logistic, and Burr XII distributions) and to other three-parameter parametric survival distributions, such as the exponentiated Weibull, three-parameter lognormal, three-parameter gamma, three-parameter Weibull, and three-parameter log-logistic (also known as shifted log-logistic) distributions. The proposed distribution provided a better fit than all of the competing distributions based on goodness-of-fit tests, the log-likelihood, and information criterion values. Finally, a Bayesian analysis is carried out and the performance of Gibbs sampling on the data set is assessed.
Keywords: hazard rate function, log-logistic distribution, maximum likelihood estimation, generalized log-logistic distribution, survival data, Monte Carlo simulation
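As a small sanity check on the classical sub-model (a sketch of the log-logistic baseline, not the proposed generalized distribution), log-logistic variates can be drawn by inverting the CDF F(x) = 1 / (1 + (x/alpha)^(-beta)), and the sample median should recover the scale parameter alpha, which equals the distribution's median. The seed and parameter values are illustrative.

```python
import random

def rlog_logistic(alpha, beta, n, rng):
    """Draw n log-logistic variates by inverse-CDF sampling:
    x = alpha * (u / (1 - u))**(1/beta) for u ~ Uniform(0, 1)."""
    return [alpha * (u / (1 - u)) ** (1 / beta)
            for u in (rng.random() for _ in range(n))]

rng = random.Random(0)
alpha, beta = 2.0, 3.0
sample = sorted(rlog_logistic(alpha, beta, 10001, rng))
median = sample[len(sample) // 2]  # the log-logistic median equals alpha
print(median)
```

The same inverse-CDF device underlies Monte Carlo simulation studies of the kind the abstract describes for assessing estimator behavior.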
Procedia PDF Downloads 202
20569 Internet of Things Professional Construction Building through the School-Enterprise Cooperation
Authors: Jumin Zhao, Na Li, Dengao Li, Yujuan Yan
Abstract:
With the rapid rise of the Internet of Things (IoT) industry, the shortage of IoT talent has greatly stimulated many colleges to speed up the pace of professional curriculum reform. The original specialty construction suffers from many problems, such as a vaguely defined specialty, mixed theory, poor practical ability, and divergent goals. To solve these issues, we build a 'theory-practice-theory-improvement' four-step model of school-enterprise integration for personnel training. In addition, we integrate advanced teaching approaches, such as the flipped classroom and MOOCs, making IoT teaching more professional and students' abilities more comprehensive.
Keywords: IoT, theory-practice-theory-promotion, major construction, school-enterprise cooperation
Procedia PDF Downloads 381
20568 Structure Function and Violation of Scale Invariance in NCSM: Theory and Numerical Analysis
Authors: M. R. Bekli, N. Mebarki, I. Chadou
Abstract:
In this study, we focus on the structure functions and the violation of scale invariance in the context of the non-commutative standard model (NCSM). We find that this violation appears at first order in perturbation theory, and a non-commutative version of the DGLAP evolution equation is deduced. Numerical analysis and comparison with experimental data impose a new bound on the non-commutative parameter.
Keywords: NCSM, structure function, DGLAP equation, standard model
Procedia PDF Downloads 611
20567 Reliability Assessment and Failure Detection in a Complex Human-Machine System Using Agent-Based and Human Decision-Making Modeling
Authors: Sanjal Gavande, Thomas Mazzuchi, Shahram Sarkani
Abstract:
In a complex aerospace operational environment, identifying failures in a procedure involving multiple human-machine interactions is difficult. Such failures could lead to accidents causing loss of hardware or human life. The likelihood of failure increases further if operational procedures are tested on a novel system with multiple human-machine interfaces and no prior performance data. The existing approach in the literature of reviewing complex operational tasks in flowchart or tabular form provides no insight into potential system failures due to human decision-making ability. To address these challenges, this research explores an agent-based simulation approach for reliability assessment and fault detection in complex human-machine systems that incorporates a human decision-making model. The simulation predicts the emergent behavior of the system arising from the interaction between humans, with their decision-making capability, and the varying states of the machine, and vice versa. Overall system reliability is evaluated based on a defined set of success criteria and the number of recorded failures over an assigned number of Monte Carlo runs. The study also aims at identifying high-likelihood failure locations in the system. The research concludes that system reliability and failures can be effectively calculated when the individual human and machine agent states are clearly defined. This research is limited to the operations phase of a system lifecycle process in an aerospace environment. Further exploration of the proposed agent-based and human decision-making model will be required to allow a greater understanding of this topic for applications outside the operations domain.
Keywords: agent-based model, complex human-machine system, human decision-making model, system reliability assessment
Procedia PDF Downloads 169
20566 The Beta-Fisher Snedecor Distribution with Applications to Cancer Remission Data
Authors: K. A. Adepoju, O. I. Shittu, A. U. Chukwu
Abstract:
In this paper, a new four-parameter generalized version of the Fisher-Snedecor distribution, called the beta-F distribution, is introduced. A comprehensive account of the statistical properties of the new distribution is given. Formal expressions for the cumulative distribution function, moments, moment generating function, and maximum likelihood estimators, as well as the Fisher information, are obtained. The flexibility of this distribution, as well as its robustness, is demonstrated using cancer remission time data. The new distribution can be used in most applications where the assumptions underlying the use of other lifetime distributions are violated.
Keywords: Fisher-Snedecor distribution, beta-F distribution, outlier, maximum likelihood method
Procedia PDF Downloads 348
20565 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics
Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo
Abstract:
Communication signal modulation recognition is one of the key technologies in the field of modern information warfare. At present, automatic modulation recognition methods for communication signals fall into two major categories: maximum likelihood hypothesis testing based on decision theory, and statistical pattern recognition based on feature extraction. The statistical pattern recognition approach, comprising feature extraction and classifier design, is now the most commonly used. With the increasingly complex electromagnetic environment of communications, how to effectively extract the features of various signals at low signal-to-noise ratio (SNR) is a hot topic for scholars in many countries. To solve this problem, this paper proposes a feature extraction algorithm for communication signals based on an improved Holder cloud feature, and the extreme learning machine (ELM) is used to classify the extracted features, addressing the real-time requirements of modern warfare. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment and uses the improved cloud model to obtain more stable Holder cloud features, improving the performance of the algorithm. The algorithm addresses the problem that a simple feature extraction algorithm based on the Holder coefficient is difficult to use for recognition at low SNR, and it also achieves better recognition accuracy. Simulation results show that the approach still classifies well at low SNR: even at an SNR of -15 dB, the recognition accuracy reaches 76%.
Keywords: communication signal, feature extraction, Holder coefficient, improved cloud model
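The underlying Holder coefficient feature (before the cloud-model improvement, which the abstract does not specify in detail) can be sketched from Holder's inequality: for nonnegative sequences a and b, H = sum(a_i*b_i) / ((sum a_i^p)^(1/p) * (sum b_i^q)^(1/q)) with 1/p + 1/q = 1. The example sequences below are illustrative, not real signal envelopes.

```python
def holder_coefficient(a, b, p=2.0):
    """Holder coefficient of two nonnegative sequences; p and its conjugate
    q satisfy 1/p + 1/q = 1. For p = q = 2 this reduces to cosine similarity."""
    q = p / (p - 1.0)
    num = sum(x * y for x, y in zip(a, b))
    den = (sum(x ** p for x in a) ** (1 / p)
           * sum(y ** q for y in b) ** (1 / q))
    return num / den

sig = [0.1, 0.4, 0.9, 0.4, 0.1]
scaled = [3.0 * v for v in sig]
h_same = holder_coefficient(sig, scaled)        # proportional sequences -> 1
h_diff = holder_coefficient(sig, [0.9, 0.4, 0.1, 0.4, 0.9])
print(h_same, h_diff)
```

The coefficient's scale invariance and boundedness are what make it usable as a modulation feature; the paper's contribution replaces the raw coefficient with a more noise-stable cloud-model version.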
Procedia PDF Downloads 15720564 A Fuzzy Structural Equation Model for Development of a Safety Performance Index Assessment Tool in Construction Sites
Authors: Murat Gunduz, Mustafa Ozdemir
Abstract:
In this research, a framework is proposed to model safety performance in construction sites. Determinants of safety performance are defined through an extensive literature review, and a multidimensional safety performance model is developed. In this context, a questionnaire is administered to construction companies with active sites. The data collected through the questionnaires, including linguistic terms, are then defuzzified into concrete numbers using fuzzy set theory, which provides strong and significant instruments for measuring ambiguity and makes it possible to meaningfully represent concepts expressed in natural language. The validity of the proposed safety performance model and the relationships between the determinants of safety performance are analyzed using structural equation modeling (SEM), a powerful multivariate analysis technique that makes possible the evaluation of latent structures. After validation of the model, a software-based safety performance index assessment tool is proposed, built on the empirically validated theoretical model.Keywords: fuzzy set theory, safety performance assessment, safety index, structural equation modeling (SEM), construction sites
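The defuzzification step described above can be illustrated with triangular fuzzy numbers. This is a generic sketch assuming a hypothetical five-point linguistic scale and centroid defuzzification; the actual scale and defuzzification method used in the study may differ.

```python
# Map linguistic survey answers to triangular fuzzy numbers (a, b, c)
# and defuzzify by the centroid of a triangle, (a + b + c) / 3.
LINGUISTIC_SCALE = {          # hypothetical 5-point scale on [0, 1]
    "very low":  (0.0, 0.0, 0.25),
    "low":       (0.0, 0.25, 0.5),
    "medium":    (0.25, 0.5, 0.75),
    "high":      (0.5, 0.75, 1.0),
    "very high": (0.75, 1.0, 1.0),
}

def defuzzify_centroid(tfn):
    """Crisp value of a triangular fuzzy number via its centroid."""
    a, b, c = tfn
    return (a + b + c) / 3.0

# Example: one respondent's answers on four safety items, averaged
# into a crisp score usable as an observed variable in the SEM.
responses = ["high", "medium", "very high", "high"]
crisp = [defuzzify_centroid(LINGUISTIC_SCALE[r]) for r in responses]
score = sum(crisp) / len(crisp)
```

The crisp scores produced this way feed into the structural equation model as ordinary numeric indicators, which is what allows linguistic questionnaire data to be analyzed with SEM.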
Procedia PDF Downloads 52620563 Lie Symmetry of a Nonlinear System Characterizing Endemic Malaria
Authors: Maba Boniface Matadi
Abstract:
This paper analyses a model of endemic malaria from the point of view of the group-theoretic approach. The study identified new independent variables that lead to the transformation of the nonlinear model. Furthermore, the corresponding determining equations were constructed, and new symmetries were found. As a result, the findings of the study demonstrate the integrability of the model and present an invariant solution for the malaria model.Keywords: group theory, Lie symmetry, invariant solutions, malaria
Procedia PDF Downloads 11020562 Parameter Estimation of Gumbel Distribution with Maximum-Likelihood Based on Broyden Fletcher Goldfarb Shanno Quasi-Newton
Authors: Dewi Retno Sari Saputro, Purnami Widyaningsih, Hendrika Handayani
Abstract:
Extreme values in a set of observations can occur due to unusual circumstances. Such data can provide important information that cannot be provided by other data, so their existence needs to be investigated further. One method for obtaining extreme data is the block maxima method. The distribution of extreme data sets taken with the block maxima method is called the extreme value distribution, which here is the Gumbel distribution with two parameters. The maximum likelihood (ML) estimates of the Gumbel parameters cannot be determined in closed form, so a numerical approximation is necessary. The purpose of this study was to estimate the parameters of the Gumbel distribution with the quasi-Newton BFGS method. The quasi-Newton BFGS method is a numerical method for unconstrained nonlinear optimization, so it can be used for parameter estimation of the Gumbel distribution, whose distribution function has a double exponential form. The quasi-Newton BFGS method is a development of Newton's method. Newton's method uses the second derivative to calculate the change in the parameter values at each iteration; it is then modified with the addition of a step length to guarantee convergence when the second derivative requires complex calculations. In the quasi-Newton BFGS method, Newton's method is modified by updating an approximation to the second derivative from first derivatives at each iteration. Parameter estimation of the Gumbel distribution by this numerical approach is carried out by finding the parameter values that maximize the likelihood function; this requires the gradient vector and the Hessian matrix. This research combines theory and application, drawing on several journals and textbooks. The results of this study are the quasi-Newton BFGS algorithm and the parameter estimates of the Gumbel distribution.
The estimation method is then applied to daily rainfall data in Purworejo District to estimate the distribution parameters. The estimates indicate that the intensity of the heavy rainfall occurring in Purworejo District has decreased and that the range of rainfall occurring there has narrowed.Keywords: parameter estimation, Gumbel distribution, maximum likelihood, Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton
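A minimal sketch of the estimation described above, assuming SciPy's BFGS implementation in place of a hand-coded one; the log-parameterization of the scale is my own device to keep it positive, and the simulated data stand in for the rainfall series.

```python
import numpy as np
from scipy.optimize import minimize

def gumbel_neg_log_lik(params, x):
    """Negative log-likelihood of the Gumbel (max) distribution."""
    mu, log_beta = params
    beta = np.exp(log_beta)          # scale > 0 via log-parameterization
    z = (x - mu) / beta
    return x.size * np.log(beta) + np.sum(z + np.exp(-z))

rng = np.random.default_rng(42)
data = rng.gumbel(loc=3.0, scale=2.0, size=5000)

# BFGS needs only the gradient (approximated numerically here) and
# builds up its own approximation to the Hessian across iterations.
res = minimize(gumbel_neg_log_lik, x0=np.array([data.mean(), 0.0]),
               args=(data,), method="BFGS")
mu_hat, beta_hat = res.x[0], np.exp(res.x[1])
```

With 5000 simulated observations from Gumbel(3, 2), the BFGS estimates land close to the true location and scale, which is the behavior the paper's algorithm relies on for the rainfall application.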
Procedia PDF Downloads 32620561 A Bayesian Model with Improved Prior in Extreme Value Problems
Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro
Abstract:
In extreme value theory, inference for the parameters of the distribution is made using only a small part of the observed values. When block maxima are taken, much of the data is discarded. We developed a new Bayesian inference model to exploit all the information provided by the data, introducing informative priors and using the relations between the baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme-value distribution: exponential, normal and Gumbel. Secondly, we considered mixtures of normal variables, to simulate practical situations in which data do not fit pure distributions because of perturbations (noise).Keywords: Bayesian inference, extreme value theory, Gumbel distribution, highly informative prior
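The baseline-to-limit relation the abstract relies on can be checked empirically. For an Exponential(lambda) baseline, block maxima of block size n are approximately Gumbel with location ln(n)/lambda and scale 1/lambda; the sketch below (with arbitrary lambda and block size) verifies this relation, and is an illustration of it rather than the authors' prior construction.

```python
import math
import numpy as np

rng = np.random.default_rng(7)
lam, block_size, n_blocks = 1.5, 100, 2000   # hypothetical settings

# Draw the Exponential(lam) baseline sample and take block maxima.
sample = rng.exponential(scale=1.0 / lam, size=(n_blocks, block_size))
maxima = sample.max(axis=1)

# Gumbel limit parameters implied by the Exponential baseline.
mu = math.log(block_size) / lam      # location
beta = 1.0 / lam                     # scale

# Mean of Gumbel(mu, beta) is mu + gamma * beta (Euler-Mascheroni gamma).
euler_gamma = 0.5772156649
expected_mean = mu + euler_gamma * beta
observed_mean = maxima.mean()
```

The sample mean of the block maxima matches the mean implied by the limit parameters, which is exactly the kind of baseline-limit link that allows an informative prior on the Gumbel parameters to be built from knowledge of the baseline.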
Procedia PDF Downloads 19920560 Best Responses for the Dynamic Model of Hotel Room Rate
Authors: Xuan Tran
Abstract:
The purpose of this paper is to present a comprehensive dynamic model of pricing strategies in hotel competition, seeking a win-win situation for the competitive set. Utilizing the Cobb-Douglas utility model, the study establishes room rates by analyzing the price elasticity of demand across a competitive set of four hotels, with a focus on occupancy rates. To further enhance the analysis, game theory is applied to identify the best response of each competing party, which illustrates the optimal pricing strategy for each hotel in the competitive landscape. This approach offers valuable insights into how hotels can strategically adjust their room rates in response to market conditions and competitor actions. The primary contributions of this research are as follows: (1) advantages for both individual hotels and the broader competitive hotel market, (2) benefits for hotel management overseeing multiple brands, and (3) positive impacts on the local community.Keywords: dynamic model, game theory, best response, Cobb-Douglas
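The best-response logic can be illustrated with a deliberately simplified model: two hotels rather than four, a hypothetical linear demand in place of the paper's Cobb-Douglas specification, and iterated best responses until prices converge to the Nash equilibrium. All numbers are illustrative.

```python
# Hypothetical linear demand: q_i = a - b*p_i + c*p_j  (c > 0: substitutes).
# Profit (p_i - cost) * q_i is maximized at the best response
#   BR(p_j) = (a + c*p_j + b*cost) / (2*b),
# obtained by setting the derivative of profit with respect to p_i to zero.
a, b, c, cost = 100.0, 2.0, 1.0, 10.0

def best_response(p_other):
    return (a + c * p_other + b * cost) / (2.0 * b)

# Iterate simultaneous best responses from an arbitrary starting price;
# since the slope c / (2*b) < 1, the iteration contracts to equilibrium.
p1 = p2 = 50.0
for _ in range(200):
    p1, p2 = best_response(p2), best_response(p1)
```

With these numbers the iteration converges to the symmetric equilibrium price of 40 for both hotels; the paper's contribution is doing the analogous computation with Cobb-Douglas demand calibrated to a real four-hotel competitive set.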
Procedia PDF Downloads 2420559 A Constitutive Model for Time-Dependent Behavior of Clay
Authors: T. N. Mac, B. Shahbodaghkhan, N. Khalili
Abstract:
A new elastic-viscoplastic (EVP) constitutive model is proposed for the analysis of the time-dependent behavior of clay. The proposed model is based on bounding surface plasticity and the viscoplastic consistency framework, establishing a continuous transition from plasticity to rate-dependent viscoplasticity. Unlike overstress-based models, this model meets the consistency condition in formulating the constitutive equation of the EVP model. The procedure for deriving the constitutive relationship is also presented. Simulation results and comparisons with experimental data are then presented to demonstrate the performance of the model.Keywords: bounding surface, consistency theory, constitutive model, viscosity
Procedia PDF Downloads 49320558 Extending the Theory of Planned Behaviour to Predict Intention to Commute by Bicycle: Case Study of Mexico City
Authors: Magda Cepeda, Frances Hodgson, Ann Jopson
Abstract:
People face various barriers when choosing to cycle for commuting purposes. This study examined the role of psycho-social factors in predicting the intention to cycle to commute in Mexico City. An extended version of the theory of planned behaviour was developed and applied to a simple random sample of 401 road users. We applied exploratory and confirmatory factor analysis, and after identifying five factors, a structural equation model was estimated to find the relationships among the variables. The results indicated that cycling attributes, attitudes to cycling, social comparison, and social image and prestige were the most important factors influencing intention to cycle. Although the results from this study are specific to Mexico City, they indicate areas of interest to transportation planners in other regions, especially cities where the intention to cycle is linked to its perceived image and there is political ambition to instigate positive cycling cultures. Moreover, this study contributes to the current literature on applications of the theory of planned behaviour.Keywords: cycling, latent variable model, perception, theory of planned behaviour
Procedia PDF Downloads 35420557 Factors of Social Network Platform Usage and Privacy Risk: A Unified Theory of Acceptance and Use of Technology2 Model
Abstract:
Users' trust in and use of social network platforms are instrumental factors in a platform's sustainable development. Studying the factors that influence the use of social network platforms is beneficial for developing and maintaining a large user base. This study constructed an extended unified theory of acceptance and use of technology (UTAUT2) model moderated by perceived privacy risk to analyze the factors affecting trust in and use of social network platforms. 444 participants completed our 35-item survey, and we verified the results with a structural equation model. The empirical results reveal the factors that influence trust in and use of social network platforms, and the extended UTAUT2 model with perceived privacy risk increases the applicability of UTAUT2 to social network scenarios. Social networking platforms can increase their usage rates by improving the platform's economics, functionality, entertainment, and privacy security.Keywords: perceived privacy risk, social network, trust, use, UTAUT2 model
Procedia PDF Downloads 9920556 An Estimating Parameter of the Mean in Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods
Authors: Autcha Araveeporn
Abstract:
This paper compares parameter estimation of the mean of a normal distribution by the maximum likelihood (ML), Bayes, and Markov chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data; the Bayes estimator is derived by combining a prior distribution with the data; and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After estimation, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes of 10, 20, 30, and 50. From the results, it can be seen that the ML and MCMC estimates are perceivably different from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed with a prior mean of 1 and prior variance of 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution
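A compact sketch of the three estimators on one simulated data set, using the abstract's true mean 2, variance 4, prior mean 1 and prior variance 12. A known-variance conjugate setting is assumed for illustration; the paper's exact Gibbs scheme may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu, sigma2 = 2.0, 4.0          # true mean and (known) variance
mu0, tau2 = 1.0, 12.0               # prior mean and prior variance
n = 50
x = rng.normal(true_mu, np.sqrt(sigma2), size=n)

# ML estimator: the sample mean.
mle = x.mean()

# Bayes estimator: posterior mean under the conjugate normal prior.
post_prec = 1.0 / tau2 + n / sigma2
post_mean = (mu0 / tau2 + n * mle / sigma2) / post_prec
post_var = 1.0 / post_prec

# MCMC estimator: with the variance known, the Gibbs step reduces to
# drawing the mean from its normal full conditional (the posterior).
draws = rng.normal(post_mean, np.sqrt(post_var), size=5000)
mcmc = draws.mean()
```

At n = 50 the three estimators nearly coincide, since the prior is weak relative to the data; the differences the paper reports emerge at the small sample sizes (10 and 20) with larger variances.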
Procedia PDF Downloads 357