Search results for: elaboration likelihood model theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19966

19876 Early Warning for Financial Stress Events: A Credit-Regime Switching Approach

Authors: Fuchun Li, Hong Xiao

Abstract:

We propose a new early warning model for predicting financial stress events at a given future time. In this model, we examine whether credit conditions play an important role as a nonlinear propagator of shocks when predicting the likelihood that a financial stress event occurs at a given future time. This propagation takes the form of a threshold regression in which a regime change occurs if credit conditions cross a critical threshold. Given the new early warning model, we evaluate its performance against currently available alternatives, namely the signal extraction model and the linear regression model. In-sample forecasting results indicate that all three types of models are useful tools for predicting financial stress events, although none outperforms the others across all criteria considered. The out-of-sample forecasting results suggest that the credit-regime switching model performs better than the other two across all criteria and all forecasting horizons considered.
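The regime-switching mechanism described in the abstract can be sketched as a toy threshold regression; the variable names, coefficient values, and the logistic link below are illustrative assumptions, not the paper's specification:

```python
import math

def stress_probability(x, credit, threshold=0.0,
                       beta_tight=(0.5, 1.8), beta_normal=(-1.5, 0.6)):
    """Toy credit-regime-switching early-warning model.

    x:      macro-financial predictor
    credit: credit-conditions index; crossing `threshold` switches regimes
    Each regime has its own (intercept, slope); a logistic link maps the
    linear index to the probability of a financial stress event.
    All coefficient values are illustrative, not estimated.
    """
    b0, b1 = beta_tight if credit > threshold else beta_normal
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# The same shock x = 1.0 propagates differently across credit regimes:
p_tight = stress_probability(1.0, credit=0.5)    # tight-credit regime
p_normal = stress_probability(1.0, credit=-0.5)  # normal regime
```

The point of the threshold structure is visible here: an identical shock produces a much higher predicted stress probability once credit conditions have crossed the critical level.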

Keywords: cut-off probability, early warning model, financial crisis, financial stress, regime-switching model, forecasting horizons

Procedia PDF Downloads 412
19875 Parameter Estimation for the Mixture of Generalized Gamma Model

Authors: Wikanda Phaphan

Abstract:

The mixture generalized gamma distribution is a combination of two distributions: the generalized gamma distribution and the length-biased generalized gamma distribution, both presented by Suksaengrakcharoen and Bodhisuwan in 2014. Its probability density function (pdf) is fairly complex, which complicates parameter estimation: the estimators cannot be written in closed form, so numerical estimation is required. In this study, we present a new method of parameter estimation using the expectation-maximization (EM) algorithm, the conjugate gradient method, and the quasi-Newton method. The data were generated by the acceptance-rejection method and used for estimating α, β, λ, and p, where λ is the scale parameter, p is the weight parameter, and α and β are the shape parameters. A Monte Carlo study assesses the estimators' performance for sample sizes of 10, 30, and 100, with the simulations repeated 20 times in each case. The effectiveness of the estimators is evaluated by their mean squared errors and bias. The findings reveal that the EM algorithm comes closest to the actual parameter values, and that the maximum likelihood estimators obtained via the conjugate gradient and quasi-Newton methods are less precise than those obtained via the EM algorithm.
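The acceptance-rejection step used to generate the data can be sketched generically as follows; the target density here is a Beta(2, 2) purely for illustration (the paper's target is the mixture generalized gamma pdf, which is omitted here):

```python
import random

def accept_reject(pdf, lo, hi, pdf_max, n, seed=42):
    """Sample n points from `pdf` on [lo, hi] by acceptance-rejection:
    propose x ~ Uniform(lo, hi), accept with probability pdf(x) / pdf_max,
    where pdf_max bounds the density on the interval."""
    rng = random.Random(seed)
    out = []
    while len(out) < n:
        x = rng.uniform(lo, hi)
        if rng.uniform(0.0, pdf_max) <= pdf(x):
            out.append(x)
    return out

# Illustration: Beta(2, 2) density 6x(1-x) on [0, 1], whose maximum is 1.5.
sample = accept_reject(lambda x: 6 * x * (1 - x), 0.0, 1.0, 1.5, 5000)
mean = sum(sample) / len(sample)   # should be close to the true mean 0.5
```

The same routine works for any bounded density on a finite interval; only `pdf`, the interval, and the bound `pdf_max` change.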

Keywords: conjugate gradient method, quasi-Newton method, EM-algorithm, generalized gamma distribution, length biased generalized gamma distribution, maximum likelihood method

Procedia PDF Downloads 198
19874 Developing HRCT Criterion to Predict the Risk of Pulmonary Tuberculosis

Authors: Vandna Raghuvanshi, Vikrant Thakur, Anupam Jhobta

Abstract:

Objective: To design an HRCT criterion to forecast the risk of pulmonary tuberculosis (PTB). Material and methods: This was a prospective study of 69 patients with clinical suspicion of pulmonary tuberculosis. We studied their medical characteristics, numerous separate HRCT findings, and combinations of HRCT findings to predict the risk of PTB using univariate and multivariate analysis. Provisional HRCT diagnostic criteria were designed on the basis of these outcomes to assess the risk of PTB and were tested on our patients. Results: The HRCT chest results were analyzed and ranked from 1 to 4: Rank 1, highly suspected PTB; Rank 2, probable PTB; Rank 3, nonspecific or difficult to differentiate from other diseases; Rank 4, other suspected diseases. Sensitivity, specificity, positive predictive value, and negative predictive value were calculated. Rank 1 (highly suspected TB) was present in 22 (31.9%) patients, all of whom were finally diagnosed with pulmonary tuberculosis; the sensitivity, specificity, and negative likelihood ratio for Rank 1 on HRCT chest were 53.6%, 100%, and 0.43, respectively. Rank 2 (probable TB) was present in 13 patients, of whom 12 were tubercular and 1 was non-tubercular. The sensitivity, specificity, positive likelihood ratio, and negative likelihood ratio of the combination of Rank 1 and Rank 2 were 82.9%, 96.4%, 23.22, and 0.18, respectively. Rank 3 (nonspecific TB) was present in 25 patients, of whom 7 were tubercular and 18 were non-tubercular. When all three ranks were considered together, the sensitivity approached 100%, but the specificity fell to 35.7%; the positive and negative likelihood ratios were 1.56 and 0, respectively. Rank 4 (other specific findings) was assigned to 9 patients, all of whom were non-tubercular. Conclusion: HRCT is useful in selecting individuals with a greater likelihood of pulmonary tuberculosis.
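The reported statistics can be reproduced from the 2x2 counts implied by the abstract. The helper below is my own illustration (not from the paper), shown for the Rank 1 + Rank 2 combination, where 34 of the 41 tubercular patients (22 + 12) and 1 of the 28 non-tubercular patients are flagged positive:

```python
def diagnostics(tp, fp, fn, tn):
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)      # positive likelihood ratio
    lr_neg = (1 - sens) / spec      # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg

# Ranks 1+2 treated as "test positive":
sens, spec, lr_pos, lr_neg = diagnostics(tp=34, fp=1, fn=7, tn=27)
# roughly 0.829, 0.964, 23.2, and 0.18 -- matching the abstract's figures
```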

Keywords: pulmonary, tuberculosis, multivariate, HRCT

Procedia PDF Downloads 140
19873 An Estimating Equation for Survival Data with Possibly Time-Varying Covariates under Semiparametric Transformation Models

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

The estimating equation technique is an alternative to the widely used maximum likelihood methods and eases some of the complexity arising from time-varying covariates. When both time-varying covariates and left truncation are considered in the model, maximum likelihood estimation becomes much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations, which have received considerable attention from researchers, under a semiparametric transformation model. The purpose of this article is to develop modified estimating equations under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators are derived via the expectation-maximization (EM) algorithm. The finite-sample performance of the estimators under the proposed model is illustrated via simulation studies and the Stanford heart transplant data. In summary, the bias due to covariates is adjusted by estimating the density function of the truncation time variable, and the effect of possibly time-varying covariates is then evaluated in some special semiparametric transformation models.
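As a toy illustration of the estimating-equation idea (far simpler than the paper's modified equations, and with no truncation or time-varying covariates), consider right-censored exponential survival data, where the score equation U(λ) = d/λ − T = 0 can be solved in closed form:

```python
def exponential_ee_mle(times, events):
    """Solve the estimating equation U(lam) = sum(events)/lam - sum(times) = 0
    for right-censored exponential survival data. The root is the familiar
    events-per-exposure estimate lam_hat = d / T."""
    d = sum(events)          # number of observed (uncensored) events
    total_time = sum(times)  # total follow-up time
    return d / total_time

# Illustrative data: 1 = event observed, 0 = right-censored.
times = [2.0, 3.5, 1.2, 4.0, 0.8]
events = [1, 0, 1, 1, 0]
lam_hat = exponential_ee_mle(times, events)   # 3 / 11.5
```

In the paper's setting, the analogous equation has no closed-form root and must be solved iteratively inside an EM loop; the example only shows the "set the equation to zero and solve" structure.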

Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time-varying covariates

Procedia PDF Downloads 127
19872 A Review of Existing Turnover Intention Theories

Authors: Pauline E. Ngo-Henha

Abstract:

Existing turnover intention theories are reviewed in this paper. The review was conducted with the help of the search keyword “turnover intention theories” in Google Scholar during July 2017. These theories include: the Theory of Organizational Equilibrium (TOE), Social Exchange Theory, Job Embeddedness Theory, Herzberg’s Two-Factor Theory, the Resource-Based View, Equity Theory, Human Capital Theory, and Expectancy Theory. One limitation of this review is that data were collected only from Google Scholar, where many papers were not freely accessible. Nevertheless, this paper attempts to contribute to the research by clarifying the distinction between theories and models in the context of turnover intention.

Keywords: literature review, theory, turnover, turnover intention

Procedia PDF Downloads 414
19871 Application of Generalized Autoregressive Score Model to Stock Returns

Authors: Katleho Daniel Makatjane, Diteboho Lawrence Xaba, Ntebogang Dinah Moroke

Abstract:

The current study investigates the behaviour of time-varying parameters based on the score function of the predictive model density at time t, where the mechanism for updating the parameters over time is the scaled score of the likelihood function. The results revealed high persistence in the time-varying parameters: the location parameter is high, and the skewness parameter implies a departure of the scale parameter from normality, with an unconditional parameter of 1.5. The results also revealed persistent leptokurtic behaviour in the stock returns, implying that the returns are heavily tailed. Prior to model estimation, the White neural network test showed that the stock price can be modelled by a GAS model. Finally, we propose further research to model the time-varying parameters with a more detailed specification that accommodates the heavy-tailed distribution of the series and computes the risk measures associated with the returns.
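The score-driven update at the core of a GAS model can be sketched for the simplest case, a time-varying variance under a Gaussian density; the study's model uses a richer skewed heavy-tailed density, and the coefficients below are illustrative assumptions:

```python
def gas_volatility(returns, omega=0.05, alpha=0.1, beta=0.85, f0=1.0):
    """GAS(1,1) recursion for a time-varying variance f_t under a Gaussian
    observation density. The driver is the score of log N(y | 0, f_t)
    scaled by the inverse Fisher information, which reduces to
    s_t = y_t**2 - f_t; the update is f_{t+1} = omega + alpha*s_t + beta*f_t."""
    f = f0
    path = []
    for y in returns:
        path.append(f)
        s = y * y - f                      # scaled score
        f = omega + alpha * s + beta * f   # score-driven parameter update
    return path

# A large shock at t = 1 pushes the filtered variance up at t = 2:
path = gas_volatility([0.1, -2.0, 1.5, 0.2, -0.1])
```

With a Gaussian density this recursion coincides with a GARCH-type update; the GAS framework's appeal is that the same score recipe extends directly to skewed and heavy-tailed densities like the one used in the study.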

Keywords: generalized autoregressive score model, South Africa, stock returns, time-varying

Procedia PDF Downloads 477
19870 Principles of Teaching for Successful Intelligence

Authors: Shabnam

Abstract:

The purpose of this study was to examine the importance of successful intelligence in education, which can enhance achievement. A number of studies have tried to apply psychological theories to education, and many emphasize the role of thinking and intelligence. A review of this research shows that many students could learn more effectively than they do if they were taught in a way that better matched their patterns of abilities. Attempts to apply psychological theories to education can falter on the translation of the theory into educational practice; often, this translation is not clear. Therefore, when a program does not succeed, it is not clear whether the failure was due to the inadequacy of the theory or the inadequacy of its implementation. A set of basic principles for translating a theory into practice can help clarify what an educational implementation should (and should not) look like. Sternberg’s theory of successful intelligence, comprising analytical, creative, and practical intelligence, provides a way to create such a match. The results suggest that the theory of successful intelligence supports successful interventions in classrooms and provides a proven model for gifted education. This article presents principles for translating the triarchic theory of successful intelligence into educational practice.

Keywords: successful intelligence, analytical, creative and practical intelligence, achievement, success, resilience

Procedia PDF Downloads 563
19869 Gravitational Frequency Shifts for Photons and Particles

Authors: Jing-Gang Xie

Abstract:

This research considers the integration of quantum field theory and general relativity. Although both are successful models for explaining the behavior of particles, they are incompatible, since they operate at different scales of mass and energy, as evidenced by their conflicting descriptions of black holes and the formation of the universe. Previous efforts to merge the two theories include string theory, quantum gravity models, and others. Aiming toward an actionable experiment, the paper starts from derivations within the existing theories and then tests those derivations by applying the same initial assumptions together with several deviations. The resulting equations reproduce the results of the classical Newtonian model, quantum mechanics, and general relativity under normal conditions. The outcomes differ, however, under extreme conditions: specifically, there are no breakdowns even below the Schwarzschild radius or at the Planck length. This supports the possibility of integrating the two theories.
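For reference, the standard general-relativistic benchmark that such a model must reproduce in the weak-field regime, and whose divergence at the Schwarzschild radius $r_s = 2GM/c^2$ is precisely the breakdown the proposed equations are said to avoid, is the Schwarzschild frequency shift (this is textbook GR, not a result of the paper):

```latex
% Photon emitted at radius r_e, received at infinity (Schwarzschild metric)
\frac{\nu_\infty}{\nu_e} = \sqrt{1 - \frac{r_s}{r_e}}\,, \qquad
z = \left(1 - \frac{r_s}{r_e}\right)^{-1/2} - 1
\;\longrightarrow\; \infty \quad \text{as } r_e \to r_s .
```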

Keywords: general relativity theory, particles, photons, Quantum Gravity Model, gravitational frequency shift

Procedia PDF Downloads 333
19868 The Effect of Culture and Managerial Practices on Organizational Leadership Towards Performance

Authors: Anyia Nduka, Aslan Bin Amad Senin, Ayu Azrin Bte Abdul Aziz

Abstract:

A management practice characterised by a value chain and a relatively flexible culture is replacing the old bureaucratic model of organisational practice that was built on dominance. Using a management-practice paradigm, the study delves into the implications of organisational culture and leadership, developing a theory of leadership called the “cultural model” of organisational leadership, which explains how the shift from bureaucracy to management practices altered the roles and interactions of leaders. This model is well grounded in leadership theory, given the concept’s adaptability to different leadership ideologies. In organisations where operational procedures and boundaries are not clearly defined, hierarchies are flattened, and work collaborations are sometimes based on contracts rather than employment. This cultural model of organisational leadership is intended to be a useful tool for predicting how effectively a leader will perform.

Keywords: leadership, organizational culture, management practices, efficiency

Procedia PDF Downloads 45
19867 Lobbyists’ Competencies as a Basis for Shaping the Positive Image of Modern Lobbying

Authors: Joanna Dzieńdziora

Abstract:

Lobbying is an instrument of influence in various decision-making processes, yet it remains an underestimated research problem, and the lack of research on the competencies of the modern lobbyist is the most crucial gap. The paper attempts to answer the following questions: Who should conduct lobbying activity? What competencies should a lobbyist possess in order to carry out lobbying activities effectively? Answering these questions locates the opportunity to change the image of lobbying within the competencies of the entities that conduct lobbying activities. The aim of the paper is to present a lobbyist competency profile in the framework of the lobbyist’s professional role. The paper presents the essence of lobbying activity and its significance in the modern economy, the areas and scope of lobbying activities, a diagnosis of the modern lobbyist’s competencies, and a lobbyist competency profile focused on the professionalization of lobbying activity. The research tasks allow the lobbyist’s competencies to emerge in a way that permits identifying and elaborating the competency profile, which in turn can improve lobbying activities. The profile’s elaboration is based on an analysis of the author’s research results. Given the shortages in the theory of and research on lobbying activity, this research helps fill a cognitive gap in the theory of the management sciences.

Keywords: competencies, competencies profile, lobbying, lobbyist

Procedia PDF Downloads 122
19866 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data

Authors: Wanhyun Cho, Soonja Kang, Sanggoon Kim, Soonyoung Park

Abstract:

We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method for obtaining the approximate posteriors of the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution over the various parameters and latent functions by using variational Bayesian approximations and importance sampling, and we then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to a synthetic multivariate dataset in order to verify its performance. The experimental results show that our model is more accurate than the other approximation methods.
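The importance sampling ingredient can be shown on the smallest possible example, a Gaussian mean with a conjugate prior, where the marginal likelihood the samples approximate is available in closed form for checking; this toy (my construction, not the paper's model) uses the prior as the proposal:

```python
import math, random

def log_marginal_is(y, n_samples=200_000, seed=0):
    """Estimate the marginal likelihood p(y) = ∫ N(y | mu, 1) N(mu | 0, 1) dmu
    by importance sampling with the prior as the proposal: draw mu from the
    prior and average the likelihood weights N(y | mu, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        mu = rng.gauss(0.0, 1.0)                       # mu ~ prior
        total += math.exp(-0.5 * (y - mu) ** 2) / math.sqrt(2 * math.pi)
    return math.log(total / n_samples)

y = 0.7
est = log_marginal_is(y)
# Closed form for this conjugate toy: p(y) = N(y; 0, variance 2).
exact = math.log(math.exp(-y * y / 4) / math.sqrt(4 * math.pi))
```

In the paper's GP classification setting the integrand is intractable and the proposal comes from the variational approximation, but the estimator has the same average-of-weights form.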

Keywords: multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence

Procedia PDF Downloads 409
19865 Electro-Hydrodynamic Analysis of Low-Pressure DC Glow Discharge by Lattice Boltzmann Method

Authors: Ji-Hyok Kim, Il-Gyong Paek, Yong-Jun Kim

Abstract:

We propose a numerical model based on drift-diffusion theory and the lattice Boltzmann method (LBM) to analyze the electro-hydrodynamic behavior of low-pressure direct current (DC) glow discharge plasmas. We apply drift-diffusion theory to four species and employ the standard lattice Boltzmann model (SLBM) for the electrons, the finite-difference lattice Boltzmann model (FD-LBM) for heavy particles, and the finite-difference method (FDM) for the electric potential, respectively. Our results are compared with those of other methods and emphasize the necessity of a two-dimensional analysis of glow discharge.
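The collide-and-stream cycle at the heart of any lattice Boltzmann scheme can be shown with a deliberately minimal example, a D1Q2 model for pure 1-D diffusion; the paper's SLBM/FD-LBM machinery for charged species is far richer, and this sketch only illustrates the structure:

```python
def lbm_diffusion(rho0, steps, tau=1.0):
    """Minimal D1Q2 lattice Boltzmann solver for 1-D diffusion with periodic
    boundaries: two populations stream one site left/right each step and
    relax (BGK collision) toward the equilibrium f_i^eq = rho/2 with
    relaxation time tau."""
    f_r = [r / 2 for r in rho0]   # right-moving population
    f_l = [r / 2 for r in rho0]   # left-moving population
    for _ in range(steps):
        rho = [a + b for a, b in zip(f_r, f_l)]
        # collision: relax toward equilibrium rho/2
        f_r = [fr + (r / 2 - fr) / tau for fr, r in zip(f_r, rho)]
        f_l = [fl + (r / 2 - fl) / tau for fl, r in zip(f_l, rho)]
        # streaming: shift each population one lattice site (periodic)
        f_r = [f_r[-1]] + f_r[:-1]
        f_l = f_l[1:] + [f_l[0]]
    return [a + b for a, b in zip(f_r, f_l)]

# An initial density pulse spreads out while total mass is conserved:
rho = lbm_diffusion([0.0, 0.0, 0.0, 4.0, 0.0, 0.0, 0.0, 0.0], steps=10)
```

A real drift-diffusion LBM for glow discharge adds a drift term in the equilibrium (from the electric field) and couples each species to the FDM potential solve; the conservation-by-construction property shown here is what makes the method attractive.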

Keywords: glow discharge, lattice Boltzmann method, numerical analysis, plasma simulation, electro-hydrodynamic

Procedia PDF Downloads 55
19864 Application of Queuing Theory in Warehouse Optimization

Authors: Jaroslav Masek, Juraj Camaj, Eva Nedeliakova

Abstract:

The aim of optimizing store management is not only to design the store management itself, including its equipment, technology, and operation. We must also consider synchronizing the technological, transport, storage, and service operations throughout the whole logistics chain, so that a natural flow of material from provider to consumer is achieved by the shortest possible route, in the shortest possible time, in the requested quality and quantity, and with minimum costs. The paper deals with the application of queuing theory to the optimization of warehouse processes. The first part presents general information about warehousing and the use of mathematical methods for optimizing logistics chains. The second part develops a model of a warehouse within queuing theory. The paper concludes with two examples of using queuing theory in practice.
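The simplest queuing building block for such a warehouse model is the M/M/1 queue; the sketch below (my illustration, with made-up rates) treats a single picking station fed by a Poisson stream of orders:

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue: server utilisation rho,
    mean number in system L = rho/(1-rho), and mean time in system
    W = 1/(mu - lam). Little's law L = lam * W ties them together."""
    if lam >= mu:
        raise ValueError("queue is unstable: arrival rate must be < service rate")
    rho = lam / mu                # utilisation of the picking station
    L = rho / (1 - rho)          # mean number of orders in the system
    W = 1 / (mu - lam)           # mean time an order spends in the system
    return rho, L, W

# Illustrative rates: 8 orders/hour arriving, 10 orders/hour service capacity.
rho, L, W = mm1_metrics(lam=8.0, mu=10.0)
```

Even this toy shows the key warehouse trade-off: at 80% utilisation an order already spends, on average, five times its bare service time in the system, which is why synchronizing the operations along the chain matters.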

Keywords: queuing theory, logistics system, mathematical methods, warehouse optimization

Procedia PDF Downloads 560
19863 A New Fuzzy Fractional Order Model of Transmission of Covid-19 With Quarantine Class

Authors: Asma Hanif, A. I. K. Butt, Shabir Ahmad, Rahim Ud Din, Mustafa Inc

Abstract:

This paper is devoted to a study of a fuzzy fractional mathematical model of the transmission dynamics of the infectious disease COVID-19. The proposed dynamical model consists of susceptible, exposed, symptomatic, asymptomatic, quarantined, hospitalized, and recovered compartments. In this study, we deal with a fuzzy fractional model defined in Caputo’s sense. We show the positivity of the state variables, i.e., that all state variables representing the compartments of the model remain positive. Using the Gronwall inequality, we show that the solution of the model is bounded. Using the notion of the next-generation matrix, we find the basic reproduction number of the model. We demonstrate the local and global stability of the equilibrium point by using the approach of Castillo-Chavez and Lyapunov theory with the LaSalle invariance principle, respectively. We present results on the existence and uniqueness of the solution of the considered model via the fixed-point theorems of Schauder and Banach. Using the fuzzy hybrid Laplace method, we obtain an approximate solution of the proposed model. The results are presented graphically via MATLAB-17.
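The next-generation matrix computation of the basic reproduction number can be illustrated on a plain SEIR model (not the paper's seven-compartment fuzzy fractional model): new infections F = [[0, β], [0, 0]] and transitions V = [[σ, 0], [−σ, γ]] give R₀ as the spectral radius of F·V⁻¹, which reduces to β/γ:

```python
def r0_seir(beta, sigma, gamma):
    """Basic reproduction number for a toy SEIR model via the
    next-generation matrix: R0 = spectral radius of F @ inv(V)."""
    det = sigma * gamma
    # inverse of V = [[sigma, 0], [-sigma, gamma]]
    v_inv = [[gamma / det, 0.0],
             [sigma / det, sigma / det]]
    # next-generation matrix K = F @ inv(V), with F = [[0, beta], [0, 0]]
    k = [[beta * v_inv[1][0], beta * v_inv[1][1]],
         [0.0, 0.0]]
    # K has rank one; its eigenvalues are k[0][0] and 0
    return abs(k[0][0])

r0 = r0_seir(beta=0.6, sigma=0.2, gamma=0.3)   # beta/gamma = 2.0
```

The same recipe (linearize the infected subsystem, split into new-infection and transition parts, take the spectral radius of F·V⁻¹) applies to the paper's larger compartment model; only the matrices grow.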

Keywords: Caputo fractional derivative, existence and uniqueness, Gronwall inequality, Lyapunov theory

Procedia PDF Downloads 77
19862 Estimation of Stress-Strength Parameter for Burr Type XII Distribution Based on Progressive Type-II Censoring

Authors: A. M. Abd-Elfattah, M. H. Abu-Moussa

Abstract:

In this paper, we consider the estimation of the stress-strength parameter R = P(Y < X), where the strength X and the stress Y are two independent random variables following the Burr Type XII distribution. The samples of X and Y are progressively Type-II censored. The maximum likelihood estimator (MLE) of R is obtained when the common parameter is unknown. When the common parameter is known, the MLE, the uniformly minimum variance unbiased estimator (UMVUE), and the Bayes estimator of R = P(Y < X) are obtained. The exact confidence interval of R based on the MLE is derived. The performance of the proposed estimators is compared by computer simulation.
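The target quantity R = P(Y < X) can be checked numerically for complete (uncensored) Burr XII samples, since with a known common shape c the value has the closed form R = k_Y / (k_X + k_Y); the sketch below (complete samples only, not the progressive Type-II censoring of the paper) draws via the inverse CDF of F(x) = 1 − (1 + xᶜ)⁻ᵏ:

```python
import random

def burr_sample(c, k, rng):
    """Inverse-CDF draw from Burr XII: F(x) = 1 - (1 + x**c)**(-k)."""
    u = rng.random()
    return ((1.0 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

def stress_strength_mc(c, k_x, k_y, n=100_000, seed=1):
    """Monte Carlo estimate of R = P(Y < X) for independent strength
    X ~ Burr XII(c, k_x) and stress Y ~ Burr XII(c, k_y) sharing the common
    shape c; the known closed form is R = k_y / (k_x + k_y)."""
    rng = random.Random(seed)
    hits = sum(burr_sample(c, k_y, rng) < burr_sample(c, k_x, rng)
               for _ in range(n))
    return hits / n

r_hat = stress_strength_mc(c=2.0, k_x=1.5, k_y=3.0)   # closed form: 2/3
```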

Keywords: Burr Type XII distribution, progressive type-II censoring, stress-strength model, unbiased estimator, maximum-likelihood estimator, uniformly minimum variance unbiased estimator, confidence intervals, Bayes estimator

Procedia PDF Downloads 428
19861 Elastic and Plastic Collision Comparison Using Finite Element Method

Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier

Abstract:

The prediction of post-impact conditions and of the behavior of bodies during impact has been the object of several collision models, generally based on the formulation of Hertz’s theory, which dates from the 19th century. These models treat the repulsive force as proportional to the deformation of the bodies in contact and may also make it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of bodies during impact using the finite element method (FEM) with elastic and plastic material models. The main parameters evaluated are the contact force, the contact time, and the deformation of the bodies. An advantage of the FEM approach is the possibility of including plastic deformation in the model through the material definition: we use the Johnson-Cook plasticity model, whose parameters are obtained through empirical tests on real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-theory-based model.
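The Hertz-theory baseline the FEM results are compared against can be integrated directly: the contact force is F = k·δ^{3/2} in the indentation depth δ, and a simple time-stepping loop yields the peak force and contact time. All parameter values below are illustrative, not taken from the paper:

```python
def hertz_impact(m=1.0, v0=1.0, k=1e4, dt=1e-5):
    """Integrate a Hertzian impact m * a = -k * delta**1.5 (elastic contact,
    force proportional to deformation^(3/2)) with semi-implicit Euler until
    the bodies separate; returns peak contact force and contact duration."""
    delta, v, t = 0.0, v0, 0.0   # indentation, approach velocity, time
    f_max = 0.0
    while True:
        f = k * max(delta, 0.0) ** 1.5      # Hertz contact force
        f_max = max(f_max, f)
        v -= (f / m) * dt                   # decelerate, then rebound
        delta += v * dt
        t += dt
        if delta <= 0.0 and v < 0.0:        # bodies have separated
            return f_max, t

f_max, t_contact = hertz_impact()
```

An FEM model with Johnson-Cook plasticity would dissipate part of the impact energy and leave a permanent indentation, which is exactly the departure from this purely elastic Hertz baseline that the paper quantifies.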

Keywords: collision, impact models, finite element method, Hertz Theory

Procedia PDF Downloads 149
19860 Leadership Strategies in Social Enterprises through Reverse Accountability: Analysis of Social Control for Pragmatic Organizational Design

Authors: Ananya Rajagopal

Abstract:

The study is based on an analysis of qualitative data used to assess the business performance of entrepreneurs in emerging markets, focusing on core variables such as collective leadership in social entrepreneurship and the reverse-accountability attributes of stakeholders. In-depth interviews were conducted with 25 emerging enterprises in Mexico across five industrial segments. The study addresses five major research questions, which helped in developing a grounded theory of reverse accountability. The results reveal that the traditional entrepreneurship model based on an individualistic leadership style is being replaced by a collective leadership model. The study focuses on leadership styles within social enterprises aimed at enhancing managerial capabilities and competencies, stakeholder value, and entrepreneurial growth. The theoretical motivation of this study derives from stakeholder theory and agency theory.

Keywords: reverse accountability, social enterprises, collective leadership, grounded theory, social governance

Procedia PDF Downloads 95
19859 Detection of Change Points in Earthquakes Data: A Bayesian Approach

Authors: F. A. Al-Awadhi, D. Al-Hulail

Abstract:

In this study, we applied a Bayesian hierarchical model to detect single and multiple change points in the daily earthquake body-wave magnitude. Change-point analysis is used in both backward (off-line) and forward (on-line) statistical research; in this study, it is used with the backward approach. Different types of change parameters are considered (mean, variance, or both). The posterior model and the conditional distributions for single and multiple change points are derived and implemented using the BUGS software. The model is applicable to any dataset. The sensitivity of the model is tested using different prior and likelihood functions. Using the Mb data, we conclude that between January 2002 and December 2003, three changes occurred in the mean magnitude of Mb in Kuwait and its vicinity.
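The single-change-point case can be sketched without BUGS: for Gaussian data with known variance and a discrete uniform prior over the change location τ, the posterior over τ is proportional to the segment likelihoods. The sketch below is a simplification of the hierarchical model (it plugs in the segment means rather than integrating them out):

```python
import math, random

def changepoint_posterior(y, sigma=1.0):
    """Posterior over a single change point tau in the mean of Gaussian data
    with known sigma: for each candidate tau, evaluate the profile likelihood
    with the two segment means plugged in, then normalise over tau = 1..n-1."""
    n = len(y)
    logp = []
    for tau in range(1, n):                 # mean changes after observation tau
        m1 = sum(y[:tau]) / tau
        m2 = sum(y[tau:]) / (n - tau)
        ll = (-sum((v - m1) ** 2 for v in y[:tau])
              - sum((v - m2) ** 2 for v in y[tau:])) / (2 * sigma ** 2)
        logp.append(ll)
    mx = max(logp)                          # stabilise before exponentiating
    w = [math.exp(v - mx) for v in logp]
    z = sum(w)
    return [v / z for v in w]

# Synthetic magnitudes with a clear mean shift after observation 20:
rng = random.Random(7)
data = ([rng.gauss(0.0, 1.0) for _ in range(20)]
        + [rng.gauss(5.0, 1.0) for _ in range(20)])
post = changepoint_posterior(data)
tau_hat = post.index(max(post)) + 1         # MAP change point
```

The full hierarchical treatment in the paper additionally places priors on the segment parameters and extends to multiple change points and variance changes, which is where MCMC (BUGS) becomes necessary.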

Keywords: multiple change points, Markov chain Monte Carlo, earthquake magnitude, hierarchical Bayesian model

Procedia PDF Downloads 430
19858 The Economics of Justice as Fairness

Authors: Antonio Abatemarco, Francesca Stroffolini

Abstract:

In the economic literature, Rawls’ Theory of Justice is usually interpreted in a two-stage setting, where priority to the worst-off individual is imposed as a distributive value judgment. In this paper, instead, we model Rawls’ Theory in a three-stage setting, that is, a separating line is drawn between the original position, the educational stage, and working life. Hence, we challenge the common interpretation of Rawls’ Theory of Justice as Fairness by showing that this theory goes well beyond the definition of a distributive value judgment, in such a way as to embrace efficiency issues as well. In our model, inequalities are shown to be permitted insofar as they stimulate greater effort in education in the population, and so economic growth. To our knowledge, this is the only possibility for inequality to be ‘bought’ by both the most-advantaged and, above all, the least-advantaged individual, as suggested by the Difference Principle. Finally, by recalling the old tradition of ‘universal ex-post efficiency’, we show that a unique optimal social contract does not exist behind the veil of ignorance; more precisely, only the set of potentially Rawls-optimal social contracts can be identified a priori, and partial justice orderings derived accordingly.

Keywords: justice, Rawls, inequality, social contract

Procedia PDF Downloads 191
19857 A Systemic Maturity Model

Authors: Emir H. Pernet, Jeimy J. Cano

Abstract:

Maturity models are used descriptively, to explain changes in reality, or normatively, to guide managers in making interventions that render organizations more effective and efficient. They are based on the principles of statistical quality control promulgated by Shewhart in the 1930s and on the principles of PDCA continuous improvement (Plan, Do, Check, Act) developed by Deming and Juran. Frameworks developed around the concept of maturity models include COBIT, CMM, and ITIL. This paper presents some limitations of traditional maturity models, most of them drawn from points of reflection and analysis made by other authors. Almost all of these limitations relate to the mechanistic and reductionist principles on which those models are built. As systems theory aids the understanding of the dynamics of organizations and organizational change, developing a systemic maturity model can help overcome some of those limitations. This document proposes a systemic maturity model, based on a systemic conceptualization of organizations, focused on the functioning of the parts, the relationships among them, and their behavior as a whole. From the systems-theory perspective, maturity is conceptually defined as an emergent property of the organization that arises from the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of an organization, and it is finally validated by measuring maturity in organizations. For its operationalization and validation, the model was applied to measure the maturity of organizational governance, risk, and compliance (GRC) processes.

Keywords: GRC, maturity model, systems theory, viable system model

Procedia PDF Downloads 287
19856 New Estimation in Autoregressive Models with Exponential White Noise by Using Reversible Jump MCMC Algorithm

Authors: Suparman Suparman

Abstract:

White noise in an autoregressive (AR) model is often assumed to be normally distributed, but in applications the white noise often does not follow a normal distribution. This paper aims to estimate the parameters of an AR model with exponential white noise. A Bayesian method is adopted: a prior distribution for the parameters of the AR model is selected and combined with the likelihood function of the data to obtain a posterior distribution, from which a Bayesian estimator for the parameters of the AR model is derived. Because the order of the AR model is itself treated as a parameter, this Bayesian estimator cannot be calculated explicitly. To resolve this problem, the reversible jump Markov chain Monte Carlo (MCMC) method is adopted. As a result, the parameters of the AR model, including its order, can be estimated simultaneously.
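The likelihood that feeds such a sampler differs from the Gaussian case in one key way: exponential innovations are non-negative, so the likelihood is zero whenever any residual is negative. The sketch below (AR(1) only; the paper's reversible jump machinery across orders is not shown) makes that support constraint explicit:

```python
import math

def ar1_exp_loglik(y, phi, lam):
    """Conditional log-likelihood of y_t = phi * y_{t-1} + e_t with
    exponential(lam) white noise: every residual e_t must be non-negative,
    otherwise the likelihood is zero (log-likelihood -inf)."""
    resid = [y[t] - phi * y[t - 1] for t in range(1, len(y))]
    if any(e < 0 for e in resid):
        return float("-inf")       # outside the support of the exponential
    n = len(resid)
    return n * math.log(lam) - lam * sum(resid)

y = [1.0, 1.4, 1.9, 2.1]
ok = ar1_exp_loglik(y, phi=0.5, lam=1.0)    # all residuals >= 0: finite
bad = ar1_exp_loglik(y, phi=2.0, lam=1.0)   # a negative residual: -inf
```

In a reversible jump MCMC run, proposals for (order, coefficients, lam) are accepted or rejected using exactly this kind of likelihood evaluation, with the hard −inf region automatically rejecting proposals that violate the support constraint.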

Keywords: autoregressive (AR) model, exponential white noise, Bayesian, reversible jump Markov chain Monte Carlo (MCMC)

Procedia PDF Downloads 331
19855 A Unified Theory of the Primary Psychological and Social Sciences

Authors: George McMillan

Abstract:

This paper introduces a methodology for creating a baseline equation for the philosophical and social sciences in the behavioral-political-economic-demographic sequence. The two major ideological political-economic philosophies (Hume-Smith and Marx-Engels) are systematized into competing integrated three-dimensional behavioral-political-economic models. The paper argues that Hume-Smith’s empathy-sympathy behavioral assumptions are a sufficient starting point for creating the integrated causal model sought by Tooby and Cosmides. The author then shows that the prerequisite advances in psychology and demographic studies now exist to generate the universal economic theory sought by von Neumann-Morgenstern and the integrated behavioral-economic method of Gintis: a psychological (i.e., behavioral) socio-economic model. By updating Hume-Smith’s work with a modern understanding of psychology, as presented by Fromm and others, a new integrated societal model as postulated by Harsanyi can be created that intertwines the social and psychological sciences. The author argues that this fundamentally psychology-based model can also serve as a baseline equation for all social sciences, as desired by Kant and Mach, as well as the ahistorical (psychological) philosophic model noted by Husserl, Heidegger, Tillich, and Strauss. The author concludes with a discussion of the next steps necessary to generate a detailed model that fuses these disciplines.

Keywords: Unified Social Theory

Procedia PDF Downloads 351
19854 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of an optimization, and this optimization is central to learning theory. One approach to complex systems, in which the dynamics of the system are inferred from a statistical analysis of the fluctuations over time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. First, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Second, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model is examined on both real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of applying QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; perhaps this model will reveal further insight into quantum chaos.
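One of the estimation tools mentioned above, the Kalman filter, can be sketched in its simplest scalar form for a latent AR(1) state observed with noise; all parameter values here are illustrative, and this classical filter is only a building block of the quantum time series machinery, not the QTS model itself:

```python
def kalman_filter_ar1(y, phi=0.9, q=0.1, r=1.0, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a latent AR(1) state x_t = phi*x_{t-1} + w_t
    observed as y_t = x_t + v_t, with w ~ N(0, q) and v ~ N(0, r).
    Returns the filtered state estimates."""
    x, p = x0, p0
    estimates = []
    for obs in y:
        # predict step: propagate state mean and variance
        x = phi * x
        p = phi * phi * p + q
        # update step: blend prediction with the new observation
        k = p / (p + r)            # Kalman gain in (0, 1)
        x = x + k * (obs - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

est = kalman_filter_ar1([1.0, 0.8, 1.2, 0.9, 1.1])
```

Each estimate is a gain-weighted compromise between the model's prediction and the observation, which is exactly the role the filter plays when estimating the unknown parameters of the QTS model.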

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 434
19853 An Empirical Investigation of Mobile Banking Services Adoption in Pakistan

Authors: Aijaz A. Shaikh, Richard Glavee-Geo, Heikki Karjaluoto

Abstract:

The adoption of information systems (IS) is receiving increasing attention, and its implications have been closely monitored and studied by the IS management community, industry, and professional gatekeepers. Building on previous research on technology adoption, this paper develops and validates an integrated model of the adoption of mobile banking. The model originates from the Technology Acceptance Model (TAM) and the Theory of Planned Behaviour (TPB). The paper offers a preliminary scrutiny of the antecedents of the adoption of mobile banking services in the context of a developing country, with data collected in Pakistan. The findings show that the integrated TAM and TPB model largely explains the intention to adopt mobile banking, and that perceived behavioural control and its antecedents play a significant role in predicting adoption. Theoretical and managerial implications of the findings are presented and discussed.

Keywords: developing country, mobile banking service adoption, technology acceptance model, theory of planned behavior

Procedia PDF Downloads 391
19852 Innovation and Economic Growth Model of East Asian Countries: The Adaptability of the Model in Ethiopia

Authors: Khalid Yousuf Ahmed

Abstract:

During their growth period, the East Asian countries achieved impressive economic growth for decades. They transformed from agricultural economies toward industrialization and underwent dynamic structural transformation. These achievements were driven by government-led development policies that implemented effective innovation policy to boost the technological capability of local firms. Recently, most Sub-Saharan African countries have been showing sustainable growth; exceptionally, Ethiopia has been recording double-digit growth for a decade and has claimed to follow the footsteps of the East Asian development model. This study examines whether Ethiopia can replicate the innovation and economic growth model of East Asia, using Japan, Taiwan, South Korea, and China as cases to illustrate that model of growth. The research is based on empirical data gathering and extends the theory of national innovation systems and economic growth theory. The methodology draws on the Knowledge Assessment Methodology (KAM) and employs cross-country regression analysis. The results show a significant relationship between innovation indicators and economic growth in the East Asian countries, whereas no such relationship exists for Ethiopia, despite its implementation of similar policies and its similar growth trend. Therefore, Ethiopia needs to introduce inclusive policies that give priority to improving human capital and invest in a knowledge-based economy to replicate the East Asian model.
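As a hedged illustration (the figures below are invented for demonstration, not drawn from the study's KAM dataset), the kind of cross-country regression the abstract describes can be sketched by regressing average growth on an innovation indicator:

```python
import numpy as np

# hypothetical country-level observations
innovation = np.array([0.2, 0.5, 1.1, 1.8, 2.5, 3.1])  # e.g. patents per 1,000 people
growth     = np.array([2.1, 2.9, 4.0, 5.2, 6.1, 7.0])  # avg. annual GDP growth, %

# ordinary least squares: growth = b0 + b1 * innovation
X = np.column_stack([np.ones_like(innovation), innovation])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(f"intercept={beta[0]:.2f}, slope={beta[1]:.2f}")
```

A significantly positive slope would correspond to the relationship the study reports for the East Asian countries.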

Keywords: economic growth, FDI, endogenous growth theory, East Asia model

Procedia PDF Downloads 229
19851 Frequency Analysis of Minimum Ecological Flow and Gage Height in Indus River Using Maximum Likelihood Estimation

Authors: Tasir Khan, Yejuan Wan, Kalim Ullah

Abstract:

A hydrological frequency analysis has been conducted to estimate the minimum flow elevation of the Indus River in Pakistan in order to protect the ecosystem. The maximum likelihood estimation (MLE) technique is used to identify the best-fitted distribution for minimum ecological flows at nine stations on the Indus River. Four distributions commonly used in hydrological frequency analysis are fitted at all sites: the generalized extreme value (GEV) distribution, the generalized logistic (GLO) distribution, the generalized Pareto (GPA) distribution, and the Pearson type 3 (PE3) distribution. The performance of these distributions is compared using goodness-of-fit tests, namely the Kolmogorov-Smirnov test, the Anderson-Darling test, and the chi-square test. The study concludes that GEV and GPA are the most suitable distributions and can be effectively applied to all the proposed sites. Quantiles are estimated for return periods from 5 to 1000 years using MLE, which is robust for larger sample sizes. The results of these analyses can be used in water resources research, including water quality management, the design of irrigation systems, the determination of downstream flow requirements for hydropower, and the assessment of the impact of long-term drought on the country's aquatic systems.
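A minimal sketch of this workflow (simulated flows, not the Indus gauge records, using scipy's `genextreme` parametrization of the GEV) is: fit by maximum likelihood, check the fit with a Kolmogorov-Smirnov test, then read off quantiles for given return periods; for low-flow analysis the T-year value is commonly taken at non-exceedance probability 1/T.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
flows = stats.genextreme.rvs(c=0.1, loc=300.0, scale=50.0, size=60, random_state=rng)

# maximum likelihood fit of the GEV
c, loc, scale = stats.genextreme.fit(flows)

# goodness of fit: Kolmogorov-Smirnov test against the fitted GEV
ks = stats.kstest(flows, "genextreme", args=(c, loc, scale))

# low-flow quantiles for selected return periods T
for T in (5, 50, 1000):
    q = stats.genextreme.ppf(1 / T, c, loc, scale)
    print(f"T={T:5d} yr  flow={q:8.1f}")
```

The same fit-and-compare loop would be repeated for the GLO, GPA, and PE3 candidates before selecting the best distribution per site.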

Keywords: minimum ecological flow, frequency distribution, indus river, maximum likelihood estimation

Procedia PDF Downloads 55
19850 The Moderating Role of Payment Platform Applications’ Relationship with Increasing Purchase Intention Among Customers in Kuwait - Unified Theory of Acceptance and Sustainable Use of Technology Model

Authors: Ahmad Alsaber

Abstract:

This paper aims to understand the intermediary role of payment platform applications by analyzing the various factors that can influence the desirability of using such payment services in Kuwait, and to determine the effect of different types of payment platforms on the variables of the Unified Theory of Acceptance and Use of Technology (UTAUT) model. The UTAUT model's findings provide an important understanding of the moderating role of payment platform mobile applications. The study explores the influence of payment platform mobile applications on customer purchase intentions in Kuwait through a quantitative survey of 200 local customers, with questions covering their usage of payment platforms, purchase intent, and overall satisfaction; the information gathered is analyzed using descriptive statistics and correlation analysis. The research also considers the variables that Kuwaiti customers weigh when dealing with mobile banking applications. It offers implications for marketers and customer service providers in shaping their strategies and initiatives, as well as recommendations to payment platform providers on how to improve customer satisfaction and security. The results suggest that the likelihood of a purchase is affected by performance expectancy, effort expectancy, social influence, risk, and trust. With the implementation of stronger security measures, progressively more payment platform applications are being used in the Kuwaiti marketplace, their accessibility and usability making them more desirable. As the Kuwaiti digital economy develops, mobile banking is expected to have a greater impact on banking transactions and services in the future.
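As an illustrative sketch (simulated Likert-style responses, not the survey data), the descriptive-statistics and correlation step described above might look like this for one UTAUT construct:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 200  # sample size reported in the abstract

# hypothetical 7-point Likert scores for one construct and the outcome
perf = rng.integers(1, 8, n).astype(float)               # performance expectancy
intention = np.clip(perf + rng.normal(0, 1.5, n), 1, 7)  # purchase intention, loosely tied to it

df = pd.DataFrame({"performance_expectancy": perf,
                   "purchase_intention": intention})
print(df.describe().round(2))   # descriptive statistics
print(df.corr().round(2))       # correlation analysis
```

In the actual study each UTAUT construct (effort expectancy, social influence, risk, trust, and so on) would be a column, and the correlation matrix would indicate which constructs move with purchase intention.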

Keywords: purchase intention, UTAUT, performance expectancy, social influence, risk, trust

Procedia PDF Downloads 65
19849 A Game-Theory-Based Price-Optimization Algorithm for the Simulation of Markets Using Agent-Based Modelling

Authors: Juan Manuel Sanchez-Cartas, Gonzalo Leon

Abstract:

A price competition algorithm for agent-based models (ABMs), based on game-theory principles, is proposed to deal with the simulation of theoretical market models. The algorithm is applied to the classical Hotelling model and to a two-sided market model to show that it leads to the optimal behavior predicted by the theoretical models. Moreover, when theoretical models fail to predict the equilibrium, the algorithm is still capable of reaching a feasible outcome. The results highlight that the algorithm can be implemented in other simulation models to guarantee rational users and endogenous optimal behaviors, and that it can be applied as a verification tool given that it is theoretically grounded.
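A hedged sketch of the underlying idea (using closed-form best responses for Hotelling's linear city with quadratic transport costs, rather than the authors' numerical search): each firm repeatedly plays the profit-maximizing price against the rival's last price, and the iteration converges to the known equilibrium p* = c + t.

```python
c, t = 1.0, 2.0     # marginal cost, transport-cost parameter (assumed values)
p1, p2 = 5.0, 0.5   # arbitrary starting prices

for _ in range(100):
    # best response derived from maximizing (p_i - c) * demand_i
    p1 = (t + p2 + c) / 2
    p2 = (t + p1 + c) / 2

print(round(p1, 3), round(p2, 3))   # both converge to c + t = 3.0
```

Because each best response halves the distance to the fixed point, the iteration contracts rapidly; in an ABM the same loop runs with demand computed from the agents rather than a closed-form expression.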

Keywords: agent-based models, algorithmic game theory, multi-sided markets, price optimization

Procedia PDF Downloads 414
19848 A Study of Mode Choice Model Improvement Considering Age Grouping

Authors: Young-Hyun Seo, Hyunwoo Park, Dong-Kyu Kim, Seung-Young Kho

Abstract:

The purpose of this study is to provide an improved mode choice model with parameters that reflect age groups, including prime-aged and elderly travelers. Data from the 2010 Household Travel Survey were used, and invalid samples were removed during the analysis. The chosen alternative, date of birth, mode, origin code, destination code, departure time, and arrival time were taken from the survey. From the preprocessed data, travel time, travel cost, mode, and the shares of people aged 45 to 55 years, 55 to 65 years, and over 65 years were calculated. The mode choice model was then constructed in LIMDEP by maximum likelihood estimation. A significance test was conducted for nine parameters (three age groups for each of three modes), and the test was repeated for a mode choice model containing only the significant parameters, the travel cost variable, and the travel time variable. The model estimation shows that as age increases, the preference for the car decreases and the preference for the bus increases. This study is meaningful in that individual and household characteristics are incorporated into the aggregate model.
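A minimal sketch of the estimation step (simulated data and scipy in place of LIMDEP, with assumed coefficient values): utilities depend on travel time and travel cost, and the multinomial logit coefficients are recovered by maximizing the log-likelihood.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n, J = 2000, 3                       # trips, modes (e.g. car, bus, rail)
time = rng.uniform(10, 60, (n, J))   # travel time, minutes
cost = rng.uniform(1, 10, (n, J))    # travel cost

beta_true = np.array([-0.05, -0.3])  # assumed time and cost coefficients

def utilities(beta):
    return beta[0] * time + beta[1] * cost

# simulate choices: systematic utility plus Gumbel errors implies MNL
U = utilities(beta_true) + rng.gumbel(size=(n, J))
choice = U.argmax(axis=1)

def neg_loglik(beta):
    V = utilities(beta)
    V = V - V.max(axis=1, keepdims=True)   # numerical stability
    logp = V - np.log(np.exp(V).sum(axis=1, keepdims=True))
    return -logp[np.arange(n), choice].sum()

res = minimize(neg_loglik, x0=np.zeros(2))
print(res.x.round(3))   # close to beta_true
```

The study's specification additionally interacts mode constants with the age-group shares; that amounts to widening the parameter vector and the utility function in the same pattern.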

Keywords: age grouping, aging, mode choice model, multinomial logit model

Procedia PDF Downloads 301
19847 Recovery of Petroleum Reservoir by Waterflooding Technique

Authors: Zabihullah Mahdi, Khwaja Naweed Seddiqi, Shigeo Honma

Abstract:

Numerous research and practical studies have found that the average oil recovery factor of a petroleum reservoir is about 30 to 35%. This study focuses on enhanced oil recovery through laboratory experiments and graphical investigation based on the Buckley-Leverett theory. Horizontal displacement of oil by water in a petroleum reservoir is analyzed under the Buckley-Leverett frontal displacement theory, whose assumptions and prerequisites are examined with a focus on the key factors that control displacement. The theory is applicable to the waterflooding method, which is commonly employed in petroleum reservoirs to sustain oil production, and techniques for evaluating the average water saturation behind the water front and the oil recovery factor are presented. In this paper, the Buckley-Leverett theory is applied to an experimental model, and the amount of recoverable oil is found to exceed 35%. The irreducible (connate) water saturation in the reservoir also has a significant influence on recovery.
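A hedged numerical sketch of the graphical construction described (assuming Corey-type relative permeability curves, not the experiment's measured ones): build the fractional-flow curve, locate the shock-front saturation via the Welge tangent drawn from the connate water saturation, then read off the average water saturation behind the front and the recovery factor.

```python
import numpy as np

Swc, Sor = 0.2, 0.2      # connate water and residual oil saturations (assumed)
mu_w, mu_o = 1.0, 5.0    # water and oil viscosities, cP (assumed)

Sw = np.linspace(Swc + 1e-4, 1 - Sor, 2000)
Se = (Sw - Swc) / (1 - Swc - Sor)       # normalized saturation
krw, kro = Se**3, (1 - Se)**3           # Corey exponents = 3 (assumed)

# fractional flow of water: fw = 1 / (1 + (kro/krw) * (mu_w/mu_o))
fw = 1.0 / (1.0 + (kro / krw) * (mu_w / mu_o))

# Welge tangent from (Swc, 0): the front saturation maximizes fw/(Sw - Swc)
slope = fw / (Sw - Swc)
i = slope.argmax()
Swf, slope_f = Sw[i], slope[i]

Sw_avg = Swc + 1.0 / slope_f            # tangent reaches fw = 1 at the average Sw
recovery = (Sw_avg - Swc) / (1 - Swc)   # recovered fraction of original oil in place

print(f"front Sw = {Swf:.3f}, avg Sw = {Sw_avg:.3f}, recovery = {recovery:.1%}")
```

With the assumed curves the breakthrough recovery lands in the range the abstract reports; the construction is the numerical counterpart of drawing the tangent on the fractional-flow plot.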

Keywords: Buckley-Leverett theory, waterflooding technique, petroleum engineering, immiscible displacement

Procedia PDF Downloads 218