Search results for: conditional heteroskedasticity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 240

120 Xeroderma Pigmentosum Group G: Gene Polymorphism and Risk of Breast Cancer

Authors: Malik SS, Masood N, Mubarik S, Khadim TM

Abstract:

Introduction: The xeroderma pigmentosum group G (XPG) gene plays a crucial role in the correction of UV-induced DNA damage through the nucleotide excision repair pathway. Single nucleotide polymorphisms in the XPG gene have been reported to be associated with different cancers. The current case-control study was designed to evaluate the relationship between one of the most frequently reported XPG polymorphisms (rs1047768 T>C) and breast cancer risk. Methodology: A total of 200 individuals were screened for this polymorphism, including 100 pathologically confirmed breast cancer cases and 100 age-matched controls. Genotyping was carried out using tetra-primer amplification-refractory mutation system (ARMS) PCR, and results were confirmed by gel electrophoresis. Results: Conditional logistic regression analysis showed a significant association between the TC genotype (OR: 8.9, CI: 2.0 – 38.7) and increased breast cancer risk. Although the homozygous CC genotype was more frequent in patients than in controls, the association was not statistically significant (OR: 3.9, CI: 0.4 – 35.7). Conclusion: The XPG (rs1047768 T>C) polymorphism may contribute to increased breast cancer risk, but other polymorphisms should also be evaluated to elucidate their role in breast cancer.

Keywords: XPG, breast cancer, NER, ARMS-PCR

Procedia PDF Downloads 161
119 The Theory behind Logistic Regression

Authors: Jan Henrik Wosnitza

Abstract:

Logistic regression has developed into a standard approach for estimating conditional probabilities in a wide range of applications, including credit risk prediction. The article at hand contributes to the current literature on logistic regression in four ways: First, it is demonstrated that binary logistic regression automatically meets its model assumptions under very general conditions. This result explains, at least in part, the logistic regression's popularity. Second, the requirement of homoscedasticity in the context of binary logistic regression is theoretically substantiated. The variances among the groups of defaulted and non-defaulted obligors have to be the same across the levels of the aggregated default indicators in order to achieve linear logits. Third, this article sheds some light on the question of why nonlinear logits might be superior to linear logits in the case of a small amount of data. Fourth, an innovative methodology for estimating correlations between obligor-specific log-odds is proposed. In order to crystallize the key ideas, this paper focuses on the example of credit risk prediction. However, the results presented in this paper can easily be transferred to any other field of application.
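
A minimal sketch of the conditional-probability mechanics the abstract refers to: a fitted binary logistic regression maps a linear score (the logit) into a conditional default probability through the sigmoid. The simulated obligor data and coefficient values below are assumptions for illustration, not the paper's data.

```python
# Minimal sketch: binary logistic regression as a conditional-probability model.
# The simulated "obligor" data below are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
score = rng.normal(size=n)                             # aggregated default indicator (assumed)
p_true = 1.0 / (1.0 + np.exp(-(-2.0 + 1.5 * score)))   # true conditional default probability
default = rng.binomial(1, p_true)                      # 1 = defaulted obligor

model = LogisticRegression().fit(score.reshape(-1, 1), default)
beta0, beta1 = model.intercept_[0], model.coef_[0, 0]

# The fitted logit is linear in the score; the conditional probability is its sigmoid.
logit = beta0 + beta1 * score
pd_hat = 1.0 / (1.0 + np.exp(-logit))
print(f"estimated logit: {beta0:.2f} + {beta1:.2f} * score")
print("mean estimated PD:", pd_hat.mean().round(3), "| empirical default rate:", default.mean().round(3))
```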

Keywords: correlation, credit risk estimation, default correlation, homoscedasticity, logistic regression, nonlinear logistic regression

Procedia PDF Downloads 394
118 Spatial Working Memory Is Enhanced by the Differential Outcome Procedure in a Group of Participants with Mild Cognitive Impairment

Authors: Ana B. Vivas, Antonia Ypsilanti, Aristea I. Ladas, Angeles F. Estevez

Abstract:

Mild Cognitive Impairment (MCI) is considered an intermediate stage between normal and pathological aging, as a substantial percentage of people diagnosed with MCI convert later to dementia of the Alzheimer’s type. Memory is one of the first cognitive processes to deteriorate in this condition. In the present study we employed the differential outcomes procedure (DOP) to improve visuospatial memory in a group of participants with MCI. The DOP requires the structure of a conditional discriminative learning task in which a correct choice response to a specific stimulus-stimulus association is reinforced with a particular reinforcer or outcome. A group of 10 participants with MCI and a matched control group had to learn and keep in working memory four target locations out of eight possible locations where a shape could be presented. Results showed that participants with MCI had significantly better terminal accuracy when a unique outcome was paired with a location (76% accuracy) than in a non-differential outcome condition (64%). This finding suggests that the DOP is useful in improving working memory in MCI patients, which may delay their conversion to dementia.

Keywords: mild cognitive impairment, working memory, differential outcomes, cognitive process

Procedia PDF Downloads 433
117 Volatility Switching between Two Regimes

Authors: Josip Visković, Josip Arnerić, Ante Rozga

Abstract:

Given that volatility is time-varying in high-frequency data and that periods of high volatility tend to cluster, the most successful and popular models for modelling time-varying volatility are GARCH-type models. When financial returns exhibit sudden jumps due to structural breaks, standard GARCH models show high volatility persistence, i.e. integrated behaviour of the conditional variance. In such situations, models in which the parameters are allowed to change over time are more appropriate. This paper compares different GARCH models in terms of their ability to describe structural changes in returns caused by the financial crisis in the stock markets of six selected Central and East European countries. The empirical analysis demonstrates that the Markov regime-switching GARCH model resolves the problem of excessive persistence and outperforms uni-regime GARCH models in forecasting volatility when sudden switching occurs in response to the financial crisis.
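
A short sketch of the persistence diagnostic behind this argument: in a standard GARCH(1,1), alpha + beta close to one signals the "integrated" behaviour that regime-switching models are designed to resolve. The returns below are simulated with an artificial volatility break; a full Markov-switching GARCH requires specialised routines not shown here, so this only illustrates the symptom, not the paper's estimator.

```python
# Sketch: measure volatility persistence (alpha + beta) in a single-regime GARCH(1,1)
# fitted to returns containing a structural break in volatility.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(1)
# two artificial volatility regimes (calm, then crisis) to mimic a structural break
returns = np.concatenate([rng.normal(0, 1.0, 1000), rng.normal(0, 3.0, 500)])

res = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")
alpha, beta = res.params["alpha[1]"], res.params["beta[1]"]
print(res.params)
print("persistence alpha + beta =", round(alpha + beta, 3))  # near 1 => excessive persistence
```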

Keywords: central and east European countries, financial crisis, Markov switching GARCH model, transition probabilities

Procedia PDF Downloads 200
116 Risk Spillover Between Stock Indices and Real Estate: Mixed Copula Modeling

Authors: Hina Munir Abbasi

Abstract:

The current paper examines the relationship and diversification ability of Islamic and conventional stock indices and Real Estate Investment Trusts (REITs). To represent the conditional dependency between stocks and REITs in a more realistic way, a new modeling technique, a time-varying copula with switching dependence, is used. It represents the dependence structure more accurately and realistically than a single copula regime, as dependence may alternate between positive and negative correlation regimes over time. The fluctuating behavior of markets has a significant impact on economic variables, especially the downward trend during crises. Overall, adding Real Estate Investment Trusts to a stock portfolio reduces risk and provides better diversification benefits. Results varied depending upon the circumstances of the country. REITs provide better diversification benefits for Islamic stocks when both markets are bearish and can provide a hedging benefit for conventional stock portfolios.

Keywords: conventional stocks, real estate investment trust, copula, diversification, risk spillover, safe haven

Procedia PDF Downloads 49
115 Organizational Climate being Knowledge Sharing Oriented: A Fuzzy-Set Analysis

Authors: Paulo Lopes Henriques, Carla Curado

Abstract:

According to the literature, knowledge sharing behaviors are influenced by organizational values and structures, namely organizational climate. The manuscript examines the antecedents of a knowledge sharing oriented organizational climate. Following theoretical expectations, the study adopts the following explanatory conditions: knowledge sharing costs, knowledge sharing incentives, perceptions of knowledge sharing contributing to performance, and tenure. The study compares results across two groups of firms: non-digital (firms without an intranet) vs. digital (firms with an intranet). The paper applies the fsQCA technique to analyze the data using the fsQCA 2.5 software (www.fsqca.com), testing several conditional arguments to explain the outcome variable. The main results strengthen claims on the relevance of the contribution of knowledge sharing to performance. Secondly, the evidence brings tenure, an explanatory condition associated with organizational memory, into the spotlight. The study provides an original contribution not previously addressed in the literature, since it identifies the sufficient condition sets for a knowledge sharing oriented organizational climate using fsQCA, which is, to our knowledge, a novel application of the technique.

Keywords: fsQCA, knowledge sharing oriented organizational climate, knowledge sharing costs, knowledge sharing incentives

Procedia PDF Downloads 291
114 Exchange Rate Forecasting by Econometric Models

Authors: Zahid Ahmad, Nosheen Imran, Nauman Ali, Farah Amir

Abstract:

The objective of the study is to forecast the US Dollar and Pak Rupee exchange rate using time series models. For this purpose, daily US Dollar/Pak Rupee exchange rates for the period January 1, 2007 – June 2, 2017 are employed. The data set is divided into in-sample and out-of-sample subsets, where the in-sample data are used to estimate the models and the out-of-sample data are used to evaluate the exchange rate forecasts. The ADF test and PP test are used to check the time series for stationarity. To forecast the exchange rate, ARIMA and GARCH models are applied. Among the different Autoregressive Integrated Moving Average (ARIMA) models, the best model is selected on the basis of selection criteria. Due to volatility clustering and the ARCH effect, a GARCH(1,1) model is also applied. The results of the analysis showed that ARIMA(0,1,1) and GARCH(1,1) are the most suitable models for forecasting the future exchange rate. Further, the GARCH(1,1) model captured the non-constant conditional variance of the exchange rate with good forecasting performance. This study is useful for researchers, policymakers, and businesses, helping them make decisions and devise policies through accurate and timely forecasting of the exchange rate.
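
A sketch of the two-step workflow described above: an ARIMA(0,1,1) model for the level of the series and a GARCH(1,1) model for the conditional variance of its residuals. The random-walk series below is a stand-in for the actual PKR/USD data, so the numbers are purely illustrative.

```python
# Sketch: ARIMA(0,1,1) for the exchange-rate level, GARCH(1,1) for volatility clustering
# in the ARIMA residuals. The series is simulated, not the PKR/USD data.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from arch import arch_model

rng = np.random.default_rng(2)
rate = pd.Series(100 + np.cumsum(rng.normal(0, 1.0, 2000)))   # stand-in exchange rate

arima_res = ARIMA(rate, order=(0, 1, 1)).fit()
print(arima_res.summary().tables[1])

resid = arima_res.resid.dropna()
garch_res = arch_model(resid, vol="Garch", p=1, q=1).fit(disp="off")
print(garch_res.params)

# one-step-ahead forecasts of the level and of the conditional variance
print("level forecast:", float(arima_res.forecast(steps=1).iloc[0]))
print("variance forecast:", float(garch_res.forecast(horizon=1).variance.iloc[-1, 0]))
```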

Keywords: exchange rate, ARIMA, GARCH, PAK/USD

Procedia PDF Downloads 516
113 A Comparative Analysis of Geometric and Exponential Laws in Modelling the Distribution of the Duration of Daily Precipitation

Authors: Mounia El Hafyani, Khalid El Himdi

Abstract:

Precipitation is one of the key variables in water resource planning. Modeling wet and dry spell durations is a crucial issue in engineering hydrology. The objective of this study is to model and analyze the distribution of wet and dry durations. For this purpose, daily rainfall data from 1967 to 2017 recorded at the station of the Moroccan city of Kenitra are used. Three models are implemented for the distribution of wet and dry durations, namely the first-order Markov chain, the second-order Markov chain, and the truncated negative binomial law. The fit of the data to the proposed models is evaluated using Chi-square and Kolmogorov-Smirnov tests. The Akaike information criterion is applied to select the most effective model distribution. We go further and study the law of the number of wet and dry days among k consecutive days. This law is calculated through an algorithm that we have implemented based on conditional laws. We complete our work by comparing the observed moments of the number of wet/dry days among k consecutive days to the calculated moments of the three estimated models. The study shows the effectiveness of our approach in modeling the wet and dry durations of daily precipitation.
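
A small sketch of the first-order Markov chain building block: estimate the wet/dry transition matrix from a daily indicator and read off the implied (geometric) dry-spell length distribution. The binary rainfall indicator is simulated; the Kenitra data are not reproduced here.

```python
# Sketch: first-order Markov chain for a daily wet/dry indicator and the
# geometric law it implies for dry-spell durations. Data are simulated.
import numpy as np

rng = np.random.default_rng(3)
wet = (rng.random(365 * 50) < 0.3).astype(int)   # 1 = wet day, 0 = dry day (assumed series)

# transition counts: rows = today's state, columns = tomorrow's state
counts = np.zeros((2, 2))
for today, tomorrow in zip(wet[:-1], wet[1:]):
    counts[today, tomorrow] += 1
P = counts / counts.sum(axis=1, keepdims=True)
print("transition matrix P[i, j] = Pr(tomorrow = j | today = i):\n", P.round(3))

# Under a first-order chain, a dry spell of length k has probability
# P[0, 0]**(k - 1) * P[0, 1] (a geometric law), so the mean dry-spell length is:
print("mean dry spell length:", round(1.0 / P[0, 1], 2), "days")
```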

Keywords: Markov chain, rainfall, truncated negative binomial law, wet and dry durations

Procedia PDF Downloads 89
112 Seismic Response Mitigation of Structures Using Base Isolation System Considering Uncertain Parameters

Authors: Rama Debbarma

Abstract:

The present study deals with the performance of a linear base isolation system in mitigating the seismic response of structures characterized by random system parameters. This involves optimization of the tuning ratio and damping properties of the base isolation system considering uncertain system parameters. The efficiency of the base isolator may be reduced if it is not tuned to the vibration mode it is designed to suppress, owing to the unavoidable presence of system parameter uncertainty. With the aid of matrix perturbation theory and a first-order Taylor series expansion, the total probability concept is used to evaluate the unconditional response of the primary structures considering random system parameters. For this, the conditional second-order information of the response quantities is obtained in a random vibration framework using a state space formulation. Subsequently, the maximum unconditional root mean square displacement of the primary structures is used as the objective function to obtain the optimum damping parameters. A numerical study is performed to elucidate the effect of parameter uncertainties on the optimization of the parameters of the linear base isolator and on system performance.

Keywords: linear base isolator, earthquake, optimization, uncertain parameters

Procedia PDF Downloads 397
111 Challenges for IoT Adoption in India: A Study Based on Foresight Analysis for 2025

Authors: Shruti Chopra, Vikas Rao Vadi

Abstract:

In the era of the digital world, the Internet of Things (IoT) has been receiving significant attention. Its ubiquitous connectivity between humans, machines to machines (M2M), and machines to humans gives it the potential to transform society and establish an ecosystem that adds new dimensions to the economy of the country. This study has therefore attempted to identify, through a literature survey, the challenges that seem prevalent in IoT adoption in India. Further, data have been collected from expert opinions to conduct the foresight analysis and have been analyzed with the help of the scenario planning process – Micmac, Mactor, Multipol, and Smic-Prob. As a methodology, the study has identified the relationships between variables through variable analysis using Micmac and actor analysis using Mactor; it has then attempted to generate the entire field of possibilities in terms of hypotheses and to construct various scenarios through Multipol. Lastly, the findings of the study include the final scenarios, which are selected using Smic-Prob by assigning probabilities (including conditional probabilities) to all the scenarios. This study may help practitioners and policymakers remove the obstacles to successfully implementing the IoT in India.

Keywords: Internet of Thing (IoT), foresight analysis, scenario planning, challenges, policymaking

Procedia PDF Downloads 126
110 Software Reliability Prediction Model Analysis

Authors: Lela Mirtskhulava, Mariam Khunjgurua, Nino Lomineishvili, Koba Bakuria

Abstract:

Software reliability prediction gives a great opportunity to measure the software failure rate at any point throughout system testing. A software reliability prediction model provides a technique for improving reliability. Software reliability is a very important factor in estimating overall system reliability, which depends on the individual component reliabilities. It differs from hardware reliability in that it reflects design perfection. The main reason for software reliability problems is the high complexity of software. Various approaches can be used to improve the reliability of software. We focus on a software reliability model in this article, assuming that there is time redundancy, the value of which (the number of repeated transmissions of basic blocks) can be an optimization parameter. We consider the given mathematical model under the assumption that the system may experience not only irreversible failures but also failures that can be treated as self-repairing, which significantly affect the reliability and accuracy of information transfer. The main task of this paper is to find the time distribution function (DF) of an instruction sequence transmission, which consists of a random number of basic blocks. We consider the system software unreliable; the time between adjacent failures has an exponential distribution.
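
A sketch of the basic building block assumed above, the exponentially distributed time between failures: F(t) is the distribution function, R(t) = 1 - F(t) is the probability that a transmission of duration t completes without a failure, and 1/lambda is the mean time to failure. The failure rate used here is an illustrative assumption.

```python
# Sketch: exponential time-between-failures, its distribution function and reliability.
import numpy as np
from scipy import stats

failure_rate = 0.02                 # failures per time unit (assumed)
t = np.array([1.0, 10.0, 50.0])

dist = stats.expon(scale=1.0 / failure_rate)
print("distribution function F(t):", dist.cdf(t).round(4))
print("reliability R(t) = 1 - F(t):", dist.sf(t).round(4))
print("mean time to failure:", dist.mean())
```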

Keywords: exponential distribution, conditional mean time to failure, distribution function, mathematical model, software reliability

Procedia PDF Downloads 436
109 Analysis of the Predictive Performance of Value at Risk Estimations in Times of Financial Crisis

Authors: Alexander Marx

Abstract:

Measuring and mitigating market risk is essential for the stability of enterprises, especially for major banking corporations and investment banks. Among risk measurement and mitigation tools, Value at Risk (VaR) is the risk metric most commonly used by practitioners. In past years, we have seen significant weaknesses in the predictive performance of VaR in times of financial market crisis. To address this issue, the purpose of this study is to investigate VaR estimation models and their predictive performance by applying a series of backtesting methods to the stock market indices of the G7 countries (Canada, France, Germany, Italy, Japan, UK, US, Europe). The study employs parametric, non-parametric, and semi-parametric VaR estimation models and is conducted over three different periods which cover the most recent financial market crises: the overall period (2006–2022), the global financial crisis period (2008–2009), and the COVID-19 period (2020–2022). Since the regulatory authorities have introduced and mandated the Conditional Value at Risk (Expected Shortfall) as an additional regulatory risk management metric, the study also analyzes and compares both risk metrics on their predictive performance.
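
For reference, a minimal sketch of the non-parametric (historical) estimators behind the two risk metrics compared here: one-day VaR and Expected Shortfall at the 99% level, plus the exception count that a backtest checks. The returns are simulated stand-ins for the index data.

```python
# Sketch: historical VaR and Expected Shortfall with a simple exception count.
import numpy as np

rng = np.random.default_rng(4)
returns = rng.standard_t(df=5, size=2500) * 0.01   # stand-in daily index returns

alpha = 0.99
var_99 = -np.quantile(returns, 1 - alpha)          # loss threshold exceeded on ~1% of days
es_99 = -returns[returns <= -var_99].mean()        # mean loss beyond the VaR

print(f"historical 99% VaR: {var_99:.4f}")
print(f"historical 99% ES : {es_99:.4f}")

# backtest-style exception count: days whose loss exceeded the VaR estimate
exceptions = int((returns < -var_99).sum())
print("exceptions:", exceptions, "expected ~", round((1 - alpha) * len(returns)))
```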

Keywords: value at risk, financial market risk, banking, quantitative risk management

Procedia PDF Downloads 56
108 Bayesian Using Markov Chain Monte Carlo and Lindley's Approximation Based on Type-I Censored Data

Authors: Al Omari Moahmmed Ahmed

Abstract:

This paper describes the Bayesian estimator using Markov Chain Monte Carlo and Lindley’s approximation, and the maximum likelihood estimation, of the Weibull distribution with Type-I censored data. The maximum likelihood method cannot estimate the shape parameter in closed form, although it can be solved by numerical methods. Moreover, the Bayesian estimates of the parameters and of the survival and hazard functions cannot be obtained analytically. Hence, the Markov Chain Monte Carlo method and Lindley’s approximation are used, where the full conditional distributions for the parameters of the Weibull distribution are obtained via Gibbs sampling and the Metropolis-Hastings (MH) algorithm, followed by estimation of the survival and hazard functions. The methods are compared to their maximum likelihood counterparts, and the comparisons are made with respect to the mean square error (MSE) and absolute bias to determine the better method for the scale and shape parameters and the survival and hazard functions.
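
A simplified sketch of the sampling idea: a random-walk Metropolis-Hastings sampler for the Weibull shape and scale under Type-I censoring, with a plug-in posterior estimate of the survival function. The simulated data, weak priors and tuning constants are assumptions and stand in for the paper's Gibbs/Metropolis scheme rather than reproducing it.

```python
# Sketch: random-walk Metropolis-Hastings for Weibull (shape, scale) with Type-I censoring.
import numpy as np

rng = np.random.default_rng(5)
true_shape, true_scale, censor_time = 1.5, 10.0, 12.0
t = true_scale * rng.weibull(true_shape, size=200)
obs = np.minimum(t, censor_time)                 # Type-I censored observations
event = (t <= censor_time).astype(int)           # 1 = failure observed, 0 = censored

def log_post(shape, scale):
    if shape <= 0 or scale <= 0:
        return -np.inf
    z = obs / scale
    # uncensored: log f(t); censored: log S(c) = -(c/scale)**shape
    loglik = np.sum(event * (np.log(shape / scale) + (shape - 1) * np.log(z)) - z**shape)
    return loglik - 0.001 * (shape + scale)      # weak exponential priors (assumed)

samples, cur = [], np.array([1.0, 5.0])
cur_lp = log_post(*cur)
for _ in range(20000):
    prop = cur + rng.normal(scale=[0.1, 0.5])    # random-walk proposal
    prop_lp = log_post(*prop)
    if np.log(rng.random()) < prop_lp - cur_lp:
        cur, cur_lp = prop, prop_lp
    samples.append(cur)

shape_s, scale_s = np.array(samples)[5000:].T    # drop burn-in
print("posterior mean shape:", shape_s.mean().round(2), "scale:", scale_s.mean().round(2))
t0 = 8.0                                         # posterior-mean survival at t0 (plug-in over draws)
print("S(8):", np.exp(-(t0 / scale_s) ** shape_s).mean().round(3))
```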

Keywords: weibull distribution, bayesian method, markov chain monte carlo, survival and hazard functions

Procedia PDF Downloads 449
107 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics

Authors: Janne Engblom, Elias Oikarinen

Abstract:

A panel dataset is one that follows a given sample of individuals over time, and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models which form a wide range of linear models. A special class of panel data models is dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is endogeneity bias of the estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond Generalized Method of Moments (GMM) estimator, which is an extension of the Arellano-Bond model where past values and different transformations of past values of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano–Bover/Blundell–Bond estimator augments Arellano–Bond by making an additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations—the original equation and the transformed one—and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically by using the Arellano–Bover/Blundell–Bond estimation technique together with ordinary OLS. The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models. The Arellano–Bover/Blundell–Bond estimator is suitable for this analysis for a number of reasons: It is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data for 14 Finnish cities over 1988–2012, differences in the short-run housing price dynamics estimates were considerable when different models and instrumentation were used. In particular, the use of different instrumental variables caused the model estimates and their statistical significance to vary. This was particularly clear when comparing OLS estimates with those of different dynamic panel data models. Estimates provided by dynamic panel data models were more in line with the theory of housing price dynamics.
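
The estimator used in the paper is the full Blundell–Bond system GMM; as a much simpler illustration of why the lagged dependent variable must be instrumented at all, here is an Anderson–Hsiao-style first-difference IV sketch on a simulated city panel. The data, the single instrument (the second lag of the level) and the package choice are assumptions, not the authors' setup.

```python
# Simplified Anderson-Hsiao sketch: first-difference the dynamic panel and
# instrument the lagged differenced dependent variable with the second lag of its level.
import numpy as np
import pandas as pd
from linearmodels.iv import IV2SLS

rng = np.random.default_rng(6)
cities, years, rho = 14, 25, 0.6
rows = []
for c in range(cities):
    fe, y = rng.normal(), 0.0
    for t in range(years):
        y = rho * y + fe + rng.normal(scale=0.5)   # AR(1) panel with fixed effects
        rows.append({"city": c, "year": t, "price": y})
df = pd.DataFrame(rows).sort_values(["city", "year"])

g = df.groupby("city")["price"]
df["d_price"] = g.diff()                                   # delta y_t
df["d_price_lag"] = df.groupby("city")["d_price"].shift(1) # delta y_{t-1} (endogenous)
df["price_lag2"] = g.shift(2)                              # y_{t-2}, used as the instrument
est = df.dropna()

res = IV2SLS(dependent=est["d_price"],
             exog=pd.DataFrame({"const": np.ones(len(est))}, index=est.index),
             endog=est["d_price_lag"],
             instruments=est["price_lag2"]).fit()
print(res.params)   # the coefficient on d_price_lag estimates the persistence rho
```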

Keywords: dynamic model, fixed effects, panel data, price dynamics

Procedia PDF Downloads 1435
106 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation

Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski

Abstract:

Edgeworth approximation is one of the most important statistical methods, with a considerable contribution to reducing the standard deviations of the independent variables' coefficients in a quantile regression model. This model estimates the conditional median or other quantiles. In this paper, we have applied approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products in real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is the reason we used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables taken into the study, the profit and the finalized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods that we have applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and the results that we present here identify the best approximating model of our study.
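
A sketch of the baseline model this entry builds on: a median (quantile) regression of profit on realized sales, with a simple bootstrap of the slope to illustrate how resampling is used to gauge the coefficient's standard error. The data are simulated stand-ins for the business data, and the Edgeworth refinement itself is not reproduced here.

```python
# Sketch: median regression of profit on sales plus a bootstrap of the slope.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 300
sales = rng.uniform(10, 100, n)
profit = 2.0 + 0.35 * sales + rng.standard_t(df=4, size=n) * (0.05 * sales)  # heteroscedastic noise
df = pd.DataFrame({"profit": profit, "sales": sales})

median_fit = smf.quantreg("profit ~ sales", df).fit(q=0.5)
print(median_fit.params)

boot_slopes = []
for _ in range(500):
    sample = df.sample(n=n, replace=True)
    boot_slopes.append(smf.quantreg("profit ~ sales", sample).fit(q=0.5).params["sales"])
print("bootstrap SE of the sales coefficient:", np.std(boot_slopes, ddof=1).round(4))
```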

Keywords: bootstrap, edgeworth approximation, IID, quantile

Procedia PDF Downloads 127
105 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading

Authors: Peter Shi

Abstract:

Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volumes in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the Efficient Market Hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of the conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for “low” and “high” are precisely derived using a max-min operation on Bayes’ error. This explicit connection harmonizes the Efficient Market Hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics. The amalgamation represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the “buy low, sell high” algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.

Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market

Procedia PDF Downloads 44
104 Competition, Stability, and Economic Growth: A Causality Approach

Authors: Mahvish Anwaar

Abstract:

Research Question: In this paper, we explore the causal relationship between banking competition, banking stability, and economic growth. Research Findings: Unbalanced panel data covering 2000 to 2018 are collected to analyze the causality among banking competition, banking stability, and economic growth. The main focus of the study is to check the direction of causality among the selected variables. The results of the study support the demand-following, supply-leading, feedback, and neutrality hypotheses, conditional on different measures of banking competition, banking stability, and economic growth. Theoretical Implication: Jayakumar, Pradhan, Dash, Maradana, and Gaurav (2018) proposed a theoretical model of the causal relationship between banking competition, banking stability, and economic growth using different indicators; we empirically test the proposed indicators in our study. This study contributes to the literature by examining the defined relationship in both developing and developed countries. Policy Implications: The study offers policy implications for investors on how to properly manage their finances, and government agencies can use the present study to identify the most suitable policies by examining how the economy can grow with respect to its finances.
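
A brief sketch of the pairwise causality test behind these hypotheses, using simulated series in which a banking-competition proxy leads growth; the variable names, lag length, and data are illustrative assumptions rather than the study's panel setup.

```python
# Sketch: Granger causality test between a competition proxy and economic growth.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(8)
T = 200
competition = rng.normal(size=T)
growth = np.zeros(T)
for t in range(1, T):
    growth[t] = 0.3 * growth[t - 1] + 0.5 * competition[t - 1] + rng.normal(scale=0.5)

data = pd.DataFrame({"growth": growth, "competition": competition})

# H0: "competition" does NOT Granger-cause "growth" (second column -> first column)
grangercausalitytests(data[["growth", "competition"]], maxlag=2)
# Reversing the column order tests the opposite direction (growth -> competition).
```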

Keywords: competition, stability, economic growth, vector auto-regression, granger causality

Procedia PDF Downloads 32
103 Nurturing Green Creativity in Women Intrapreneurs through Green HRM: Testing Moderated Mediation Model: A Step Towards Saudi Vision 2030

Authors: Tahira Iram, Ahmad Raza Bilal

Abstract:

In 2016, the Kingdom of Saudi Arabia (KSA) initiated Saudi Vision 2030, an ambitious plan to lessen the country's dependency on fossil fuels and increase economic diversification. The Vision 2030 framework strives to establish a thriving economy, a vibrant society, and an ambitious nation. This study aims to investigate the role of green service innovation (SI) and green work engagement (WE) in mediating the nexus between green HRM and green creativity (GC) under the conditional role of spiritual leadership (SL). A survey was conducted of 300 female intrapreneurs working in organizations within Saudi Arabia. The study collected data via a stratified random sampling technique. The framework was tested using PLS-SEM software. The findings reveal that WE fully mediates the nexus between green HRM and GC. Moreover, SL positively moderates the nexus between green HRM and SI. Thus, based on the findings, it is recommended that female intrapreneurs prioritize environmentally responsible operations to gain and sustain a competitive edge over rivals in the Saudi competitive market.

Keywords: green HRM, spiritual leadership, Vision 2030, women intrapreneurs, green service innovation behavior, green creativity

Procedia PDF Downloads 43
102 Financial Centers and BRICS Stock Markets: The Effect of the Recent Crises

Authors: Marco Barassi, Nicola Spagnolo

Abstract:

This paper uses a DCC-GARCH model framework to examine mean and volatility spillover (i.e. causality-in-mean and causality-in-variance) dynamics between financial centers and the stock market indexes of the BRICS countries. In addition, tests for changes in the transmission mechanism are carried out by first testing for structural breaks and then setting a dummy variable to control for the 2008 financial crisis. We use weekly data for nine countries: four financial centers (Germany, Japan, UK and USA) and the five BRICS countries (Brazil, Russia, India, China and South Africa). Furthermore, we control for monetary policy using domestic interest rates (the 90-day Treasury Bill interest rate) over the period 03/1/1990 - 04/2/2014, for a total of 1204 observations. Results show that the 2008 financial crisis changed the causality dynamics for most of the countries considered. The same pattern can also be observed in the conditional correlations, which shift upward following the turbulence associated with the 2008 crisis. The magnitude of these effects suggests a leading role played by the financial centers in affecting Brazil and South Africa, whereas Russia, India and China show a higher degree of resilience.
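
A deliberately simplified sketch of the first stage of a DCC-type analysis: fit univariate GARCH(1,1) models to two markets, standardise the residuals, and track a rolling correlation between them as a rough stand-in for the dynamic conditional correlation. The two return series are simulated, and this is not the full DCC estimator used in the paper.

```python
# Sketch: univariate GARCH(1,1) standardised residuals + rolling correlation
# as a crude proxy for a dynamic conditional correlation.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(9)
T = 1500
common = rng.normal(size=T)
center = pd.Series(common + rng.normal(size=T))          # e.g. a financial-centre index (assumed)
brics = pd.Series(0.6 * common + rng.normal(size=T))     # e.g. a BRICS index (assumed)

def std_resid(returns):
    res = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")
    return res.std_resid

z = pd.DataFrame({"center": std_resid(center), "brics": std_resid(brics)})
rolling_corr = z["center"].rolling(window=52).corr(z["brics"])  # ~1 year of weekly data
print(rolling_corr.describe())
```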

Keywords: financial crises, DCC-GARCH model, volatility spillovers, economics

Procedia PDF Downloads 331
101 Economic Evaluation of Bowland Shale Gas Wells Development in the UK

Authors: Elijah Acquah-Andoh

Abstract:

The UK has had its fair share of the shale gas revolutionary wave blowing across the global oil and gas industry at present. Although its exploitation is widely agreed to have been delayed, shale gas was looked upon favorably by the UK Parliament, which recognized it as a genuine energy source and granted licenses to industry to search for and extract the resource. Although this is significant progress by industry, there yet remains another test the UK fracking resource must pass in order to render shale gas extraction feasible – it must be economically extractible, and sustainably so. Developing unconventional resources is much more expensive and risky, and for shale gas wells, producing in commercial volumes is conditional upon drilling horizontal wells and hydraulic fracturing, techniques which increase Capex. Meanwhile, investment in shale gas development projects is sensitive to the gas price and to technical and geological risks. Using a two-factor model, the economics of the Bowland shale wells were analyzed and the operational conditions under which fracking is profitable in the UK were characterized. We find that there is a great degree of flexibility in Opex spending; hence Opex does not pose much threat to the fracking industry in the UK. However, we discover that Bowland shale gas wells fail to add value at a gas price of $8/MMBtu. A minimum gas price of $12/MMBtu, an Opex of no more than $2/Mcf, and a Capex of no more than $14.95M are required to create value within the present petroleum tax regime in the UK fracking industry.
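
A back-of-the-envelope sketch of the value test described above: the NPV of a single well given a gas price, Opex per Mcf, and upfront Capex. The production profile, decline rate and discount rate are illustrative assumptions and are not the paper's two-factor model or tax treatment.

```python
# Rough NPV sketch for one shale well under assumed price, Opex and Capex.
capex = 14.95e6           # $ (upfront)
opex_per_mcf = 2.0        # $/Mcf
price_per_mmbtu = 12.0    # $/MMBtu
mcf_to_mmbtu = 1.037      # approximate energy content of 1 Mcf of gas
discount_rate = 0.10      # assumed
first_year_mcf = 1.2e6    # assumed initial annual production
decline = 0.35            # assumed annual decline rate

npv = -capex
for year in range(1, 21):
    volume = first_year_mcf * (1 - decline) ** (year - 1)
    cash_flow = volume * (price_per_mmbtu * mcf_to_mmbtu - opex_per_mcf)
    npv += cash_flow / (1 + discount_rate) ** year
print(f"NPV at $12/MMBtu, $2/Mcf Opex, $14.95M Capex: ${npv:,.0f}")
```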

Keywords: capex, economical, investment, profitability, shale gas development, sustainable

Procedia PDF Downloads 551
100 Assessing the Influence of Chinese Stock Market on Indian Stock Market

Authors: Somnath Mukhuti, Prem Kumar Ghosh

Abstract:

Background and significance of the study: The Indian stock market has undergone sudden changes after the recent China crisis in terms of turnover, market capitalization, share prices, etc. The average returns on equity investment in both markets have grown more than three and a half times since the global financial crisis, owing to the development of industrial activity, corporate sector development, growth in global consumption, changes in global financial associations, and fewer imports from developed countries. But the economic policies of the two economies are far different: the Indian economy maintains a conservative policy, whereas the Chinese economy maintains an aggressive policy. Besides this, the Chinese economy has recently lowered its currency to boost growth, but India has not. On August 24, 2015, the Indian stock market and world stock markets fell because of the Chinese stock market. Keeping the above in view, this study seeks to examine the influence of the Chinese stock market on the Indian stock market. Methodology: This research work is based on daily time series data obtained from the Yahoo Finance database between April 1, 2009 and September 28, 2015. The study is based on two important stock markets, namely the Indian stock market (Bombay Stock Exchange) and the Chinese stock market (Shanghai Stock Exchange). In the course of the analysis, the daily raw data were converted into natural logarithms to minimize the problem of heteroskedasticity. While tackling the issue, correlation statistics, ADF and PP unit root tests, a bivariate cointegration test, and a causality test were used. Major findings: Correlation statistics show that both stock markets are positively associated. Both ADF and PP unit root test results demonstrate that the time series data were not normal and were not stationary at level but were stationary at first difference. The bivariate cointegration test results indicate that the Indian stock market was associated with the Chinese stock market in the long run. The Granger causality test illustrates that there was unidirectional causality between the Indian stock market and the Chinese stock market. Concluding statement: The empirical results suggest that India's stock market was not very dependent on the Chinese stock market because of India's conservative economic policies. Nevertheless, the Indian stock market might be stronger if Indian economic policies are changed slightly and if portfolio investment with the Chinese economy increases. The Indian economy might become the third largest economy by 2030 if India increases its portfolio investment and trade relations with both the Chinese and US economies.
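
A sketch of the test sequence described in the methodology: unit-root tests on the log index levels and their first differences, an Engle-Granger cointegration test, and a Granger causality test on the stationary returns. The two series are simulated random walks sharing a common trend, used as stand-ins for the BSE and SSE data.

```python
# Sketch: ADF unit-root tests, Engle-Granger cointegration, and Granger causality.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, coint, grangercausalitytests

rng = np.random.default_rng(10)
T = 1500
trend = np.cumsum(rng.normal(size=T))
log_bse = pd.Series(4.0 + trend + rng.normal(scale=0.3, size=T))
log_sse = pd.Series(3.5 + 0.8 * trend + rng.normal(scale=0.3, size=T))

for name, s in [("BSE", log_bse), ("SSE", log_sse)]:
    print(name, "ADF p-value (level):", round(adfuller(s)[1], 3),
          "| (1st difference):", round(adfuller(s.diff().dropna())[1], 3))

print("Engle-Granger cointegration p-value:", round(coint(log_bse, log_sse)[1], 3))

returns = pd.DataFrame({"bse": log_bse.diff(), "sse": log_sse.diff()}).dropna()
grangercausalitytests(returns[["bse", "sse"]], maxlag=2)   # H0: SSE does not Granger-cause BSE
```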

Keywords: Indian stock market, China stock market, bivariate cointegration, causality test

Procedia PDF Downloads 346
99 Automatic Tagging and Accuracy in Assamese Text Data

Authors: Chayanika Hazarika Bordoloi

Abstract:

This paper is an attempt to work on a highly inflectional language, Assamese. It is also one of the national languages of India, and very little has been achieved in terms of computational research. Building a language processing tool for a natural language is not very smooth, as the standard and language representation change at various levels. This paper presents the inflectional suffixes of Assamese verbs and shows how statistical tools, along with linguistic features, can improve tagging accuracy. Conditional random fields (the CRF tool) were used to automatically tag and train the text data; however, accuracy improved after linguistic features were fed into the training data. Assamese is a highly inflectional language; hence, it is challenging to standardize its morphology. Inflectional suffixes are used as a feature of the text data. In order to analyze the inflections of Assamese word forms, a list of suffixes is prepared, comprising all possible suffixes that the various categories can take. Assamese words can be classified into inflected classes (noun, pronoun, adjective and verb) and un-inflected classes (adverb and particle). The corpus used for this morphological analysis contains a large number of tokens. The corpus is a mixed corpus, and it has given satisfactory accuracy. The accuracy rate of the tagger has gradually improved with the modified training data.
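
A minimal sketch of the CRF workflow described above: each token is represented by a feature dictionary (here the word, a lowercase form, and its last few characters as a crude stand-in for an inflectional-suffix feature), and a linear-chain CRF is trained on tagged sentences. The toy English sentences and tagset are placeholders, not the Assamese corpus or its tagset.

```python
# Sketch: linear-chain CRF tagger with suffix-style features (toy data).
import sklearn_crfsuite

train_sents = [
    [("I", "PRON"), ("read", "VERB"), ("books", "NOUN")],
    [("She", "PRON"), ("reads", "VERB"), ("quickly", "ADV")],
]

def token_features(sent, i):
    word = sent[i][0]
    return {
        "word": word,
        "lower": word.lower(),
        "suffix3": word[-3:],          # stand-in for an inflectional-suffix feature
        "is_first": i == 0,
        "prev_word": sent[i - 1][0] if i > 0 else "<BOS>",
    }

X_train = [[token_features(s, i) for i in range(len(s))] for s in train_sents]
y_train = [[tag for _, tag in s] for s in train_sents]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=100)
crf.fit(X_train, y_train)
print(crf.predict(X_train))
```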

Keywords: CRF, morphology, tagging, tagset

Procedia PDF Downloads 169
98 The Advancements of Transformer Models in Part-of-Speech Tagging System for Low-Resource Tigrinya Language

Authors: Shamm Kidane, Ibrahim Abdella, Fitsum Gaim, Simon Mulugeta, Sirak Asmerom, Natnael Ambasager, Yoel Ghebrihiwot

Abstract:

The call for natural language processing (NLP) systems for low-resource languages has become more apparent than ever in the past few years, with arduous challenges still present in preparing such systems. This paper presents an improved version of the Nagaoka Tigrinya Corpus for a parts-of-speech (POS) classification system in the Tigrinya language. The size of the initial Nagaoka dataset was increased, bringing the new tagged corpus to 118K tokens, comprising the 12 basic POS annotations used previously. The additional content was also annotated manually in a stringent manner, following rules similar to those of the former dataset, and was formatted in CoNLL format. The system made use of novel approaches to NLP tasks and of the monolingually pre-trained TiELECTRA, TiBERT and TiRoBERTa transformer models. The highest achieved score is an impressive weighted F1-score of 94.2%, which surpassed the previous systems by a significant margin. The system will prove useful in the progress of NLP-related tasks for Tigrinya and similar low-resource languages, with room for cross-referencing higher-resource languages.

Keywords: Tigrinya POS corpus, TiBERT, TiRoBERTa, conditional random fields

Procedia PDF Downloads 57
97 Binary Decision Diagram Based Methods to Evaluate the Reliability of Systems Considering Failure Dependencies

Authors: Siqi Qiu, Yijian Zheng, Xin Guo Ming

Abstract:

In many reliability and risk analyses, failures of components are assumed to be independent. However, in reality, ignoring failure dependencies among components may render the results of reliability and risk analysis incorrect. There are two principal ways to incorporate failure dependencies in system reliability and risk analysis: implicit and explicit methods. In the implicit method, failure dependencies can be modeled by joint probabilities, correlation values, or conditional probabilities. In the explicit method, certain types of dependencies can be modeled in a fault tree as mutually independent basic events for specific component failures. In this paper, explicit and implicit methods based on binary decision diagrams (BDD) are proposed to evaluate the reliability of systems considering failure dependencies. The obtained results prove the equivalence of the proposed implicit and explicit methods. It is found that the consideration of failure dependencies decreases the reliability of systems. This observation is intuitive, because more components fail due to failure dependencies. The consideration of failure dependencies helps designers to reduce the dependencies between components during the design phase to make the system more reliable.
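
A tiny worked example of why the dependence matters for the conclusion above: for a two-component parallel system, the failure probability is P(A and B) = P(A) x P(B | A); assuming independence replaces P(B | A) with P(B) and understates the risk whenever failures are positively dependent. The probabilities are illustrative assumptions, not values from the paper.

```python
# Worked example: parallel-system failure probability with and without dependence.
p_a = 0.05            # P(component A fails)
p_b = 0.05            # P(component B fails)
p_b_given_a = 0.40    # P(B fails | A fails), e.g. shared load or common cause (assumed)

system_fail_independent = p_a * p_b            # 0.0025
system_fail_dependent = p_a * p_b_given_a      # 0.0200

print("failure probability, independence assumed:", system_fail_independent)
print("failure probability, with dependence     :", system_fail_dependent)
print("reliability drops from", 1 - system_fail_independent, "to", 1 - system_fail_dependent)
```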

Keywords: reliability assessment, risk assessment, failure dependencies, binary decision diagram

Procedia PDF Downloads 446
96 How Rational Decision-Making Mechanisms of Individuals Are Corrupted under the Presence of Others and the Reflection of This on Financial Crisis Management Situations

Authors: Gultekin Gurcay

Abstract:

It is known that the most crucial influence of the psychological, social, and emotional factors that affect any human behavior is to corrupt the rational decision-making mechanisms of individuals and cause them to display irrational behaviors. In this regard, the social context of human beings influences the rationality of our decisions, and people tend to display different behaviors when they are alone compared to when they are surrounded by others. At this point, the interaction and interdependence of behavioral finance and economics with social psychology, where the intentions and behaviors of individuals are analyzed in the actual or implied presence of others, come into prominence. Within the context of this study, the prevalent theories of behavioral finance, namely Prospect Theory, the Utility Theory Given Uncertainty and the Five Axioms of Choice under Uncertainty, Veblen’s Hidden Utility Theory, and the concept of ‘Overreaction’, have been examined and demonstrated, and the meaning, existence, and validity of these theories together with the social context have been assessed. Finally, this study analyzes the behavior of individuals in financial crisis situations, where the majority of society is affected by the same negative conditions at the same time, by taking into account how individual behavior changes in the presence of others.

Keywords: conditional variance coefficient, financial crisis, garch model, stock market

Procedia PDF Downloads 218
95 Vaccination Coverage and Its Associated Factors in India: An ML Approach to Understand the Hierarchy and Inter-Connections

Authors: Anandita Mitro, Archana Srivastava, Bidisha Banerjee

Abstract:

The present paper attempts to analyze the hierarchy and interconnection of factors responsible for the uptake of BCG vaccination in India. The study uses National Family Health Survey (NFHS-5) data, which were collected during 2019-21. The univariate logistic regression method is used to understand the univariate effects, while the interconnection effects have been studied using the Conditional Inference Tree (CIT), a non-parametric machine learning (ML) model. The hierarchy of the factors is further established using the Conditional Inference Forest, which is an extension of the CIT approach. The results suggest that BCG vaccination coverage was influenced more by system-level factors and awareness than by education or socio-economic status. Factors such as place of delivery, antenatal care, and postnatal care were crucial, with variations based on delivery location. Region-specific differences were also observed, which could be explained by these factors. Awareness of the disease was less impactful, along with wealth and urban or rural residence, although awareness did appear to substitute for inadequate ANC. Thus, from the policy point of view, it is revealed that certain subpopulations have a lower prevalence of vaccination, which implies a need for population-specific policy action to achieve one hundred percent coverage.

Keywords: vaccination, NFHS, machine learning, public health

Procedia PDF Downloads 25
94 EFL Teachers’ Metacognitive Awareness as a Predictor of Their Professional Success

Authors: Saeedeh Shafiee Nahrkhalaji

Abstract:

Metacognitive knowledge increases EFL students’ ability to be successful learners. Although this relationship has been investigated by a number of scholars, EFL teachers’ explicit awareness of their cognitive knowledge has not been sufficiently explored. The aim of this study was to examine the role of EFL teachers’ metacognitive knowledge in their pedagogical performance. Furthermore, the role played by years of academic education and teaching experience was also studied. Fifty female EFL teachers were selected. They completed the Metacognitive Awareness Inventory (MAI), which assessed six components of metacognition, including procedural knowledge, declarative knowledge, conditional knowledge, planning, evaluating, and management strategies. Near the end of the academic semester, the students of each class filled in the Language Teacher Characteristics Questionnaire to evaluate their teachers’ pedagogical performance. Four elements of the MAI (declarative knowledge, planning, evaluating, and management strategies) were found to be significantly correlated with EFL teachers’ pedagogical success. A significant correlation was also established between metacognitive knowledge and EFL teachers’ years of academic education and teaching experience. The findings obtained from this research have important implications for EFL teacher educators. The discussion concludes by setting out directions for future research.

Keywords: metacognitive knowledge, pedagogical performance, language teacher characteristics questionnaire, metacognitive awareness inventory

Procedia PDF Downloads 305
93 Determinants of International Volatility Passthroughs of Agricultural Commodities: A Panel Analysis of Developing Countries

Authors: Tetsuji Tanaka, Jin Guo

Abstract:

The extant literature has not succeeded in uncovering the common determinants of price volatility transmission of agricultural commodities from international to local markets, and further, has rarely investigated the role of self-sufficiency measures in the context of national food security. We analyzed various factors to determine the degree of price volatility transmission of wheat, rice, and maize between world and domestic markets using GARCH models with dynamic conditional correlation (DCC) specifications and panel feasible generalized least squares models. We found that a grain autarky system has the potential to diminish volatility pass-throughs for the three grain commodities. Furthermore, it was discovered that substitutive consumption behavior between maize and wheat buffers the volatility transmissions of both, but rice does not function as a transmission-relieving element for the volatility of either wheat or maize. The effectiveness of grain consumption substitution in insulating against pass-throughs from global markets is greater than that of cereal self-sufficiency. These implications are highly relevant for developing-country governments seeking to protect their domestic food markets from uncertainty in foreign countries and thereby improve food security.

Keywords: food security, GARCH, grain self-sufficiency, volatility transmission

Procedia PDF Downloads 121
92 Internet of Things Edge Device Power Modelling and Optimization Simulator

Authors: Cian O'Shea, Ross O'Halloran, Peter Haigh

Abstract:

Wireless Sensor Networks (WSN) are Internet of Things (IoT) edge devices. They are becoming widely adopted in many industries, including health care, building energy management, and condition monitoring. As the scale of WSN deployments increases, the cost and complexity of battery replacement and disposal become more significant and in time may become a barrier to adoption. Harvesting ambient energy provides a pathway to reducing dependence on batteries and in the future may lead to autonomously powered sensors. This work describes a simulation tool that enables the user to predict the battery life of a wireless sensor that utilizes energy harvesting to supplement the battery power. To create this simulator, all aspects of a typical WSN edge device were modelled, including the sensors, transceiver, and microcontroller, as well as the energy source components (batteries, solar cells, thermoelectric generators (TEG), supercapacitors, and DC/DC converters). The tool allows the user to plug and play different pre-characterized devices as well as add user-defined devices. The goal of this simulation tool is to predict the lifetime of a device and the scope for extending it using ambient energy sources.
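
A rough energy-budget sketch of the kind of estimate such a simulator produces: the battery life of a duty-cycled sensor node, with and without a small harvested-power contribution. All device currents, the duty cycle, and the harvesting figure are assumed values for illustration, not outputs of the tool.

```python
# Sketch: average-current energy budget for a duty-cycled WSN node.
battery_mah = 2500.0          # e.g. 2x AA cells, assumed usable capacity
sleep_current_ma = 0.005      # microcontroller + radio asleep (assumed)
active_current_ma = 20.0      # sense + transmit burst (assumed)
active_seconds_per_hour = 2.0 # duty cycle: 2 s of activity per hour (assumed)
harvested_ma_avg = 0.02       # average equivalent current from a small solar cell (assumed)

duty = active_seconds_per_hour / 3600.0
avg_draw_ma = active_current_ma * duty + sleep_current_ma * (1 - duty)

for label, harvest in [("battery only", 0.0), ("with harvesting", harvested_ma_avg)]:
    net_ma = avg_draw_ma - harvest
    if net_ma <= 0:
        print(f"{label}: harvested power covers the average draw (potentially autonomous)")
    else:
        years = battery_mah / net_ma / 24.0 / 365.0
        print(f"{label}: net draw {net_ma:.4f} mA -> ~{years:.1f} years")
```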

Keywords: Wireless Sensor Network, IoT, edge device, simulation, solar cells, TEG, supercapacitor, energy harvesting

Procedia PDF Downloads 106
91 Solving One of the Variants of Necktie Paradox for Business Proposals

Authors: Natarajan Vijayarangan, Viswanath Kumar Ganesan, G. Kumudhavalli

Abstract:

This abstract sets out an uncertainty problem pertaining to the evaluation of business proposals or concept notes in an organisation. Consider the business proposal evaluation process (BPEP) for the execution of corporate research-cum-business projects in the organisation. Assume that two concept notes X and Y of the BPEP are approved: X is a full-fledged type (100% financial approval given by the organisation) and Y is a conditional type (partial financial approval given by the organisation). A penalty criterion is then introduced during the process. At the end of the annual appraisal, if both of them deliver on the goals and objectives committed at the time of concept note submission, then both will get an incentive of $N from the organisation. If one of them does not fulfill the goals and objectives at the year-end appraisal, then a d% reduction or cut will be levied on that project's budget for the next year. If X fulfills the goals and objectives and Y does not, then X gets a gain of d% on Y's previous-year budget and Y suffers a loss of d% from its previous-year budget for the next year, and vice versa. Further, an incentive of $N will be given to those who gain. This process is a variant of the necktie paradox and carries an inherent uncertainty about X or Y getting more than $N even if X or Y performs well. Solving the above problem and generalizing it to finitely many concept notes will be a challenging task.
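
A small enumeration of the year-end outcomes under one reading of the rules above, to make the uncertainty explicit. The budgets, d, N, and the success probabilities are placeholder assumptions, not part of the original problem statement.

```python
# Sketch: enumerate the four appraisal outcomes and compute X's expected payoff.
budget_x, budget_y = 100_000.0, 60_000.0   # previous-year budgets (assumed)
d, N = 0.10, 5_000.0                       # 10% budget adjustment, $N incentive (assumed)
p_x, p_y = 0.7, 0.7                        # assumed probabilities of meeting goals

outcomes = {
    (True, True):   (N, N),                                   # both meet goals
    (True, False):  (d * budget_y + N, -d * budget_y),        # X gains d% of Y's budget + incentive
    (False, True):  (-d * budget_x, d * budget_x + N),        # Y gains d% of X's budget + incentive
    (False, False): (-d * budget_x, -d * budget_y),           # both penalised
}

expected_x = sum(
    (p_x if sx else 1 - p_x) * (p_y if sy else 1 - p_y) * payoff_x
    for (sx, sy), (payoff_x, _) in outcomes.items()
)
print(f"expected payoff for X: ${expected_x:,.0f}")
```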

Keywords: concept notes, necktie paradox, annual appraisal, project budget and gain or loss

Procedia PDF Downloads 438