Search results for: 4/2 stochastic volatility model
16927 Evaluating Oman's Green Transition: A Dynamic Stochastic General Equilibrium Analysis of Climate Policy Effects
Authors: Mohamed Chakroun
Abstract:
In this paper, we analyze the macroeconomic impacts of Oman’s strategy to transition to a green economy by 2050. Our objective is to determine the most effective climate policy instrument to facilitate this transition. Using a Dynamic Stochastic General Equilibrium (DSGE) model, we assess the effectiveness of three climate policy tools: a carbon tax, subsidies to green assets, and taxes on brown assets. Our results indicate that a combination of a carbon tax with differentiated taxes and subsidies on green and brown assets proves to be the most effective policy in reducing emissions while maintaining macroeconomic stability. The findings demonstrate the need for policymakers to balance the immediate goal of reducing emissions with the economic costs involved. A gradual transition strategy may be preferable, as it mitigates the negative economic impacts while facilitating the shift towards a green economy.
Keywords: green economy, carbon tax, DSGE model, climate policy, sustainable growth
Procedia PDF Downloads 261
16926 Model of Optimal Centroids Approach for Multivariate Data Classification
Authors: Pham Van Nha, Le Cam Binh
Abstract:
Particle swarm optimization (PSO) is a population-based stochastic optimization algorithm. PSO was inspired by the natural behavior of birds and fish during migration and foraging for food. PSO is considered a multidisciplinary optimization model that can be applied to various optimization problems. PSO’s ideas are simple and easy to understand, but PSO has so far been applied only to simple model problems. We believe that, in order to expand the applicability of PSO to complex problems, PSO should be described more explicitly in the form of a mathematical model. In this paper, we represent PSO as a mathematical model and apply it to multivariate data classification. First, PSO’s general mathematical model (MPSO) is analyzed as a universal optimization model. Then, the Model of Optimal Centroids (MOC) is proposed for multivariate data classification. Experiments were conducted on several benchmark data sets to demonstrate the effectiveness of MOC compared with several previously proposed schemes.
Keywords: analysis of optimization, artificial intelligence based optimization, optimization for learning and data analysis, global optimization
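As a quick illustration of the baseline algorithm this abstract builds on, here is a minimal PSO sketch in Python. The sphere objective, swarm size, and coefficient values (w, c1, c2) are illustrative assumptions, not the authors' MPSO settings.

```python
# Minimal particle swarm optimization (PSO) sketch. The sphere objective,
# swarm size, and coefficients are illustrative assumptions.
import random

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    # Initialize particle positions and velocities uniformly at random.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # personal best positions
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]     # global best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

if __name__ == "__main__":
    random.seed(0)
    sphere = lambda x: sum(v * v for v in x)
    best, best_val = pso(sphere, dim=3)
    print(best_val)  # close to 0 for the sphere function
```

In a clustering application such as MOC, the particle position would encode candidate centroid coordinates and the objective would be a within-cluster distance measure rather than the sphere function used here.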
Procedia PDF Downloads 208
16925 Heuristic Methods for the Capacitated Location-Allocation Problem with Stochastic Demand
Authors: Salinee Thumronglaohapun
Abstract:
The proper number and appropriate locations of service centers can save cost, raise revenue, and increase customer satisfaction. Service centers are costly to establish and difficult to relocate. Over long-term planning periods, several factors may affect the service; one of the most critical is uncertain customer demand. The opened service centers need to be capable of serving customers and making a profit even though demand changes in each period. In this work, the capacitated location-allocation problem with stochastic demand is considered. A mathematical model is formulated to determine suitable locations of service centers and their allocation so as to maximize total profit over multiple planning periods. Two heuristic methods, a local search and a genetic algorithm, are used to solve this problem. For the local search, five different probabilities of choosing each type of move are applied. For the genetic algorithm, three different replacement strategies are considered. The results of applying each method to numerical examples are compared. Both methods reach the same best-found solution in most examples, but the genetic algorithm provides better solutions in some cases.
Keywords: location-allocation problem, stochastic demand, local search, genetic algorithm
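To make the local-search heuristic concrete, here is a minimal sketch in Python: open a subset of candidate centers, greedily assign demand to the nearest open center with remaining capacity, and toggle one center per move, accepting only improving moves. The instance data, single toggle move, and profit function are illustrative assumptions; the paper uses five move types and a multi-period model.

```python
# Minimal local-search sketch for a capacitated location-allocation
# problem. Data and the profit function are illustrative assumptions.
import random

def assign_profit(open_set, demand, capacity, revenue, dist, open_cost):
    cap = {j: capacity[j] for j in open_set}
    profit = -sum(open_cost[j] for j in open_set)
    for i, d in enumerate(demand):
        # Serve each customer at the nearest open center with capacity left.
        for j in sorted(open_set, key=lambda c: dist[i][c]):
            if cap[j] >= d:
                cap[j] -= d
                profit += revenue * d - dist[i][j] * d
                break
    return profit

def local_search(n_sites, demand, capacity, revenue, dist, open_cost, iters=200):
    current = set(random.sample(range(n_sites), max(1, n_sites // 2)))
    best_val = assign_profit(current, demand, capacity, revenue, dist, open_cost)
    for _ in range(iters):
        j = random.randrange(n_sites)               # toggle one center
        cand = current ^ {j}
        if not cand:
            continue
        val = assign_profit(cand, demand, capacity, revenue, dist, open_cost)
        if val > best_val:                          # accept improving moves only
            current, best_val = cand, val
    return current, best_val
```

A genetic algorithm for the same problem would keep a population of such open-center subsets and recombine them, with the replacement strategy deciding which offspring survive.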
Procedia PDF Downloads 124
16924 Development of Earthquake and Typhoon Loss Models for Japan, Specifically Designed for Underwriting and Enterprise Risk Management Cycles
Authors: Nozar Kishi, Babak Kamrani, Filmon Habte
Abstract:
Natural hazards, such as earthquakes and tropical storms, are very frequent and highly destructive in Japan. Japan experiences, every year on average, more than 10 tropical cyclones that come within damaging reach, and earthquakes of moment magnitude 6 or greater. We have developed stochastic catastrophe models to address the risk associated with the entire suite of damaging events in Japan, for use by insurance, reinsurance, NGOs, and governmental institutions. KCC’s (Karen Clark and Company) catastrophe models are procedures constituted of four modular segments: 1) stochastic event sets that represent the statistics of past events; 2) hazard attenuation functions that model the local intensity; 3) vulnerability functions that address the repair need for local buildings exposed to the hazard; and 4) a financial module addressing policy conditions that estimates the losses incurred as a result of an event. The events module is comprised of events (faults or tracks) of different intensities with corresponding probabilities, based on the same statistics as observed in the historical catalog. The hazard module delivers the hazard intensity (ground motion or wind speed) at the location of each building. The vulnerability module provides a library of damage functions that relate the hazard intensity to the repair need as a percentage of the replacement value. The financial module reports the expected loss, given the payoff policies and regulations. We have divided Japan into regions with similar typhoon climatology, and into earthquake micro-zones, within each of which the characteristics of events are similar enough for stochastic modeling. For each region, a set of stochastic events is then developed that results in events with intensities corresponding to annual occurrence probabilities of interest to financial communities, such as 0.01, 0.004, etc.
The intensities corresponding to these probabilities (called CEs, Characteristic Events) are selected through a super-stratified sampling approach based on the primary uncertainty. Region-specific hazard intensity attenuation functions followed by vulnerability models lead to the estimation of repair costs. An extensive economic exposure model addresses all local construction and occupancy types, such as traditional post-and-beam wood construction (Shinkabe and Okabe styles), as well as concrete confined in steel, SRC (Steel-Reinforced Concrete), and high-rise construction.
Keywords: typhoon, earthquake, Japan, catastrophe modelling, stochastic modeling, stratified sampling, loss model, ERM
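The idea of Characteristic Events can be sketched in a few lines: given simulated annual-maximum intensities, pick the intensity whose empirical annual exceedance probability matches a target such as 0.01 (a 100-year event). The synthetic wind-speed data below are an illustrative assumption, not KCC's event set.

```python
# Minimal sketch of selecting a Characteristic Event (CE) intensity from
# a stochastic event set. Synthetic data are illustrative assumptions.
import random

def ce_intensity(intensities, target_exceedance):
    # Empirical exceedance: fraction of simulated annual maxima >= x.
    xs = sorted(intensities, reverse=True)
    k = max(1, round(target_exceedance * len(xs)))
    return xs[k - 1]          # intensity exceeded in ~target fraction of years

if __name__ == "__main__":
    random.seed(0)
    # Synthetic annual-maximum wind speeds (m/s), roughly lognormal.
    annual_max = [30 * random.lognormvariate(0, 0.3) for _ in range(10000)]
    print(ce_intensity(annual_max, 0.01))   # ~100-year wind speed
    print(ce_intensity(annual_max, 0.004))  # ~250-year wind speed
```

The super-stratified sampling described in the abstract would additionally stratify the event set so that rare, high-intensity events are represented with controlled accuracy rather than relying on raw empirical quantiles as done here.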
Procedia PDF Downloads 269
16923 Application of Stochastic Models on the Portuguese Population and Distortion to Workers Compensation Pensioners Experience
Authors: Nkwenti Mbelli Njah
Abstract:
This research was motivated by a project requested by AXA on the topic of pensions payable under the workers' compensation (WC) line of business. There are two types of pensions: the compulsorily recoverable and the not compulsorily recoverable. A pension is compulsorily recoverable for a victim when the disability is less than 30% and the pension amount per year is less than six times the minimum national salary. The law defines that the mathematical provisions for compulsorily recoverable pensions must be calculated by applying the following bases: mortality table TD88/90 and a rate of interest of 5.25% (possibly with a management charge). Managing pensions which are not compulsorily recoverable is a more complex task because the technical bases are not defined by law and much more complex computations are required. In particular, companies have to predict the discounted amount of payments reflecting the mortality effect for all pensioners (a task monitored monthly at AXA). The purpose of this research was thus to develop a stochastic model for the future mortality of the workers' compensation pensioners of both the Portuguese market and the AXA portfolio. Not only is past mortality modeled, but projections about future mortality are also made for the general population of Portugal as well as for the two portfolios mentioned earlier. The global model was split into two parts: a stochastic model for population mortality which allows for forecasts, combined with a point estimate from a portfolio mortality model obtained through three different relational models (Cox Proportional, Brass Linear and Workgroup PLT). The one-year death probabilities for ages 0-110 for the period 2013-2113 are obtained for the general population and the portfolios. These probabilities are used to compute different life table functions as well as the not compulsorily recoverable reserves for each of the models, for the pensioners, their spouses and children under 21.
The results obtained are compared with the not compulsorily recoverable reserves computed using the static mortality table (TD 73/77) currently used by AXA, to assess the impact on this reserve if AXA adopted the dynamic tables.
Keywords: compulsorily recoverable, life table functions, relational models, workers' compensation pensioners
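Of the three relational models named above, the Brass linear (logit) model is the simplest to sketch: portfolio survivorship is tied to a standard (population) table through a linear relation between logits of survival probabilities. The alpha/beta values and the standard table below are illustrative assumptions, not the fitted values from this study.

```python
# Minimal Brass linear (logit) relational model sketch. The standard
# table and alpha/beta parameters are illustrative assumptions.
import math

def logit(l):                       # Brass logit of survivorship l(x)
    return 0.5 * math.log((1.0 - l) / l)

def inv_logit(y):
    return 1.0 / (1.0 + math.exp(2.0 * y))

def brass_survivors(l_standard, alpha, beta):
    # Y_portfolio(x) = alpha + beta * Y_standard(x)
    return [inv_logit(alpha + beta * logit(l)) for l in l_standard]

if __name__ == "__main__":
    # Illustrative standard survivorship curve l(x) at a few ages.
    l_std = [0.99, 0.97, 0.90, 0.70, 0.40, 0.10]
    fitted = brass_survivors(l_std, alpha=-0.1, beta=1.05)
    print([round(l, 3) for l in fitted])
```

In practice alpha and beta would be estimated by regressing portfolio logits on standard-table logits, and the fitted curve would feed the life table functions and reserve calculations described above.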
Procedia PDF Downloads 164
16922 Regularization of Gene Regulatory Networks Perturbed by White Noise
Authors: Ramazan I. Kadiev, Arcady Ponosov
Abstract:
Mathematical models of gene regulatory networks can in many cases be described by ordinary differential equations with switching nonlinearities, where the initial value problem is ill-posed. Several regularization methods are known in the case of deterministic networks, but the presence of stochastic noise leads to several technical difficulties. In the presentation, it is proposed to apply the methods of the stochastic singular perturbation theory going back to Yu. Kabanov and Yu. Pergamentshchikov. This approach is used to regularize the above ill-posed problem, which, e.g., makes it possible to design stable numerical schemes. Several examples are provided in the presentation, which support the efficiency of the suggested analysis. The method can also be of interest in other fields of biomathematics where differential equations contain switchings, e.g., in neural field models.
Keywords: ill-posed problems, singular perturbation analysis, stochastic differential equations, switching nonlinearities
Procedia PDF Downloads 194
16921 Determination Optimum Strike Price of FX Option Call Spread with USD/IDR Volatility and Garman–Kohlhagen Model Analysis
Authors: Bangkit Adhi Nugraha, Bambang Suripto
Abstract:
In September 2016, Bank Indonesia (BI) released regulation no. 18/18/PBI/2016, which permits bank clients to use the FX option call spread on USD/IDR. Basically, this product is a combination of the client buying an FX call option (paying a premium) and selling an FX call option (receiving a premium) to protect against currency depreciation while capping the potential upside, at a cheap premium cost. BI classifies this product as a structured product. A structured product is a combination of at least two financial instruments, either derivative or non-derivative. The call spread is the first structured product against IDR permitted by BI since 2009, in response to increased demand from Indonesian firms for FX hedging through derivatives to protect against market risk on their foreign currency assets or liabilities. The share of hedging products in the Indonesian FX market increased from 35% in 2015 to 40% in 2016, with the majority in swap products (FX forward, FX swap, cross currency swap). A swap is priced off the interest rate differential of the two currencies in the pair. The cost of a swap is around 7% for USD/IDR, with one-year USD/IDR volatility at 13%. That cost level makes swap products seem expensive to hedging buyers. Because the call spread cost (around 1.5-3%) is cheaper than a swap, most Indonesian firms use NDF FX call spreads on USD/IDR offshore, with an outstanding amount of around 10 billion USD. The cheaper cost of the call spread is its main advantage for hedging buyers. The problem arises because the BI regulation requires the call spread buyer to perform dynamic hedging. That means that if the call spread buyer chooses strike price 1 and strike price 2 and the USD/IDR exchange rate surpasses strike price 2, the buyer must buy another call spread with strike price 1' (strike price 1' = strike price 2) and strike price 2' (strike price 2' > strike price 1').
This could double the premium cost of the call spread, or more, and defeat the hedging buyer's purpose of finding the cheapest hedging cost. It is therefore crucial for the buyer to choose the optimum strike prices before entering into the transaction. To help hedging buyers find the optimum strike price and avoid expensive multiple premium costs, we examine ten years (2005-2015) of historical USD/IDR volatility data, compared with the price movement of the USD/IDR call spread under the Garman–Kohlhagen model (a common formula for FX option pricing). We use statistical tools to analyze data correlation, understand the nature of call spread price movements over the ten years, and determine the factors affecting price movement. We select ranges of strike prices and tenors and calculate the probability of dynamic hedging occurring and how much it costs. We find that the USD/IDR currency pair is too uncertain, making dynamic hedging riskier and more expensive. We validated this result using one year of data, which showed a small RMS error. The study results can be used to understand the nature of the FX call spread and to determine optimum strike prices for hedging plans.
Keywords: FX call spread USD/IDR, USD/IDR volatility statistical analysis, Garman–Kohlhagen Model on FX Option USD/IDR, Bank Indonesia Regulation no.18/18/PBI/2016
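The Garman–Kohlhagen formula itself is compact enough to sketch directly; below is a minimal Python version, with the call spread priced as the difference of two calls. The sample spot, strikes, rates, and volatility are illustrative assumptions, not the paper's market data.

```python
# Minimal Garman–Kohlhagen pricer for a European FX call, plus the
# call-spread premium. Sample inputs are illustrative assumptions.
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gk_call(spot, strike, t, r_dom, r_for, vol):
    # r_dom: domestic (IDR) rate, r_for: foreign (USD) rate, t in years.
    d1 = (math.log(spot / strike) + (r_dom - r_for + 0.5 * vol**2) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    return spot * math.exp(-r_for * t) * norm_cdf(d1) \
         - strike * math.exp(-r_dom * t) * norm_cdf(d2)

def call_spread(spot, k1, k2, t, r_dom, r_for, vol):
    # Buy call at k1, sell call at k2 (k1 < k2): net premium paid.
    return gk_call(spot, k1, t, r_dom, r_for, vol) - gk_call(spot, k2, t, r_dom, r_for, vol)

if __name__ == "__main__":
    premium = call_spread(spot=13500.0, k1=13800.0, k2=14500.0,
                          t=1.0, r_dom=0.07, r_for=0.015, vol=0.13)
    print(round(premium, 2))  # net premium in IDR per USD notional
```

Repricing the spread across a grid of (k1, k2, tenor) choices under historical volatility scenarios, as the study describes, amounts to repeated calls of `call_spread` plus an estimate of the probability that spot breaches k2.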
Procedia PDF Downloads 379
16920 Investigating the UAE Residential Valuation System: A Framework for Analysis
Authors: Simon Huston, Ebraheim Lahbash, Ali Parsa
Abstract:
The development of the United Arab Emirates (UAE) into a regional trade, tourism, finance and logistics hub has transformed its real estate markets. However, speculative activity and price volatility remain concerns. UAE residential market values (MV) are exposed to fluctuations in capital flows and migration, which in turn are affected by geopolitical uncertainty, oil price volatility, and global investment market sentiment. Internally, a complex interplay between administrative boundaries, land tenure, building quality and evolving location characteristics fragments UAE residential property markets. In short, the UAE Residential Valuation System (UAE-RVS) confronts multiple challenges in collecting, filtering and analyzing relevant information in complex and dynamic spatial and capital markets. A robust RVS can mitigate the risk of unhelpful volatility, speculative excess or investment mistakes. The research outlines the institutional, ontological, dynamic, and epistemological issues at play. We highlight the importance of system capabilities, valuation standard salience and stakeholder trust.
Keywords: valuation, property rights, information, institutions, trust, salience
Procedia PDF Downloads 379
16919 Scheduling Jobs with Stochastic Processing Times or Due Dates on a Server to Minimize the Number of Tardy Jobs
Authors: H. M. Soroush
Abstract:
The problem of scheduling products and services for on-time delivery is of paramount importance in today’s competitive environments. It arises in many manufacturing and service organizations where it is desirable to complete jobs (products or services) with different weights (penalties) on or before their due dates. In such environments, schedulers must frequently decide whether to schedule a job based on its processing time, due date, and the penalty for tardy delivery in order to improve system performance. For example, it is common to measure the weighted number of late jobs or the percentage of on-time shipments to evaluate the performance of a semiconductor production facility or an automobile assembly line. In this paper, we address the problem of scheduling a set of jobs on a server where the processing times or due dates of jobs are random variables and fixed weights (penalties) are imposed on late deliveries. The goal is to find the schedule that minimizes the expected weighted number of tardy jobs. The problem is NP-hard; however, we explore three scenarios wherein: (i) both processing times and due dates are stochastic; (ii) processing times are stochastic and due dates are deterministic; and (iii) processing times are deterministic and due dates are stochastic. We prove that special cases of these scenarios are solvable optimally in polynomial time, and introduce efficient heuristic methods for the general cases. Our computational results show that the heuristics perform well in yielding either optimal or near-optimal sequences. The results also demonstrate that the stochasticity of processing times or due dates can affect scheduling decisions. Moreover, the proposed problem is general in the sense that its special cases reduce to some new and some classical stochastic single machine models.
Keywords: number of late jobs, scheduling, single server, stochastic
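For context, the fully deterministic, equal-weight special case of this problem (minimizing the plain number of tardy jobs on one machine) is solved exactly in O(n log n) by the classical Moore–Hodgson algorithm, sketched below; the job data are illustrative assumptions, and the stochastic variants studied in the paper require the heuristics it proposes.

```python
# Moore–Hodgson algorithm for the deterministic, equal-weight special
# case: minimize the number of tardy jobs on a single machine.
import heapq

def moore_hodgson(jobs):
    """jobs: list of (processing_time, due_date). Returns number of tardy jobs."""
    jobs = sorted(jobs, key=lambda j: j[1])   # earliest due date first
    scheduled = []                            # heap of negated processing times
    t = 0
    for p, d in jobs:
        t += p
        heapq.heappush(scheduled, -p)
        if t > d:
            # Deadline missed: drop the longest job scheduled so far.
            t += heapq.heappop(scheduled)     # popped value is -p_max
    return len(jobs) - len(scheduled)         # dropped jobs are the tardy ones

if __name__ == "__main__":
    jobs = [(2, 3), (4, 5), (1, 6), (3, 7), (2, 8)]
    print(moore_hodgson(jobs))  # 1 job tardy
```

Replacing deterministic times with random variables breaks the exchange argument behind this algorithm, which is why the stochastic scenarios in the abstract need dedicated analysis.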
Procedia PDF Downloads 497
16918 Low Cost Inertial Sensors Modeling Using Allan Variance
Authors: A. A. Hussen, I. N. Jleta
Abstract:
Micro-electromechanical system (MEMS) accelerometers and gyroscopes are suitable for the inertial navigation systems (INS) of many applications due to their low price, small dimensions, and light weight. The main disadvantage in comparison with classic sensors is worse long-term stability. Estimation accuracy is mostly affected by the time-dependent growth of inertial sensor errors, especially the stochastic errors. In order to eliminate the negative effect of these random errors, they must be accurately modeled, and the key to successful implementation is how well the noise statistics of the inertial sensors are characterized. In this paper, the Allan variance technique is used to model the stochastic errors of the inertial sensors. By performing a simple operation on the entire length of data, a characteristic curve is obtained whose inspection provides a systematic characterization of the various random errors contained in the inertial-sensor output data.
Keywords: Allan variance, accelerometer, gyroscope, stochastic errors
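The "simple operation on the entire length of data" can be sketched as follows: a non-overlapping Allan variance at cluster size m (averaging time tau = m * dt), applied here to a synthetic white-noise signal, which is an illustrative assumption rather than real sensor data.

```python
# Minimal non-overlapping Allan variance sketch applied to synthetic
# white noise; real gyro/accelerometer data would replace it.
import random

def allan_variance(data, dt, m):
    """Non-overlapping Allan variance at cluster size m (tau = m * dt)."""
    n_clusters = len(data) // m
    # Average the signal over consecutive clusters of m samples.
    means = [sum(data[i*m:(i+1)*m]) / m for i in range(n_clusters)]
    diffs = [(means[i+1] - means[i])**2 for i in range(n_clusters - 1)]
    return sum(diffs) / (2 * (n_clusters - 1))

if __name__ == "__main__":
    random.seed(0)
    rate = [random.gauss(0.0, 0.1) for _ in range(100000)]  # white noise, sigma = 0.1
    for m in (1, 10, 100):
        print(m, allan_variance(rate, dt=0.01, m=m))
    # For pure white noise the Allan variance falls roughly as 1/m
    # (a -1/2 slope for the Allan deviation on a log-log plot).
```

Plotting the Allan deviation against tau on log-log axes yields the characteristic curve mentioned above, whose slopes identify angle random walk, bias instability, rate random walk, and other error terms.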
Procedia PDF Downloads 442
16917 An Integration of Genetic Algorithm and Particle Swarm Optimization to Forecast Transport Energy Demand
Authors: N. R. Badurally Adam, S. R. Monebhurrun, M. Z. Dauhoo, A. Khoodaruth
Abstract:
Transport energy demand is vital for the economic growth of any country. Globalisation and a better standard of living play an important role in transport energy demand. Recently, transport energy demand in Mauritius has increased significantly, leading to an overuse of natural resources and thereby contributing to global warming. Forecasting transport energy demand is therefore important for controlling and managing the demand. In this paper, we develop a model to predict transport energy demand. The model is based on a system of five stochastic differential equations (SDEs) consisting of five endogenous variables: fuel price, population, gross domestic product (GDP), number of vehicles, and transport energy demand, and three exogenous parameters: crude birth rate, crude death rate, and labour force. An interval of seven years is used to avoid any distortion of results, since Mauritius is a developing country. Data available for Mauritius from 2003 up to 2009 are used to obtain the values of the design variables by applying a genetic algorithm. The model is verified and validated for 2010 to 2012 by substituting the coefficient values obtained by the GA into the model and using particle swarm optimisation (PSO) to predict the values of the exogenous parameters. This model will help to control the transport energy demand in Mauritius, which will in turn foster Mauritius towards a pollution-free country and decrease its dependence on fossil fuels.
Keywords: genetic algorithm, modeling, particle swarm optimization, stochastic differential equations, transport energy demand
Procedia PDF Downloads 369
16916 Optimization of Platinum Utilization by Using Stochastic Modeling of Carbon-Supported Platinum Catalyst Layer of Proton Exchange Membrane Fuel Cells
Authors: Ali Akbar, Seungho Shin, Sukkee Um
Abstract:
The composition of catalyst layers (CLs) plays an important role in the overall performance and cost of proton exchange membrane fuel cells (PEMFCs). Low platinum loading, high utilization, and more durable catalysts still remain critical challenges for PEMFCs. In this study, a three-dimensional material network model is developed to visualize the nanostructure of carbon-supported platinum Pt/C and Pt/VACNT catalysts in pursuit of maximizing catalyst utilization. The quadruple-phase randomly generated CL domain is formulated using a quasi-random stochastic Monte Carlo-based method. This unique statistical approach to a four-phase (i.e., pore, ionomer, carbon, and platinum) model closely mimics the manufacturing process of CLs. Various CL compositions are simulated to elucidate the effect of electron, ion, and mass transport paths on the catalyst utilization factor. Based on the simulation results, the effects of key factors such as porosity, ionomer content, and Pt weight percentage in the Pt/C catalyst have been investigated at the representative elementary volume (REV) scale. The results show that the relationship between ionomer content and Pt utilization is in good agreement with existing experimental calculations. Furthermore, the model is applied to state-of-the-art Pt/VACNT CLs. The simulation results for Pt/VACNT-based CLs show exceptionally high catalyst utilization compared to Pt/C with different composition ratios. More importantly, this study reveals that for Pt/VACNT the maximum catalyst utilization depends on the spacing between the carbon nanotubes. The current simulation results are expected to be utilized in the optimization of the nano-structural construction and composition of Pt/C and Pt/VACNT CLs.
Keywords: catalyst layer, platinum utilization, proton exchange membrane fuel cell, stochastic modeling
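The quasi-random stochastic generation step can be sketched as follows: each voxel of a grid is assigned one of the four phases according to target volume fractions. The grid size and fractions are illustrative assumptions, and the sketch omits the connectivity analysis of electron, ion, and gas pathways that the actual utilization calculation requires.

```python
# Minimal sketch of generating a quadruple-phase (pore / ionomer /
# carbon / platinum) voxel domain. Fractions are illustrative assumptions.
import random

PHASES = ("pore", "ionomer", "carbon", "platinum")

def generate_cl_domain(n, fractions, seed=0):
    """n^3 voxel grid; fractions: target volume fraction per phase (sums to 1)."""
    rng = random.Random(seed)
    voxels = []
    for _ in range(n ** 3):
        u = rng.random()
        acc = 0.0
        for phase, f in zip(PHASES, fractions):
            acc += f
            if u < acc:
                voxels.append(phase)
                break
        else:
            voxels.append(PHASES[-1])   # guard against rounding at u ~ 1
    return voxels

def volume_fraction(voxels, phase):
    return voxels.count(phase) / len(voxels)

if __name__ == "__main__":
    dom = generate_cl_domain(n=20, fractions=(0.4, 0.2, 0.3, 0.1))
    for p in PHASES:
        print(p, round(volume_fraction(dom, p), 3))
```

In the full model, a platinum site would count as utilized only if it simultaneously touches connected carbon (electrons), ionomer (protons), and pore (reactant gas) networks spanning the domain.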
Procedia PDF Downloads 121
16915 Heat Waves Effect on Stock Return and Volatility: Evidence from Stock Market and Selected Industries in Pakistan
Authors: Sayed Kifayat Shah, Tang Zhongjun, Arfa Tanveer
Abstract:
This study explores the significant effect of heat waves on stock return and volatility. Using an ARCH/GARCH approach, it examines the relationship between heat waves in Karachi, Islamabad, and Lahore and the KSE-100 index. It also explores the impact of heat waves on the returns of the pharmaceutical and electronics industries. The empirical results confirm that stock returns are positively related to the heat waves of Karachi, negatively related to those of Islamabad, and unaffected by the heat waves of Lahore. Similarly, the pharmaceutical and electronics indices are also positively related to heat waves. These differences in results can be ascribed to differences in the behavior of the residents of each city. The outcomes are useful for understanding investor behavior in reaction to weather and fluctuations in stock prices related to heat wave severity levels. The results can support investors in correcting behavioral biases.
Keywords: ARCH/GARCH model, heat wave, KSE-100 index, stock market return
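The core of the ARCH/GARCH approach is the conditional-variance recursion; a minimal GARCH(1,1) sketch follows. The parameter values and synthetic return series are illustrative assumptions; in practice the parameters are estimated by maximum likelihood, and a heat-wave dummy would enter the mean or variance equation.

```python
# Minimal GARCH(1,1) conditional-variance recursion on synthetic
# returns. Parameters are illustrative assumptions, not estimates.
import random

def garch_variance(returns, omega, alpha, beta):
    """sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2"""
    var = [omega / (1.0 - alpha - beta)]      # start at unconditional variance
    for r in returns[:-1]:
        var.append(omega + alpha * r * r + beta * var[-1])
    return var

if __name__ == "__main__":
    random.seed(0)
    rets = [random.gauss(0.0, 0.01) for _ in range(500)]
    sig2 = garch_variance(rets, omega=1e-6, alpha=0.08, beta=0.90)
    print(len(sig2), round(sum(sig2) / len(sig2), 8))
```

A weather effect of the kind studied here would typically be tested by adding the heat-wave indicator as an exogenous regressor and checking the significance of its coefficient.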
Procedia PDF Downloads 156
16914 Recent Volatility in Islamic Banking Sector of Bangladesh: Nexus Between Economy, Religion and Politics
Authors: Abdul Kader
Abstract:
This paper investigates several factors contributing to the recent volatility in the Islamic banking sector of Bangladesh. In particular, the study explores corporate governance, credit management, credit regulations, inept boards of directors, the use of religious sentiment as a means to deceive the public, and the degree of political interference as potential contributory factors. To find the correlation among the different variables, semi-structured questionnaires were distributed among clients, bank managers, banking scholars, and ex-members of the boards of directors of three Islamic banks in Bangladesh. Ten interviews were then conducted with key informants to gain in-depth information about the present mismanagement of Islamic banks in Bangladesh. The data were analyzed using statistical software and substantiated by secondary sources such as newspapers, reports, and investigative reports aired in screen media. The paper found a correlation between almost all the contributory factors and the recent unstable conditions in the Islamic banking sector. Regression analysis showed that some of the contributory factors have a more significant relationship with banking volatility than others. For instance, credit management, inept boards of directors, depriving customers by declaring no profit in the name of business (and hence no interest), and political interference have strong, significant positive correlations with the present poor condition of Islamic banking. This paper concludes that while internal management is important in recovering the losses, the government needs to frame better policy for the Islamic banking system, the central bank needs to supervise and monitor all Islamic banks meticulously, and loans must go through impartial evaluation and be approved by the representatives of the Central Shariah Board.
This paper also recommends strengthening the auditing system and improving regulatory oversight of Islamic banks in Bangladesh. The policy recommendations put forward here could provide an outline for dealing with the existing challenges facing Islamic banks and could be applied to similar problems in other countries where the Islamic banking model exists.
Keywords: Islamic bank, volatility in banking sector, shariah law, credit management, political interference
Procedia PDF Downloads 78
16913 Estimation of Thermal Conductivity of Nanofluids Using MD-Stochastic Simulation-Based Approach
Authors: Sujoy Das, M. M. Ghosh
Abstract:
The thermal conductivity of a fluid can be significantly enhanced by dispersing nano-sized particles in it; the resultant fluid is termed a "nanofluid". A theoretical model for estimating the thermal conductivity of a nanofluid is proposed here. It is based on the mechanism that evenly dispersed nanoparticles within a nanofluid undergo Brownian motion, in the course of which the nanoparticles repeatedly collide with the heat source. During each collision, a rapid heat transfer occurs owing to the solid-solid contact. Molecular dynamics (MD) simulation of the collision of nanoparticles with the heat source has shown that there is a pulse-like pick-up of heat by the nanoparticles within 20-100 ps, the extent of which depends not only on the thermal conductivity of the nanoparticles but also on their elastic and other physical properties. After the collision, the nanoparticles undergo Brownian motion in the base fluid and release the excess heat to the surrounding base fluid within 2-10 ms. The Brownian motion and associated temperature variation of the nanoparticles have been modeled by stochastic analysis. Repeated occurrence of these events by the suspended nanoparticles significantly contributes to the characteristic thermal conductivity of the nanofluid, which has been estimated by the present model for an ethylene glycol based nanofluid containing Cu nanoparticles of size ranging from 8 to 20 nm, with a Gaussian size distribution. The prediction of the present model shows reasonable agreement with the experimental data available in the literature.
Keywords: Brownian dynamics, molecular dynamics, nanofluid, thermal conductivity
Procedia PDF Downloads 371
16912 Heteroscedastic Parametric and Semiparametric Smooth Coefficient Stochastic Frontier Application to Technical Efficiency Measurement
Authors: Rebecca Owusu Coffie, Atakelty Hailu
Abstract:
Variants of production frontier models have emerged; however, only a limited number of them are applied in empirical research. Hence, the effects of these alternative frontier models are not well understood, particularly within sub-Saharan Africa. In this paper, we apply recent advances in production frontier modeling to examine levels of technical efficiency and efficiency drivers. Specifically, we compare the heteroscedastic parametric and the semiparametric stochastic smooth coefficient (SPSC) models. Using rice production data from Ghana, our empirical estimates reveal that alternative specifications of efficiency estimators result in either downward or upward bias in the technical efficiency estimates. Methodologically, we find that the SPSC model is more suitable and generates high efficiency estimates. Within the parametric framework, we find that parameterization of both the mean and variance of the pre-truncated function gives the best model. For the drivers of technical efficiency, we observe that longer farm distances increase inefficiency through a reduction in labor productivity. High soil quality, however, increases productivity through increased land productivity.
Keywords: pre-truncated, rice production, smooth coefficient, technical efficiency
Procedia PDF Downloads 445
16911 An Approach to Noise Variance Estimation in Very Low Signal-to-Noise Ratio Stochastic Signals
Authors: Miljan B. Petrović, Dušan B. Petrović, Goran S. Nikolić
Abstract:
This paper describes a method for AWGN (Additive White Gaussian Noise) variance estimation in noisy stochastic signals, referred to as Multiplicative-Noising Variance Estimation (MNVE). The aim was to develop an estimation algorithm with a minimal number of assumptions about the original signal structure. MATLAB simulations and analysis of the method applied to speech signals showed better accuracy than the standardized AR (autoregressive) modeling noise estimation technique. In addition, strong performance was observed at very low signal-to-noise ratios, which in general represent the worst-case scenario for signal denoising methods. High execution time appears to be the only disadvantage of MNVE. After close examination of all the observed features of the proposed algorithm, it was concluded that the method is worth exploring and that, with some further adjustments and improvements, it can be enviably powerful.
Keywords: noise, signal-to-noise ratio, stochastic signals, variance estimation
Procedia PDF Downloads 386
16910 Valuation of Caps and Floors in a LIBOR Market Model with Markov Jump Risks
Authors: Shih-Kuei Lin
Abstract:
The characterization of the arbitrage-free dynamics of interest rates is developed in this study in the presence of Markov jump risks, when the term structure of interest rates is modeled through simple forward rates. We consider Markov jump risks by allowing randomness in jump sizes and independence between jump sizes and jump times. The Markov jump diffusion model is used to capture empirical phenomena and to accurately describe interest rate jump risks in a financial market. We derive the arbitrage-free model of simple forward rates under the spot measure. Moreover, the analytical pricing formulas for a cap and a floor are derived under the forward measure when the jump size follows a lognormal distribution. In our empirical analysis, we find that the LIBOR market model with Markov jump risk better accounts for changes from/to different states and different rates.
Keywords: arbitrage-free, cap and floor, Markov jump diffusion model, simple forward rate model, volatility smile, EM algorithm
Procedia PDF Downloads 421
16909 An Accelerated Stochastic Gradient Method with Momentum
Authors: Liang Liu, Xiaopeng Luo
Abstract:
In this paper, we propose an accelerated stochastic gradient method with momentum. The momentum term is a weighted average of the generated gradients, with weights that decay in inverse proportion to the iteration count. Stochastic gradient descent with momentum (SGDM) instead uses weights that decay exponentially with the iteration count to generate the momentum term. Using exponential decay weights, variants of SGDM with opaque and complicated update rules have been proposed to achieve better performance; the momentum update rules of our method, however, are as simple as those of SGDM. We provide theoretical convergence analyses showing that both the exponential decay weights and our inverse-proportional decay weights confine the variance of the parameter movement to a bounded region. Experimental results show that our method works well on many practical problems and outperforms SGDM.
Keywords: exponential decay rate weight, gradient descent, inverse proportional decay rate weight, momentum
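The exact weighting scheme is not given in the abstract. One simple reading of "weights that decay in inverse proportion to the iteration count" is a momentum term that is the uniformly weighted running average of all past gradients (each new gradient enters with weight 1/t); the sketch below uses that assumption and is illustrative only:

```python
def inverse_decay_momentum(gradients):
    """Momentum sequence where the weight on each new gradient is 1/t,
    making m_t the running average of all gradients seen so far --
    one simple reading of the abstract's inverse-proportional weights."""
    m, out = 0.0, []
    for t, g in enumerate(gradients, start=1):
        m = (1.0 - 1.0 / t) * m + (1.0 / t) * g
        out.append(m)
    return out

def sgd_with_inverse_decay_momentum(grad, x0, lr, steps):
    """Plain SGD driven by the averaged-gradient momentum above."""
    x, m = x0, 0.0
    for t in range(1, steps + 1):
        m = (1.0 - 1.0 / t) * m + (1.0 / t) * grad(x)
        x -= lr * m
    return x

# toy quadratic f(x) = (x - 3)^2 with exact gradient 2*(x - 3);
# convergence on this deterministic toy is slow and oscillatory
x_star = sgd_with_inverse_decay_momentum(lambda x: 2.0 * (x - 3.0),
                                         x0=0.0, lr=0.1, steps=2000)
```

Compare with SGDM, where the update would be m = beta * m + (1 - beta) * g for a fixed beta, i.e., exponentially decaying weights on past gradients.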
Procedia PDF Downloads 162
16908 Synthesis of Filtering in Stochastic Systems on Continuous-Time Memory Observations in the Presence of Anomalous Noises
Authors: S. Rozhkova, O. Rozhkova, A. Harlova, V. Lasukov
Abstract:
We have conducted the optimal synthesis of a root-mean-square criterion filter to estimate the state vector in the case where, within an observation channel with memory, anomalous noises with unknown mathematical expectation supplement the regular noises. The synthesis has been carried out for continuous-time linear stochastic systems.
Keywords: mathematical expectation, filtration, anomalous noise, memory
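The memory-observation synthesis itself is not given in the abstract. For contrast, a minimal discrete-time scalar Kalman filter for the regular-noise, memoryless case — the baseline that anomalous noises with unknown mean would corrupt — can be sketched as follows (all parameter values are illustrative):

```python
def kalman_filter_1d(measurements, a, c, q, r, x0, p0):
    """Discrete-time scalar Kalman filter (regular zero-mean noise only).

    State:       x[k+1] = a*x[k] + process noise with variance q
    Observation: z[k]   = c*x[k] + measurement noise with variance r
    Returns the sequence of filtered state estimates.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # predict
        x = a * x
        p = a * p * a + q
        # update
        k = p * c / (c * p * c + r)
        x = x + k * (z - c * x)
        p = (1.0 - k * c) * p
        estimates.append(x)
    return estimates

# noisy observations of a roughly constant state near 1.0
est = kalman_filter_1d([1.1, 0.9, 1.05, 0.98], a=1.0, c=1.0,
                       q=0.01, r=0.1, x0=0.0, p0=1.0)
```

A noise term with unknown nonzero mean, as considered in the paper, would bias these estimates, which is what motivates the augmented synthesis.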
Procedia PDF Downloads 247
16907 Modelling Volatility of Cryptocurrencies: Evidence from GARCH Family of Models with Skewed Error Innovation Distributions
Authors: Timothy Kayode Samson, Adedoyin Isola Lawal
Abstract:
The past five years have shown a sharp increase in public interest in the crypto market, with its market capitalization growing from $100 billion in June 2017 to $2,158.42 billion on April 5, 2022. Despite the extreme volatility of cryptocurrencies, the use of skewed error innovation distributions in modelling the volatility behaviour of these digital currencies has received little research attention. Hence, this study models the volatility of the 5 largest cryptocurrencies by market capitalization (Bitcoin, Ethereum, Tether, Binance coin, and USD Coin) using four variants of GARCH models (GJR-GARCH, sGARCH, EGARCH, and APARCH) estimated under three skewed error innovation distributions (skewed normal, skewed Student-t, and skewed generalized error distributions). Daily closing prices of these currencies were obtained from the Yahoo Finance website. Findings reveal that Binance coin reported higher mean returns than the other digital currencies, while the skewness indicates that Binance coin, Tether, and USD Coin increased more than they decreased in value within the period of study. For both Bitcoin and Ethereum, negative skewness was obtained, meaning that within the period of study the returns of these currencies decreased more than they increased in value. Returns from these cryptocurrencies were found to be stationary but not normally distributed, with evidence of the ARCH effect. The skewness parameters in all best forecasting models were significant (p<.05), justifying the use of skewed error innovation distributions with fatter tails than the normal, Student-t, and generalized error innovation distributions. For Binance coin, EGARCH-sstd outperformed the other volatility models, while for Bitcoin, Ethereum, Tether, and USD Coin, the best forecasting models were EGARCH-sstd, APARCH-sstd, EGARCH-sged, and GJR-GARCH-sstd, respectively.
This suggests the superiority of the skewed Student-t distribution and skewed generalized error distribution over the skewed normal distribution.
Keywords: skewed generalized error distribution, skewed normal distribution, skewed student t- distribution, APARCH, EGARCH, sGARCH, GJR-GARCH
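The study's models were estimated with skewed innovation distributions, and the abstract reports no parameter values. As a minimal sketch of the shared backbone of these models, the sGARCH(1,1) conditional variance recursion — with hypothetical parameter values — is:

```python
def garch11_variance(returns, omega, alpha, beta, initial_var):
    """Conditional variance recursion of a standard GARCH(1,1) (sGARCH):

        sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]

    EGARCH, GJR-GARCH, and APARCH modify this recursion to capture
    asymmetry; the innovation distribution (e.g. skewed Student-t)
    enters at the estimation stage, not in this recursion itself.
    """
    sigma2 = [initial_var]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

# hypothetical daily returns and parameters
variances = garch11_variance([0.02, -0.05, 0.01, 0.03],
                             omega=1e-5, alpha=0.1, beta=0.85,
                             initial_var=4e-4)
```

Note how the large return of -0.05 raises the next period's conditional variance, which is the volatility-clustering behaviour the ARCH-effect tests in the study detect.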
Procedia PDF Downloads 119
16906 Determining the Effects of Wind-Aided Midge Movement on the Probability of Coexistence of Multiple Bluetongue Virus Serotypes in Patchy Environments
Authors: Francis Mugabi, Kevin Duffy, Joseph J. Y. T Mugisha, Obiora Collins
Abstract:
Bluetongue virus (BTV) has 27 serotypes, some of which coexist in patchy (distinct) environments, which makes its control difficult. Wind-aided midge movement is a known mechanism in the spread of BTV. However, its effects on the probability of coexistence of multiple BTV serotypes are not clear. Deterministic and stochastic models for r BTV serotypes in n discrete patches connected by midge and/or cattle movement are formulated and analyzed. For the deterministic model without midge and cattle movement, using the comparison principle, it is shown that if the patch reproduction numbers satisfy R^j_i0 < 1 for all i=1,2,...,n and j=1,2,...,r, all serotypes go extinct; if R^j_i0 > 1, competitive exclusion takes place. Using numerical simulations, it is shown that when the n patches are connected by midge movement, coexistence takes place. To account for demographic and movement variability, the deterministic model is transformed into a continuous-time Markov chain stochastic model. Utilizing a multitype branching process, it is shown that midge movement can have a large effect on the probability of coexistence of multiple BTV serotypes. The probability of coexistence can be brought to zero when control interventions that directly kill adult midges are applied. These results indicate the significance of wind-aided midge movement and vector control interventions for the coexistence and control of multiple BTV serotypes in patchy environments.
Keywords: bluetongue virus, coexistence, multiple serotypes, midge movement, branching process
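The paper uses a multitype branching process; the single-type sketch below, with an assumed Poisson offspring distribution, only illustrates the underlying computation: the extinction probability is the smallest fixed point of the offspring probability generating function, and persistence (the coexistence analogue) has probability one minus it.

```python
import math

def extinction_probability(offspring_pgf, tol=1e-12, max_iter=10000):
    """Extinction probability of a single-type Galton-Watson branching
    process: the smallest fixed point of the offspring probability
    generating function f, found by iterating q <- f(q) from q = 0."""
    q = 0.0
    for _ in range(max_iter):
        q_next = offspring_pgf(q)
        if abs(q_next - q) < tol:
            return q_next
        q = q_next
    return q

# Assumed Poisson(2) offspring distribution: f(s) = exp(2*(s - 1)).
q = extinction_probability(lambda s: math.exp(2.0 * (s - 1.0)))
survival = 1.0 - q  # about 0.797
```

In the multitype setting of the paper, q becomes a vector (one component per serotype-patch type) and the fixed-point iteration runs over a vector of generating functions; interventions that kill adult midges shrink the offspring means until q reaches 1, driving the coexistence probability to zero.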
Procedia PDF Downloads 150
16905 Parameter Estimation for Contact Tracing in Graph-Based Models
Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar
Abstract:
We adopt a maximum-likelihood framework to estimate the parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows one to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited to realistic situations (a small contact tracing probability, or a small probability of detecting index cases). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution in particular fit the data well, and that the tracing probability is rather large. The sensitivity analysis shows no strong dependence on the reproduction number.
Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference
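The paper's likelihood is not reproduced in the abstract. As a toy stand-in, the sketch below simulates detectee counts per index case under an assumed Poisson degree distribution and recovers the tracing probability by a simple method-of-moments step (mean detectees = mean degree × tracing probability); all names and parameter values are hypothetical.

```python
import math
import random

def poisson(lam, rng):
    """Knuth's algorithm for sampling a Poisson(lam) variate."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_detectees(n_cases, mean_degree, trace_prob, rng):
    """Toy tree model: each index case has Poisson(mean_degree) contacts,
    and each contact is independently detected with probability
    trace_prob. Returns detectee counts per index case."""
    counts = []
    for _ in range(n_cases):
        degree = poisson(mean_degree, rng)
        counts.append(sum(rng.random() < trace_prob for _ in range(degree)))
    return counts

rng = random.Random(7)
counts = simulate_detectees(50000, mean_degree=4.0, trace_prob=0.3, rng=rng)
# With the mean degree assumed known, a method-of-moments counterpart of
# the paper's MLE recovers the tracing probability:
p_hat = (sum(counts) / len(counts)) / 4.0
```

The actual estimator additionally infers the degree distribution itself and corrects for undiscovered infectees, which is where the maximum-likelihood machinery is needed.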
Procedia PDF Downloads 77
16904 Urban Growth Analysis Using Multi-Temporal Satellite Images, Non-stationary Decomposition Methods and Stochastic Modeling
Authors: Ali Ben Abbes, Imed Riadh Farah, Vincent Barra
Abstract:
Remotely sensed data are a significant source for monitoring and updating land use/cover databases. Nowadays, change detection in urban areas is the subject of intensive research, and timely, accurate data on the spatio-temporal changes of urban areas are required. The data extracted from multi-temporal satellite images are usually non-stationary: the changes evolve in time and space. This paper proposes a methodology for change detection in urban areas that combines a non-stationary decomposition method with stochastic modeling. The input of our methodology is a sequence of satellite images I1, I2, ..., In at different periods (t = 1, 2, ..., n). First, a preprocessing of the multi-temporal satellite images is applied (e.g., radiometric, atmospheric, and geometric corrections). The systematic study of global urban expansion in our methodology can be approached in two ways: the first considers the urban area as a single object, as opposed to non-urban areas (e.g., vegetation, bare soil, and water), with the objective of extracting the urban mask; the second aims to obtain more detailed knowledge of the urban area by distinguishing different types of tissue within it. To validate our approach, we used a database of Tres Cantos (Madrid, Spain) derived from Landsat over the period January 2004 to July 2013, collecting two frames per year at a spatial resolution of 25 meters. The obtained results show the effectiveness of our method.
Keywords: multi-temporal satellite image, urban growth, non-stationary, stochastic model
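As a far simpler stand-in for the paper's non-stationary decomposition and stochastic modeling, the toy sketch below shows the elementary pixel-wise differencing step for extracting a change mask from two co-registered, preprocessed single-band images; values and threshold are hypothetical.

```python
def change_mask(image_t1, image_t2, threshold):
    """Toy pixel-wise change detection between two co-registered,
    radiometrically corrected single-band images (nested lists):
    a pixel is flagged as changed when the absolute difference of
    its values at the two dates exceeds the threshold."""
    return [[abs(b - a) > threshold for a, b in zip(row1, row2)]
            for row1, row2 in zip(image_t1, image_t2)]

# hypothetical 2x2 reflectance values at two dates
t1 = [[10, 12], [40, 42]]
t2 = [[11, 30], [41, 90]]
mask = change_mask(t1, t2, threshold=5)
```

Simple differencing cannot handle the non-stationarity the paper targets (seasonal and sensor effects vary over the n-image sequence), which is what motivates the decomposition-plus-stochastic-model pipeline.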
Procedia PDF Downloads 428
16903 An Alternative Richards’ Growth Model Based on Hyperbolic Sine Function
Authors: Samuel Oluwafemi Oyamakin, Angela Unna Chukwu
Abstract:
Richards' growth equation, being a generalized logistic growth equation, was improved upon by introducing an allometric parameter using the hyperbolic sine function. The integral solution of this equation was called the hyperbolic Richards growth model, and the solution was transformed from a deterministic to a stochastic growth model. Its predictive ability was compared with that of the classical Richards growth model; the new approach mimics the natural variability of height/diameter increments with respect to age and therefore provides more realistic height/diameter predictions, assessed using the coefficient of determination (R2), Mean Absolute Error (MAE), and Mean Square Error (MSE). The Kolmogorov-Smirnov and Shapiro-Wilk tests were also used to examine the behaviour of the error term for possible violations. The mean function of top height/Dbh over age under the two models predicted the observed values of top height/Dbh more closely in the hyperbolic Richards nonlinear growth model than in the classical Richards growth model.
Keywords: height, diameter at breast height, DBH, hyperbolic sine function, Pinus caribaea, Richards' growth model
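The abstract does not give the exact hyperbolic modification. The sketch below shows the classical Richards (generalized logistic) curve in one common parameterisation, alongside a purely hypothetical hyperbolic-sine variant in which the decay argument is augmented by an allometric asinh term; the second function is an illustration of the idea, not the paper's formula.

```python
import math

def richards(t, a, b, k, m):
    """Classical Richards (generalized logistic) growth curve, in one
    common parameterisation: H(t) = a / (1 + b*exp(-k*t))**(1/m),
    with asymptote a and shape parameter m."""
    return a / (1.0 + b * math.exp(-k * t)) ** (1.0 / m)

def hyperbolic_richards(t, a, b, k, m, theta):
    """Hypothetical sketch of a hyperbolic-sine modification: the decay
    argument k*t is augmented by an allometric term theta*asinh(t).
    The paper's exact formulation is not given in the abstract."""
    arg = k * t + theta * math.asinh(t)
    return a / (1.0 + b * math.exp(-arg)) ** (1.0 / m)

# hypothetical stand parameters: asymptotic height 30 m at age 20 years
h20 = richards(20.0, a=30.0, b=5.0, k=0.15, m=1.0)
```

In the stochastic version described in the abstract, an error process would be added around such a mean function, so that predicted increments vary naturally with age.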
Procedia PDF Downloads 392
16902 Effect of Agricultural Extension Services on Technical Efficiency of Smallholder Cassava Farmers in Ghana: A Stochastic Meta-Frontier Analysis
Authors: Arnold Missiame
Abstract:
In Ghana, rural dwellers who depend primarily on agriculture for their livelihood constitute about 60% of the country’s population. This shows the critical role and potential of the agricultural sector in helping to achieve Ghana’s Vision 2030. With the current threat of climate change and advancements in technology, agricultural extension is not just about technology transfer and improvements in productivity; it is also about improving the managerial and technical skills of farmers. In Ghana, the government as well as other players in the sector, such as non-governmental organizations (NGOs) and local and international funding agencies, have for decades made capacity-building investments in smallholder farmers through extension service delivery. This study sought to compare the technical efficiency of Ghanaian farmers with and without access to agricultural extension services. The study employed the stochastic meta-frontier model to analyze household survey data comprising 300 smallholder cassava farmers from the Fanteakwa district of Ghana. The farmers were selected through a two-stage sampling technique in which 5 communities were purposively selected in the first stage and 60 smallholder cassava farmers were then randomly selected from each of the 5 communities. Semi-structured questionnaires were used to collect data on farmers’ socioeconomic and farm-level characteristics. The results showed that farmers who have access to agricultural extension services have higher technical efficiency (TE) and produce much closer to their meta-production frontier (higher technology gap ratios, TGR) than farmers who do not have access to such services. Furthermore, experience in cassava cultivation and formal education significantly improve the technical efficiency of farmers.
The study recommends that the mode and scope of agricultural extension service delivery in the country be enhanced to ensure that smallholder farmers have easy access to extension agents.
Keywords: agricultural extension, Ghana, smallholder farmers, stochastic meta-frontier model, technical efficiency
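In the stochastic meta-frontier framework, a farmer's efficiency relative to the meta-frontier decomposes as the product of the group-frontier technical efficiency and the technology gap ratio. A minimal sketch of this decomposition, with purely hypothetical numbers (the abstract reports no point estimates), is:

```python
def metafrontier_te(group_te, tgr):
    """Meta-frontier technical efficiency as the product of the
    group-frontier technical efficiency and the technology gap ratio:
    TE* = TE_group * TGR (standard stochastic meta-frontier
    decomposition). Both inputs lie in (0, 1]."""
    return group_te * tgr

# hypothetical illustration: a farmer with extension access vs. one without
with_access = metafrontier_te(group_te=0.85, tgr=0.92)
without_access = metafrontier_te(group_te=0.80, tgr=0.75)
```

The study's finding that extension-access farmers have both higher TE and higher TGR means both factors of this product favour them, compounding into a larger meta-frontier efficiency gap.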
Procedia PDF Downloads 108
16901 An Empirical Analysis of the Effects of Corporate Derivatives Use on the Underlying Stock Price Exposure: South African Evidence
Authors: Edson Vengesai
Abstract:
Derivative products have become essential instruments in portfolio diversification, price discovery, and, most importantly, risk hedging. Derivatives are complex instruments, and their valuation, volatility implications, and real impact on the behaviour of the underlying assets are not well understood. Little is documented empirically, and conclusions on how these instruments affect firm risk exposures conflict. Given the growing interest in using derivatives in risk management and portfolio engineering, this study examines the practical impact of derivative usage on the underlying stock price exposure and systematic risk. The paper uses data from South African listed firms. The study employs GARCH models to understand the effect of derivative use on conditional stock volatility. GMM models are used to estimate the effect of derivative use on the systematic risk of stocks, as measured by beta, and on their total risk, as measured by the standard deviation of returns. The results provide evidence on whether derivative use is instrumental in reducing the systematic and total risk of stock returns. The results are subjected to numerous robustness controls, including financial leverage, firm size, growth opportunities, and macroeconomic effects.
Keywords: derivatives use, hedging, volatility, stock price exposure
Procedia PDF Downloads 108
16900 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke’s model, non-isotropic scattering, partially developed scattering, Rician distribution
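For reference, the classical isotropic Clarke model admits a closed-form Doppler-shift PDF; the non-isotropic and line-of-sight cases analysed in the paper generalize this U-shaped density. A minimal sketch:

```python
import math

def clarke_doppler_pdf(nu, f_max):
    """PDF of the Doppler shift under the classical (isotropic) Clarke
    model: f(nu) = 1 / (pi * f_max * sqrt(1 - (nu/f_max)**2)) for
    |nu| < f_max, and zero otherwise, where f_max is the maximum
    Doppler frequency v/lambda. Non-isotropic scattering and a
    dominant line-of-sight component reshape this density."""
    if abs(nu) >= f_max:
        return 0.0
    return 1.0 / (math.pi * f_max * math.sqrt(1.0 - (nu / f_max) ** 2))

# density at zero shift for a 100 Hz maximum Doppler frequency
p0 = clarke_doppler_pdf(0.0, f_max=100.0)
```

The density diverges at the band edges ±f_max, which is the familiar "bathtub" shape of the corresponding Jakes Doppler power spectrum.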
Procedia PDF Downloads 372
16899 ARIMA-GARCH, A Statistical Modeling for Epileptic Seizure Prediction
Authors: Salman Mohamadi, Seyed Mohammad Ali Tayaranian Hosseini, Hamidreza Amindavar
Abstract:
In this paper, we provide a procedure to analyze and model the EEG (electroencephalogram) signal as a time series using ARIMA-GARCH to predict an epileptic attack. The heteroskedasticity of the EEG signal is examined through ARCH or GARCH (autoregressive conditional heteroskedasticity, generalized autoregressive conditional heteroskedasticity) tests. The best ARIMA-GARCH model in the AIC sense is utilized to measure the volatility of EEG recordings from epileptic canine subjects and to forecast future EEG values. An ARIMA-only model can perform prediction, but an ARCH or GARCH model acting on the residuals of the ARIMA attains a considerably improved forecast horizon. First, we estimate the best ARIMA model; then different orders of ARCH and GARCH models are surveyed to determine the best heteroskedastic model for the residuals of that ARIMA. Using the simulated conditional variance of the selected ARCH or GARCH model, we suggest a procedure to predict oncoming seizures. The results indicate that GARCH modeling captures the dynamic changes of variance well before the onset of a seizure. It can be inferred that the prediction capability comes from the ability of the combined ARIMA-GARCH modeling to cover the heteroskedastic nature of EEG signal changes.
Keywords: epileptic seizure prediction, ARIMA, ARCH and GARCH modeling, heteroskedasticity, EEG
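The paper's model-selection procedure is not detailed in the abstract. The sketch below illustrates the two-stage idea on the simplest members of each family, AR(1) followed by ARCH(1) on its residuals, with a synthetic heteroskedastic series standing in for EEG data; parameter values and the closing alarm rule are assumptions, not the paper's.

```python
import math
import random

def fit_ar1(x):
    """Yule-Walker estimate of an AR(1) coefficient:
    lag-1 autocovariance divided by the variance."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    c1 = sum((x[t] - mean) * (x[t - 1] - mean) for t in range(1, n)) / n
    return c1 / c0

def fit_arch1(residuals):
    """OLS fit of ARCH(1) on the squared residuals:
    e[t]^2 ~ omega + alpha * e[t-1]^2."""
    y = [e * e for e in residuals[1:]]
    z = [e * e for e in residuals[:-1]]
    n = len(y)
    mz, my = sum(z) / n, sum(y) / n
    alpha = (sum((a - mz) * (b - my) for a, b in zip(z, y))
             / sum((a - mz) ** 2 for a in z))
    omega = my - alpha * mz
    return omega, alpha

# Simulate an AR(1) series with ARCH(1) innovations (hypothetical
# parameters, standing in for an EEG-like heteroskedastic series).
rng = random.Random(0)
phi_true, omega_true, alpha_true = 0.5, 0.2, 0.3
series, e_prev, level = [], 0.0, 0.0
for _ in range(30000):
    sigma2 = omega_true + alpha_true * e_prev ** 2
    e_prev = math.sqrt(sigma2) * rng.gauss(0.0, 1.0)
    level = phi_true * level + e_prev
    series.append(level)

phi_hat = fit_ar1(series)
residuals = [series[t] - phi_hat * series[t - 1]
             for t in range(1, len(series))]
omega_hat, alpha_hat = fit_arch1(residuals)
# A seizure alarm could then compare the one-step-ahead conditional
# variance omega_hat + alpha_hat * e[t]**2 against a calibrated threshold.
```

The full procedure selects ARIMA and GARCH orders by AIC rather than fixing them at (1) and (1,1), but the mean-model-then-variance-model structure is the same.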
Procedia PDF Downloads 406
16898 Frailty Patterns in the US and Implications for Long-Term Care
Authors: Joelle Fong
Abstract:
Older persons are at greatest risk of becoming frail. As survival to age 80 and beyond continues to increase, the health and frailty of older Americans have garnered much recent attention among policy makers and healthcare administrators. This paper examines patterns in old-age frailty within a multistate actuarial model that characterizes the stochastic process of biological ageing. Using aggregate population-level U.S. mortality data, we implement a stochastic ageing model to examine cohort trends and gender differences in frailty distributions for older Americans born 1865-1894. The stochastic ageing model, which draws from the fields of actuarial science and gerontology, is well established in the literature. The implications for public health insurance programs are also discussed. Our results suggest that, on average, women tend to be frailer than men at older ages, and they reveal useful insights about the magnitude of the male-female differential at critical age points. Specifically, we note that the frailty statuses of males and females are quite comparable from ages 65 to 80. Beyond age 80, however, frailty levels start to diverge considerably, implying that women move more quickly into worse states of health than men. Tracking average frailty by gender over 30 successive birth cohorts, we also find that frailty levels for both genders follow a distinct peak-and-trough pattern. For instance, frailty among 85-year-old American survivors increased in 1954-1963, decreased in 1964-1971, and again started to increase in 1972-1979. A number of factors may have accounted for these cohort differences, including differences in cohort life histories, disease prevalence, lifestyle and behavior, differential access to medical advances, and changes in environmental risk factors over time.
We conclude with a discussion of the implications of our findings for spending on long-term care programs within the broader health insurance system.
Keywords: actuarial modeling, cohort analysis, frail elderly, health
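The paper's calibrated model is not reproduced in the abstract. The toy discrete-time Markov chain below, with hypothetical annual transition probabilities between three states, only illustrates the multistate-simulation mechanics behind this kind of frailty analysis:

```python
import random

# Hypothetical 3-state frailty chain with annual transitions; the actual
# stochastic ageing model in the paper is calibrated to US mortality data.
STATES = ["healthy", "frail", "dead"]
TRANSITIONS = {
    "healthy": [("healthy", 0.90), ("frail", 0.08), ("dead", 0.02)],
    "frail":   [("healthy", 0.05), ("frail", 0.80), ("dead", 0.15)],
    "dead":    [("dead", 1.00)],
}

def step(state, rng):
    """Draw the next state from the transition distribution."""
    u, cum = rng.random(), 0.0
    for nxt, p in TRANSITIONS[state]:
        cum += p
        if u < cum:
            return nxt
    return state

def simulate_cohort(n, years, rng):
    """Simulate n individuals over a number of annual transitions and
    return the fraction of the cohort ending in each state."""
    final = {s: 0 for s in STATES}
    for _ in range(n):
        state = "healthy"
        for _ in range(years):
            state = step(state, rng)
        final[state] += 1
    return {s: c / n for s, c in final.items()}

# hypothetical cohort entering at age 65, followed for 15 years
dist = simulate_cohort(20000, years=15, rng=random.Random(3))
```

Gender and cohort comparisons of the kind reported in the paper amount to calibrating separate transition matrices per group and comparing the resulting frailty-state distributions at each age.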
Procedia PDF Downloads 244