Search results for: stochastic volatility
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 646

406 Technical Efficiency of Small-Scale Honey Producer in Ethiopia: A Stochastic Frontier Analysis

Authors: Kaleb Shiferaw, Berhanu Geberemedhin

Abstract:

Ethiopian farmers have a long tradition of beekeeping, and the country has huge potential for honey production. However, a traditional mode of production still dominates the sub-sector, which negatively affects total production and productivity. A number of studies have been conducted to better understand honey production, but none of them systematically investigates the extent of technical efficiency in the sub-sector. This paper uses a stochastic frontier production model to quantify the extent of technical efficiency and identify exogenous determinants of inefficiency. The results show that, consistent with other studies, traditional practices dominate small-scale honey production in Ethiopia. The findings also reveal that the use of purchased inputs such as bee forage and other supplements is very limited among honey producers, indicating that natural bee forage is the primary source of forage. The immediate consequence is low production and productivity. The number of hives the household owns, whether the household used improved apiculture technologies, the availability of natural forest (the primary source of nectar for bees), and the amount of land owned by the household were found to have a significant influence on the amount of honey produced by a beekeeper. Our results further show that the mean technical efficiency of honey producers is 0.79, implying that, on average, honey producers obtain about 80 percent of the maximum attainable output, with roughly 20 percent of the potential output lost to technical inefficiency. The number of hives owned by a honey producer, distance to the district town (a proxy for market access), household wealth, and whether the household head has a leadership role in the PA affect the technical efficiency of honey producers. The findings suggest that policies that aim to expand the use of improved hives can be expected to increase honey production at the household level. The results also suggest that investment in rural infrastructure would be instrumental in improving the technical efficiency of honey producers.
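
For readers unfamiliar with how a stochastic frontier of this kind is estimated, a minimal sketch follows. It fits a Cobb-Douglas production frontier with a half-normal inefficiency term by maximum likelihood; the variable names (hives, land, log honey output) and the simulated data are purely illustrative assumptions, not the authors' dataset.

```python
# Minimal stochastic frontier sketch (half-normal inefficiency model).
# Illustrative only: variable names and simulated data are hypothetical.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 500
hives = rng.uniform(1, 20, n)          # number of hives (illustrative input)
land = rng.uniform(0.5, 5, n)          # land owned, hectares (illustrative input)
X = np.column_stack([np.ones(n), np.log(hives), np.log(land)])
v = rng.normal(0, 0.2, n)              # symmetric noise
u = np.abs(rng.normal(0, 0.3, n))      # one-sided inefficiency
log_honey = X @ np.array([1.0, 0.6, 0.2]) + v - u   # log output

def neg_loglik(theta):
    beta, log_sv, log_su = theta[:3], theta[3], theta[4]
    sv, su = np.exp(log_sv), np.exp(log_su)
    sigma = np.sqrt(sv**2 + su**2)
    lam = su / sv
    eps = log_honey - X @ beta
    # Aigner-Lovell-Schmidt log-density for a production frontier
    ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = minimize(neg_loglik, x0=np.zeros(5), method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-8})
beta_hat = res.x[:3]
su_hat = np.exp(res.x[4])
# Mean technical efficiency under the half-normal assumption: E[exp(-u)]
mean_te = 2 * np.exp(su_hat**2 / 2) * norm.cdf(-su_hat)
print("frontier coefficients:", beta_hat.round(3))
print("mean technical efficiency:", round(mean_te, 3))
```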

Keywords: small-scale honey producer, Ethiopia, technical efficiency in apiculture, stochastic frontier analysis

Procedia PDF Downloads 200
405 Parameter Estimation for Contact Tracing in Graph-Based Models

Authors: Augustine Okolie, Johannes Müller, Mirjam Kretzchmar

Abstract:

We adopt a maximum-likelihood framework to estimate parameters of a stochastic susceptible-infected-recovered (SIR) model with contact tracing on a rooted random tree. Given the number of detectees per index case, our estimator allows us to determine the degree distribution of the random tree as well as the tracing probability. Since we do not discover all infectees via contact tracing, this estimation is non-trivial. To keep things simple and stable, we develop an approximation suited for realistic situations (a small contact tracing probability, or a small probability of detecting index cases). In this approximation, the only epidemiological parameter entering the estimator is the basic reproduction number R0. The estimator is tested in a simulation study and applied to COVID-19 contact tracing data from India. The simulation study underlines the efficiency of the method. For the empirical COVID-19 data, we are able to compare different degree distributions and perform a sensitivity analysis. We find that a power-law and a negative binomial degree distribution, in particular, fit the data well and that the tracing probability is rather large. The sensitivity analysis shows no strong dependence on the reproduction number.

Keywords: stochastic SIR model on graph, contact tracing, branching process, parameter inference

Procedia PDF Downloads 43
404 A Stochastic Vehicle Routing Problem with Ordered Customers and Collection of Two Similar Products

Authors: Epaminondas G. Kyriakidis, Theodosis D. Dimitrakos, Constantinos C. Karamatsoukis

Abstract:

The vehicle routing problem (VRP) is a well-known problem in Operations Research and has been widely studied during the last fifty-five years. The context of the VRP is that of delivering or collecting products to or from customers who are scattered in a geographical area and have placed orders for these products. A vehicle or a fleet of vehicles starts its routes from a depot and visits the customers in order to satisfy their demands. Special attention has been given to the capacitated VRP, in which the vehicles have limited carrying capacity for the goods that are delivered or collected. In the present work, we develop and analyze a mathematical model for a specific capacitated stochastic vehicle routing problem with many realistic applications, in which a vehicle starts its route from a depot and visits N customers according to a particular sequence in order to collect from them two similar but not identical products. We call these products product 1 and product 2. Each customer possesses items of either product 1 or product 2 with known probabilities. The number of items of product 1 or product 2 that each customer possesses is a discrete random variable with known distribution. The actual quantity and the actual type of product that each customer possesses are revealed only when the vehicle arrives at the customer's site. It is assumed that the vehicle has two compartments, compartment 1 and compartment 2, where compartment 1 is suitable for loading product 1 and compartment 2 is suitable for loading product 2. However, it is permitted to load items of product 1 into compartment 2 and items of product 2 into compartment 1; these actions incur costs due to extra labor. The vehicle is allowed during its route to return to the depot to unload the items of both products. The travel costs between consecutive customers and between the customers and the depot are known. The objective is to find the optimal routing strategy, i.e. the routing strategy that minimizes the total expected cost among all possible strategies for servicing all customers. It is possible to develop a suitable dynamic programming algorithm for the determination of the optimal routing strategy. It is also possible to prove that the optimal routing strategy has a threshold-type structure: for each customer, the optimal actions are characterized by some critical integers. This structural result enables us to design a special-purpose dynamic programming algorithm that operates only over the strategies having this structural property. Extensive numerical results provide strong evidence that the special-purpose dynamic programming algorithm is considerably more efficient than the initial dynamic programming algorithm. Furthermore, if we consider the same problem without the assumption that the customers are ordered, numerical experiments indicate that the optimal routing strategy can be computed if N is less than or equal to eight.

Keywords: dynamic programming, similar products, stochastic demands, stochastic preferences, vehicle routing problem

Procedia PDF Downloads 218
403 Coupling Random Demand and Route Selection in the Transportation Network Design Problem

Authors: Shabnam Najafi, Metin Turkay

Abstract:

The network design problem (NDP) is used to determine the set of optimal values for certain pre-specified decision variables, such as capacity expansion of nodes and links, by optimizing various system performance measures including safety, congestion, and accessibility. The designed transportation network should improve the objective functions defined for the system while taking the route choice behavior of network users into account. NDP studies have mostly investigated random demand and route selection constraints separately because of computational challenges; in this work, we consider both simultaneously. We present a nonlinear stochastic model for the land use and road network design problem that addresses the development of different functional zones in urban areas by considering both a cost function and air pollution. The model minimizes cost and air pollution simultaneously, with random demand and a stochastic route selection constraint, and aims to optimize network performance via road capacity expansion. The Bureau of Public Roads (BPR) link impedance function is used to determine the travel time function on each link. We consider a city with origin and destination nodes, which can be residential, employment, or both, and a set of existing paths between origin-destination (O-D) pairs. The case of an increasing employed population is analyzed to determine the road capacity and origin zones simultaneously; minimizing the travel and expansion cost of routes and origin zones on one side and minimizing CO emissions on the other side are considered at the same time. In this work, demand between O-D pairs is random, and the network flow pattern is subject to stochastic user equilibrium, specifically a logit route choice model. Treating both demand and route choice as random makes the model more applicable to the design of urban networks. The epsilon-constraint method is one of the methods for solving both linear and nonlinear multi-objective problems and is used here. The problem is solved by keeping the first objective (the cost function) as the objective function and treating the second objective as a constraint that must be less than an epsilon, where epsilon is an upper bound of the emission function. The value of epsilon is varied from the worst to the best value of the emission function to generate the family of solutions representing the Pareto set. A numerical example with 2 origin zones, 2 destination zones, and 7 links is solved in GAMS, and the set of Pareto points is obtained; there are 15 efficient solutions. According to these solutions, as the cost function value increases, the emission function value decreases, and vice versa.
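
The epsilon-constraint procedure described above can be illustrated on a toy bi-objective problem. The sketch below is a Python analogue (the paper used GAMS), and the two objective functions are made-up stand-ins for the cost and CO-emission models; it keeps the first objective as the objective function and sweeps an upper bound on the second to trace a Pareto set.

```python
# Epsilon-constraint sketch: trace a Pareto front for a toy bi-objective problem.
# The objective functions below are illustrative stand-ins, not the paper's
# cost and CO-emission models.
import numpy as np
from scipy.optimize import minimize

def f_cost(x):       # first objective (kept as the objective function)
    return (x[0] - 2.0)**2 + (x[1] - 1.0)**2

def f_emission(x):   # second objective (moved into an epsilon constraint)
    return x[0]**2 + (x[1] - 3.0)**2

# Bounds of the emission objective: its "best" value and its value at the cost minimizer
best = minimize(f_emission, x0=[0.0, 3.0]).fun
worst = f_emission(minimize(f_cost, x0=[0.0, 3.0]).x)

pareto = []
for eps in np.linspace(worst, best, 15):           # sweep epsilon over its range
    cons = [{"type": "ineq", "fun": lambda x, e=eps: e - f_emission(x)}]
    res = minimize(f_cost, x0=[0.0, 3.0], constraints=cons, method="SLSQP")
    if res.success:
        pareto.append((f_cost(res.x), f_emission(res.x)))

for c, e in pareto:
    print(f"cost = {c:7.3f}   emission = {e:7.3f}")
```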

Keywords: epsilon-constraint, multi-objective, network design, stochastic

Procedia PDF Downloads 610
402 A Regional Analysis on Co-movement of Sovereign Credit Risk and Interbank Risks

Authors: Mehdi Janbaz

Abstract:

The global financial crisis and the credit crunch that followed magnified the importance of credit risk management and its crucial role in the stability of all financial sectors and of the system as a whole. Many believe that risks faced by the sovereign sector are highly interconnected with banking risks and are most likely to trigger and reinforce each other. This study aims to examine (1) the impact of banking and interbank risk factors on the sovereign credit risk of the Eurozone, and (2) how EU Credit Default Swap (CDS) spread dynamics are affected by crude oil price fluctuations. The hypotheses are tested by employing fitting risk measures and a four-staged linear modeling approach. The sovereign senior 5-year CDS spreads are used as the core measure of credit risk. The monthly time-series data of the variables used in the study are gathered from the DataStream database for the period 2008-2019. First, a linear model tests the impact of regional macroeconomic and market-based factors (STOXX, VSTOXX, Oil, Sovereign Debt, and Slope) on the CDS spread dynamics. Second, bank-specific factors, including the LIBOR-OIS spread (the difference between the Euro 3-month LIBOR rate and the Euro 3-month overnight index swap rate) and Euribor, are added to the most significant factors of the previous model. Third, global financial factors, including EUR/USD foreign exchange volatility, the TED spread (the difference between the 3-month T-bill rate and the 3-month LIBOR rate based in US dollars), and the Chicago Board Options Exchange (CBOE) Crude Oil Volatility Index (OVX), are added to the major significant factors of the first two models. Finally, a model is generated from a combination of the major factors of each variable set in addition to a crisis dummy. The findings show that (1) the explanatory power of LIBOR-OIS on the sovereign CDS spread of the Eurozone is very significant, and (2) there is a meaningful adverse co-movement between the crude oil price and the CDS price of the Eurozone. Surprisingly, adding the TED spread alongside the LIBOR-OIS spread in the third and fourth models increased the predictive power of LIBOR-OIS. Based on the results, LIBOR-OIS, STOXX, the TED spread, Slope, the oil price, OVX, FX volatility, and Euribor are the determinants of CDS spread dynamics in the Eurozone. Moreover, the positive impact of the crisis period on the creditworthiness of the Eurozone is meaningful.

Keywords: CDS, crude oil, interbank risk, LIBOR-OIS, OVX, sovereign credit risk, TED

Procedia PDF Downloads 109
401 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

Floods have huge environmental and economic impacts; therefore, flood prediction is given a lot of attention due to its importance. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we use the Box-Jenkins approach, which consists of a four-stage method: model identification, parameter estimation, diagnostic checking, and forecasting (prediction). The main tools used in ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. SAS computed the model parameters using the ML, CLS, and ULS methods. The diagnostic checking tests, the AIC criterion, the RACF graph, and the RPACF graph were used for verification of the selected model. In this study, the best ARIMA model for the Annual Maximum Discharge (AMD) time series was ARIMA(4,1,1), with an AIC value of 88.87. The RACF and RPACF showed the residuals' independence. The model was used to forecast AMD for 10 future years, demonstrating its ability to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R²).
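
The Box-Jenkins workflow outlined above can be reproduced in a few lines; the study itself used SAS and SPSS, so the sketch below is a hedged Python analogue with a synthetic annual-maximum-discharge series standing in for the Karkheh data.

```python
# Sketch of the Box-Jenkins workflow (identification, estimation, diagnostics,
# forecasting). The annual-maximum-discharge series is synthetic, for illustration.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(1)
amd = pd.Series(500 + np.cumsum(rng.normal(0, 40, 55)),
                index=pd.period_range("1958", periods=55, freq="Y"))

# Identification: inspect the ACF/PACF of the differenced series, e.g. with
# statsmodels.graphics.tsaplots.plot_acf / plot_pacf (plots omitted here).

# Estimation: fit the ARIMA(4,1,1) order selected in the study
fit = ARIMA(amd, order=(4, 1, 1)).fit()
print("AIC:", round(fit.aic, 2))

# Diagnostic checking: residuals should be serially uncorrelated (Ljung-Box test)
print(acorr_ljungbox(fit.resid, lags=[10]))

# Forecasting: predict annual maximum discharge for the next 10 years
print(fit.forecast(steps=10).round(1))
```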

Keywords: time series modelling, stochastic processes, ARIMA model, Karkheh river

Procedia PDF Downloads 263
400 Gaits Stability Analysis for a Pneumatic Quadruped Robot Using Reinforcement Learning

Authors: Soofiyan Atar, Adil Shaikh, Sahil Rajpurkar, Pragnesh Bhalala, Aniket Desai, Irfan Siddavatam

Abstract:

Deep reinforcement learning (deep RL) algorithms leverage the power of complex controllers by automating their design, mapping sensory inputs to low-level actions. Deep RL handles complex robot dynamics with minimal engineering. However, deep RL involves high risk when implemented directly in real-world scenarios and is highly sensitive to hyperparameters. Tuning hyperparameters on a pneumatic quadruped robot becomes very expensive through trial-and-error learning. This paper presents an automated learning control scheme for a pneumatic quadruped robot using sample-efficient deep Q-learning, enabling minimal tuning and very few trials to learn the neural network. Long training hours may degrade the pneumatic cylinders due to jerky actions originating from stochastic weights. We applied this method to the pneumatic quadruped robot, which resulted in a hopping gait. In our process, we eliminated the use of a simulator and acquired a stable gait. The approach evolves so that the resultant gait becomes more robust to stochastic changes in the environment. We further show that our algorithm performed very well compared to a gait programmed using robot dynamics.

Keywords: model-based reinforcement learning, gait stability, supervised learning, pneumatic quadruped

Procedia PDF Downloads 280
399 Food Security in the Middle East and North Africa

Authors: Sara D. Garduno-Diaz, Philippe Y. Garduno-Diaz

Abstract:

To date, one of the few comprehensive indicators for the measurement of food security is the Global Food Security Index (GFSI). This index is a dynamic quantitative and qualitative benchmarking model, constructed from 28 unique indicators, that measures drivers of food security across both developing and developed countries. Whereas the Global Food Security Index has been calculated for a set of 109 countries, in this paper we aim to present and compare, for the Middle East and North Africa (MENA), 1) the Food Security Index scores achieved and 2) the data available on the affordability, availability, and quality of food. The data for this work were taken from the latest (2014) report published by the creators of the GFSI, which in turn used information from national and international statistical sources. According to the 2014 Global Food Security Index, MENA countries rank from 17/109 (Israel, although with recent political turmoil this is likely to have changed) to 91/109 (Yemen), with household expenditure spent on food ranging from 15.5% (Israel) to 60% (Egypt). Lower spending on food as a share of household consumption in most countries and better food safety-net programs in the MENA have contributed to a notable increase in food affordability. The region has, however, also experienced a decline in food availability, owing to more limited food supplies and higher volatility of agricultural production. In terms of food quality and safety, the MENA includes the top-ranking country (Israel). The most frequent challenges faced by the countries of the MENA include public expenditure on agricultural research and development as well as volatility of agricultural production. Food security is a complex phenomenon that interacts with many other indicators of a country's well-being; in the MENA it is slowly but markedly improving.

Keywords: diet, food insecurity, global food security index, nutrition, sustainability

Procedia PDF Downloads 317
398 Understanding the Impact of Climate Change on Farmer's Technical Efficiency in Mali

Authors: Christelle Tchoupé Makougoum

Abstract:

In the context of agriculture, differences across localities in terms of climate can create systematic variation in farmers' technical efficiency. Failure to account for climate variability could lead to wrong conclusions about farmers' technical efficiency, and it could also bias the ranking of farmers according to their managerial performance. The literature on agricultural productivity has given little attention to this issue, whereas it is necessary for establishing to what extent climate affects farmers' efficiency. This article contributes to the previous literature in two ways. First, it proposes a new econometric model that accounts for the influence of climate change on technical efficiency in the specific area of agriculture. Second, it estimates the inefficiency due to climate change and the real managerial performance of Malian farmers. Using Mali's agricultural census data and the CRU TS3 climatic database, we implemented an adjusted stochastic frontier methodology to account for the impact of environmental factors. The results yield three main findings. First, instability in temperatures and rainfall decreases technical efficiency on average. Second, climate change modifies the ranking of farmers according to their efficiency scores. Third, although climate change is partly responsible for the deviation from the frontier, farmers' limited capacity to combine inputs in optimal proportions undermines efficiency even more. The study concludes that improving farmers' efficiency should include fostering their resilience to climate change.

Keywords: agriculture, climate change, stochastic production function, technical efficiency

Procedia PDF Downloads 477
397 The Fefe Indices: The Direction of Donald Trump's Tweets' Effect on the Stock Market

Authors: Sergio Andres Rojas, Julian Benavides Franco, Juan Tomas Sayago

Abstract:

An increasing amount of research demonstrates how market mood affects financial markets, much of it aiming to show how Trump's tweets impacted US interest rate volatility. Following that lead, this work evaluates the effect that Trump's tweets had during his presidency on local and international stock markets, considering not just volatility but also the direction of the movement. Three indexes for Trump's tweets were created relating his activity to movements in the S&P 500, using natural language analysis and machine learning algorithms. The indexes consider Trump's tweet activity and the positive or negative market sentiment it might inspire. The first explores the relationship between tweets and negative movements in the S&P 500; the second explores positive movements, while the third explores the difference between up and down movements. A pseudo-investment strategy using the indexes produced statistically significant above-average abnormal returns. The findings also showed that the pseudo-strategy generated a higher return in the local market if applied to intraday data. However, only negative market sentiment caused this effect in daily data. These results suggest that the market reacted primarily to negative ideas reflected in the negative index. In the international market, it is not possible to identify a pervasive effect. A rolling-window regression model was also performed. The results show that the impact on the local and international markets is heterogeneous, time-varying, and differentiated by market sentiment. However, negative sentiment was more prone to have a significant correlation most of the time.

Keywords: market sentiment, Twitter market sentiment, machine learning, natural language analysis

Procedia PDF Downloads 36
396 Frailty Patterns in the US and Implications for Long-Term Care

Authors: Joelle Fong

Abstract:

Older persons are at greatest risk of becoming frail. As survival to the age of 80 and beyond continues to increase, the health and frailty of older Americans have garnered much recent attention among policy makers and healthcare administrators. This paper examines patterns in old-age frailty within a multistate actuarial model that characterizes the stochastic process of biological ageing. Using aggregate population-level U.S. mortality data, we implement a stochastic ageing model to examine cohort trends and gender differences in frailty distributions for older Americans born 1865-1894. The stochastic ageing model, which draws from the fields of actuarial science and gerontology, is well established in the literature. The implications for public health insurance programs are also discussed. Our results suggest that, on average, women tend to be frailer than men at older ages and reveal useful insights about the magnitude of the male-female differential at critical age points. Specifically, we note that the frailty statuses of males and females are quite comparable from ages 65 to 80. Beyond age 80, however, frailty levels start to diverge considerably, implying that women move more quickly into worse states of health than men. Tracking average frailty by gender over 30 successive birth cohorts, we also find that frailty levels for both genders follow a distinct peak-and-trough pattern. For instance, frailty among 85-year-old American survivors increased in 1954-1963, decreased in 1964-1971, and again started to increase in 1972-1979. A number of factors may have accounted for these cohort differences, including differences in cohort life histories, differences in disease prevalence, differences in lifestyle and behavior, differential access to medical advances, as well as changes in environmental risk factors over time. We conclude with a discussion of the implications of our findings for spending on long-term care programs within the broader health insurance system.
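
As a rough illustration of how a multistate model tracks frailty by age, the sketch below propagates a cohort through a tiny discrete-state Markov chain. The three states and the transition probabilities are invented for illustration; the paper's stochastic ageing model is a much richer process fitted to US mortality data.

```python
# Toy multistate (Markov) aging sketch: propagate a cohort through discrete
# frailty states and track mean frailty among survivors by age.
# All states and probabilities are hypothetical.
import numpy as np

states = ["healthy", "frail", "dead"]
# Annual transition matrix P[i, j] = P(state j next year | state i this year)
P = np.array([
    [0.92, 0.06, 0.02],   # healthy -> healthy / frail / dead
    [0.05, 0.80, 0.15],   # frail   -> ...
    [0.00, 0.00, 1.00],   # dead is absorbing
])
frailty_score = np.array([0.0, 1.0])   # score assigned to the living states

dist = np.array([1.0, 0.0, 0.0])       # everyone starts healthy at age 65
for age in range(65, 96, 5):
    alive = dist[:2].sum()
    mean_frailty = dist[:2] @ frailty_score / alive if alive > 0 else np.nan
    print(f"age {age}: P(alive) = {alive:.3f}, "
          f"mean frailty among survivors = {mean_frailty:.3f}")
    dist = dist @ np.linalg.matrix_power(P, 5)   # advance five years
```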

Keywords: actuarial modeling, cohort analysis, frail elderly, health

Procedia PDF Downloads 212
395 Tree-Based Inference for Regionalization: A Comparative Study of Global Topological Perturbation Methods

Authors: Orhun Aydin, Mark V. Janikas, Rodrigo Alves, Renato Assuncao

Abstract:

In this paper, a tree-based perturbation methodology for regionalization inference is presented. Regionalization is a constrained optimization problem that aims to create groups with similar attributes while satisfying spatial contiguity constraints. As in any constrained optimization problem, the spatial constraint may hinder convergence to a global minimum, resulting in spatially contiguous members of a group with dissimilar attributes. This paper presents a general methodology for rigorously perturbing spatial constraints through the use of random spanning trees. The general framework presented can be used to quantify the effect of the spatial constraints on the overall regionalization result. We compare several types of stochastic spanning trees used in inference problems such as fuzzy regionalization and determining the number of regions. The performance of stochastic spanning trees is juxtaposed against the traditional permutation-based hypothesis testing frequently used in spatial statistics. Inference results for fuzzy regionalization and for determining the number of regions are presented on the Local Area Personal Income data for Texas counties provided by the Bureau of Economic Analysis.

Keywords: regionalization, constrained clustering, probabilistic inference, fuzzy clustering

Procedia PDF Downloads 185
394 A Stochastic Model to Predict Earthquake Ground Motion Duration Recorded in Soft Soils Based on Nonlinear Regression

Authors: Issam Aouari, Abdelmalek Abdelhamid

Abstract:

For seismologists, the characterization of seismic demand should include both the amplitude and the duration of strong shaking in the system. The duration of ground shaking is one of the key parameters in the earthquake-resistant design of structures. This paper proposes a nonlinear statistical model to estimate earthquake ground motion duration in soft soils using multiple seismicity indicators. Three definitions of ground motion duration proposed in the literature have been applied, and through a comparative study we select the most significant definition to use for predicting the duration. A stochastic model is presented for the McCann and Shah method using nonlinear regression analysis based on a data set of moment magnitude, source-to-site distance, and site conditions. The data set is taken from the PEER strong-motion databank and contains shallow earthquakes from different regions in the world: America, Turkey, London, China, Italy, Chile, Mexico, etc. Main emphasis is placed on soft site conditions. The predictive relationship has been developed based on 600 records and three input indicators. The results have been compared with other published models. It is found that the proposed model can predict earthquake ground motion duration in soft soils for different regions and site conditions.
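
A minimal sketch of the nonlinear-regression step follows. The functional form D = a*exp(b*M)*(R + c)^d and the synthetic records are assumptions chosen only for illustration; the paper fits its own predictive relationship to roughly 600 PEER soft-soil records.

```python
# Nonlinear-regression sketch for a ground-motion duration model.
# Functional form and synthetic data are hypothetical, for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def duration_model(X, a, b, c, d):
    M, R = X                      # moment magnitude, source-to-site distance (km)
    return a * np.exp(b * M) * (R + c) ** d

rng = np.random.default_rng(2)
M = rng.uniform(5.0, 7.5, 600)
R = rng.uniform(5.0, 150.0, 600)
true = duration_model((M, R), 0.5, 0.7, 10.0, 0.3)
D_obs = true * rng.lognormal(0.0, 0.2, 600)       # noisy "observed" durations (s)

params, cov = curve_fit(duration_model, (M, R), D_obs, p0=[1.0, 0.5, 5.0, 0.5])
pred = duration_model((M, R), *params)
resid = D_obs - pred
r2 = 1 - np.sum(resid**2) / np.sum((D_obs - D_obs.mean())**2)
print("fitted coefficients:", np.round(params, 3))
print("R^2 on the synthetic records:", round(r2, 3))
```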

Keywords: duration, earthquake, prediction, regression, soft soil

Procedia PDF Downloads 116
393 Accelerated Structural Reliability Analysis under Earthquake-Induced Tsunamis by Advanced Stochastic Simulation

Authors: Sai Hung Cheung, Zhe Shao

Abstract:

Recent earthquake-induced tsunamis, in Padang in 2004 and Tohoku in 2011, brought huge losses of life and property. Maintaining vertical evacuation systems is the most crucial strategy for effectively reducing casualties during a tsunami event. Thus, it is of great interest to quantify the risk to structural dynamic systems due to earthquake-induced tsunamis. Despite continuous advancement in computational simulation of tsunamis and in wave-structure interaction modeling, it remains computationally challenging to evaluate the reliability (or its complement, the failure probability) of a structural dynamic system when uncertainties related to the system and its modeling are taken into account. Failure of the structure in a tsunami-wave-structure system is defined as any response quantity of the system exceeding specified thresholds during the time when the structure is subjected to dynamic wave impact due to earthquake-induced tsunamis. In this paper, an approach is proposed based on a novel integration of the Subset Simulation algorithm with a recently proposed moving least squares response surface approach for stochastic sampling. The effectiveness of the proposed approach is discussed by comparing its results with those obtained from the Subset Simulation algorithm without the response surface approach.

Keywords: response surface model, subset simulation, structural reliability, Tsunami risk

Procedia PDF Downloads 337
392 Estimation of Thermal Conductivity of Nanofluids Using MD-Stochastic Simulation-Based Approach

Authors: Sujoy Das, M. M. Ghosh

Abstract:

The thermal conductivity of a fluid can be significantly enhanced by dispersing nano-sized particles in it, and the resultant fluid is termed a "nanofluid". A theoretical model for estimating the thermal conductivity of a nanofluid is proposed here. It is based on the mechanism that evenly dispersed nanoparticles within a nanofluid undergo Brownian motion, in the course of which the nanoparticles repeatedly collide with the heat source. During each collision, rapid heat transfer occurs owing to the solid-solid contact. Molecular dynamics (MD) simulation of the collision of nanoparticles with the heat source has shown that there is a pulse-like pick-up of heat by the nanoparticles within 20-100 ps, the extent of which depends not only on the thermal conductivity of the nanoparticles but also on their elastic and other physical properties. After the collision, the nanoparticles undergo Brownian motion in the base fluid and release the excess heat to the surrounding base fluid within 2-10 ms. The Brownian motion and the associated temperature variation of the nanoparticles have been modeled by stochastic analysis. Repeated occurrence of these events by the suspended nanoparticles contributes significantly to the characteristic thermal conductivity of nanofluids, which has been estimated by the present model for an ethylene glycol-based nanofluid containing Cu nanoparticles of size ranging from 8 to 20 nm with a Gaussian size distribution. The prediction of the present model shows reasonable agreement with the experimental data available in the literature.

Keywords: brownian dynamics, molecular dynamics, nanofluid, thermal conductivity

Procedia PDF Downloads 349
391 The Impact of Corporate Finance on Financial Stability in the Western Balkan Countries

Authors: Luan Vardari, Dena Arapi-Vardari

Abstract:

Financial stability is a critical component of economic growth and development, and it has been recognized as a key policy objective in many countries around the world. In the Western Balkans, financial stability has been a key issue in recent years, with a number of challenges facing the region, including high levels of public debt, weak banking systems, and economic volatility. Corporate finance, which refers to the financial management practices of firms, is an important factor that can impact financial stability. This paper aims to investigate corporate finance's impact on financial stability in Western Balkan countries. This study will use a mixed-methods approach to investigate the impact of corporate finance on financial stability in the Western Balkans. The study will begin with a comprehensive review of the existing literature on corporate finance and financial stability, focusing on the Western Balkan region. This will be followed by an empirical analysis of regional corporate finance practices using data from various industries and firms. The analysis will explore the relationship between corporate finance practices and financial stability, taking into account factors such as regulatory frameworks, economic conditions, and firm size. The results of the study are expected to provide insights into the impact of corporate finance on financial stability in the Western Balkans. Specifically, the study will identify the key corporate finance practices that contribute to financial stability in the region, as well as the challenges and obstacles that firms face in implementing effective corporate finance strategies. The study will also provide recommendations for policymakers and firms looking to enhance financial stability and resilience in the region.

Keywords: financial regulation, debt management, investment decisions, dividend policies, economic volatility, banking systems, public debt, prudent financial management, firm size, policy recommendations

Procedia PDF Downloads 34
390 Stochastic Fleet Sizing and Routing in Drone Delivery

Authors: Amin Karimi, Lele Zhang, Mark Fackrell

Abstract:

Rural-to-urban population migration is a global phenomenon, with projections indicating that by 2050, 68% of the world's population will inhabit densely populated urban centers. Concurrently, the popularity of e-commerce shopping has surged, evidenced by a 51% increase in total e-commerce sales from 2017 to 2021. Consequently, distribution and logistics systems, integral to effective supply chain management, confront escalating hurdles in efficiently delivering and distributing products within bustling urban environments. Additionally, events such as environmental challenges and the COVID-19 pandemic have shown that decision-makers face numerous sources of uncertainty. Therefore, to design an efficient and reliable logistics system, uncertainty must be considered. In this study, we examine fleet sizing and routing while considering uncertainty in the demand rate. Fleet sizing is typically a strategic-level decision, while routing is an operational-level one. In this study, a carrier must make two types of decisions: strategic-level decisions regarding the number and types of drones to be purchased, and operational-level decisions regarding planning routes based on the available fleet and realized demand. If the available fleet is insufficient to serve some customers, the carrier must outsource those deliveries at a relatively high cost, calculated per order. With this hierarchy of decisions, we model the problem using two-stage stochastic programming. The first-stage decisions involve planning the number and type of drones to be purchased, while the second-stage decisions involve planning routes. To solve this model, we employ logic-based Benders decomposition, which decomposes the problem into a master problem and a set of sub-problems. The master problem becomes a mixed integer programming model to find the best fleet sizing decisions, and the sub-problems become capacitated vehicle routing problems that consider battery status. Additionally, we assume a heterogeneous fleet based on load and battery capacity, and we consider that battery health deteriorates over time as we plan for multiple periods.
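
The first-stage/second-stage structure described above can be illustrated with a deliberately small toy: choose a fleet size before demand is known, then pay per-order outsourcing for whatever the fleet cannot serve in each demand scenario. All numbers are invented, the second stage is reduced to a capacity check rather than a routing problem, and the toy enumerates fleet sizes instead of using the paper's logic-based Benders decomposition.

```python
# Toy two-stage structure for drone fleet sizing under random demand.
# First stage: number of drones to buy. Second stage (recourse): unserved
# orders are outsourced per order. Everything here is an invented example.
import numpy as np

rng = np.random.default_rng(3)
drone_cost = 800.0            # purchase cost per drone (first-stage)
orders_per_drone = 30         # deliveries one drone can make in the horizon
outsource_cost = 6.0          # cost per outsourced order (second-stage recourse)
scenarios = rng.poisson(lam=450, size=1000)   # random total demand (orders)

def expected_total_cost(n_drones):
    capacity = n_drones * orders_per_drone
    unserved = np.maximum(scenarios - capacity, 0)      # recourse per scenario
    return n_drones * drone_cost + outsource_cost * unserved.mean()

costs = {n: expected_total_cost(n) for n in range(0, 31)}
best = min(costs, key=costs.get)
print(f"best fleet size: {best} drones, expected cost = {costs[best]:.0f}")
```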

Keywords: drone-delivery, stochastic demand, VRP, fleet sizing

Procedia PDF Downloads 18
389 Crude Oil and Stocks Markets: Prices and Uncertainty Transmission Analysis

Authors: Kamel Malik Bensafta, Gervasio Semedo

Abstract:

The purpose of this paper is to investigate the relationship between oil prices and stock markets. The empirical analysis is conducted within the context of multivariate GARCH models, using a transformed version of the so-called BEKK parameterization. We show that the mean and the uncertainty of the US market are transmitted to the oil market and to the European market. We also identify an important transmission from WTI prices to Brent prices.

Keywords: oil volatility, stock markets, MGARCH, transmission, structural break

Procedia PDF Downloads 490
388 Production Planning for Animal Food Industry under Demand Uncertainty

Authors: Pirom Thangchitpianpol, Suttipong Jumroonrut

Abstract:

This research investigates the distribution of demand for animal food and the optimum amount of food to produce at minimum cost. The data consist of customer purchase orders for laying-hen food, the price of the food, the inventory holding cost per unit, and the costs incurred when the food is out of stock, such as fines, overtime, and urgent purchases of material. The data were collected from January 1990 to December 2013 from a factory in Nakhonratchasima province. The collected data are analyzed in order to explore the distribution of the monthly food demand for laying hens and to determine the inventory rate per unit. The results are used in a stochastic linear programming model for aggregate planning, from which the optimum production, or minimum cost, can be obtained. Programming algorithms in MATLAB and the linprog tool are used to obtain the solution. The distribution of the food demand for laying hens and random numbers are used in the model. The study shows that the monthly food demand for laying hens follows a normal distribution and yields the monthly average production amounts (unit: 30 kg) from January to December. The minimum average total cost over 12 months is 62,329,181.77 Baht. Therefore, the production plan can reduce the cost by 14.64% compared with the actual cost.
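
The sketch below shows the shape of such a scenario-based stochastic linear program for a single planning period: production is chosen before demand is known, and each demand scenario incurs holding cost on leftovers or shortage cost on unmet demand. The study used MATLAB's linprog; this is a hedged Python analogue with invented costs and demand parameters.

```python
# Scenario-based stochastic LP sketch (single period, invented numbers).
# Decision variables: production q, plus per-scenario holding h_s and shortage u_s.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(4)
demand = rng.normal(1000, 120, 200)        # demand scenarios (units of 30 kg)
p = np.full(demand.size, 1 / demand.size)  # equal scenario probabilities
c_prod, c_hold, c_short = 10.0, 1.5, 25.0  # illustrative unit costs

S = demand.size
# Variable vector x = [q, h_1..h_S, u_1..u_S]; all >= 0
c = np.concatenate([[c_prod], p * c_hold, p * c_short])

# Constraints: h_s >= q - d_s  and  u_s >= d_s - q  for every scenario s,
# written as  (q - h_s) <= d_s  and  (-q - u_s) <= -d_s
A_ub = np.zeros((2 * S, 1 + 2 * S))
b_ub = np.concatenate([demand, -demand])
A_ub[:S, 0] = 1.0
A_ub[:S, 1:1 + S] = -np.eye(S)
A_ub[S:, 0] = -1.0
A_ub[S:, 1 + S:] = -np.eye(S)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
print(f"optimal production quantity: {res.x[0]:.1f} units, "
      f"expected cost: {res.fun:.1f}")
```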

Keywords: animal food, stochastic linear programming, aggregate planning, production planning, demand uncertainty

Procedia PDF Downloads 348
387 Multivariate Rainfall Disaggregation Using MuDRain Model: Malaysia Experience

Authors: Ibrahim Suliman Hanaish

Abstract:

The disaggregation of daily rainfall using stochastic models formulated with a multivariate approach (MuDRain) is discussed in this paper. Seven rain gauge stations are considered in this study, at different distances from the reference station ranging from 4 km to 160 km, in Peninsular Malaysia. The hourly rainfall data used cover the period from 1973 to 2008, and July and November are considered as examples of dry and wet periods, respectively. The cross-correlation among the rain gauges is estimated either from the available hourly rainfall information at the neighbouring stations or, when such information is not used, from the daily rainfall. This paper discusses the applicability of the MuDRain model for disaggregating daily rainfall into hourly rainfall for both sources of cross-correlation. The goodness of fit of the model was based on the reproduction of fitting statistics such as the means, variances, coefficients of skewness, lag-zero cross-correlation coefficients, and lag-one autocorrelation coefficients. It is found that the cross-correlation coefficients extracted from daily rainfall are slightly higher than the correlations based on the available hourly rainfall, especially for neighbouring stations no more than 28 km apart. The results also showed that the MuDRain model did not reproduce the statistics very well and that it reproduced the actual hyetographs poorly in comparison with the synthetic hourly rainfall data. Meanwhile, a good fit was found between the distribution functions of the historical and synthetic hourly rainfall. These discrepancies are unavoidable because of the low cross-correlation of hourly rainfall. The overall performance indicated that the MuDRain model would not be an appropriate choice for disaggregating daily rainfall.

Keywords: rainfall disaggregation, multivariate disaggregation rainfall model, correlation, stochastic model

Procedia PDF Downloads 462
386 Determining the Effects of Wind-Aided Midge Movement on the Probability of Coexistence of Multiple Bluetongue Virus Serotypes in Patchy Environments

Authors: Francis Mugabi, Kevin Duffy, Joseph J. Y. T Mugisha, Obiora Collins

Abstract:

Bluetongue virus (BTV) has 27 serotypes, with some of them coexisting in patchy (different) environments, which makes its control difficult. Wind-aided midge movement is a known mechanism in the spread of BTV. However, its effects on the probability of coexistence of multiple BTV serotypes are not clear. Deterministic and stochastic models for r BTV serotypes in n discrete patches connected by midge and/or cattle movement are formulated and analyzed. For the deterministic model without midge and cattle movement, it is shown using the comparison principle that if the patch reproduction number R^j_i0 < 1 for all i=1,2,...,n and j=1,2,...,r, all serotypes go extinct; if R^j_i0 > 1, competitive exclusion takes place. Using numerical simulations, it is shown that when the n patches are connected by midge movement, coexistence takes place. To account for demographic and movement variability, the deterministic model is transformed into a continuous-time Markov chain stochastic model. Utilizing a multitype branching process, it is shown that midge movement can have a large effect on the probability of coexistence of multiple BTV serotypes. The probability of coexistence can be brought to zero when control interventions that directly kill adult midges are applied. These results indicate the significance of wind-aided midge movement and vector control interventions for the coexistence and control of multiple BTV serotypes in patchy environments.

Keywords: bluetongue virus, coexistence, multiple serotypes, midge movement, branching process

Procedia PDF Downloads 114
385 Economics of Precision Mechanization in Wine and Table Grape Production

Authors: Dean A. McCorkle, Ed W. Hellman, Rebekka M. Dudensing, Dan D. Hanselka

Abstract:

The motivation for this study centers on the labor- and cost-intensive nature of wine and table grape production in the U.S., and the potential opportunities for precision mechanization using robotics to augment labor-intensive production tasks. The objectives of this study are to evaluate the economic viability of grape production in five U.S. states under current operating conditions, identify common production challenges and tasks that could be augmented with new technology, and quantify a maximum price for new technology that growers would be able to pay. Wine and table grape production is primed for precision mechanization technology as it faces a variety of production and labor issues. Methodology: Using a grower panel process, this project includes the development of a representative wine grape vineyard in five states and a representative table grape vineyard in California. The panels provided production, budget, and financial information that is typical of vineyards in their area. Labor costs for various production tasks are of particular interest. Using the data from the representative budget, 10-year projected financial statements have been developed for the representative vineyard and evaluated using a stochastic simulation model approach. Labor costs for selected vineyard production tasks were evaluated for the potential of new precision mechanization technology being developed. These tasks were selected based on a variety of factors, including input from the panel members and the extent to which the development of new technology was deemed to be feasible. The net present value (NPV) of the labor cost over seven years for each production task was derived. This allowed for the calculation of a maximum price for new technology whereby the NPV of labor costs would equal the NPV of purchasing, owning, and operating the new technology. Expected Results: The results from the stochastic model will show the projected financial health of each representative vineyard over the 2015-2024 timeframe. Investigators have developed a preliminary list of production tasks that have the potential for precision mechanization. For each task, the labor requirements, labor costs, and the maximum price for new technology will be presented and discussed. Together, these results will allow technology developers to focus and prioritize their research and development efforts for wine and table grape vineyards, and suggest opportunities to strengthen vineyard profitability and long-term viability using precision mechanization.
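
The "maximum technology price" idea reduces to a pair of NPV calculations, sketched below with purely illustrative figures (discount rate, labor costs, operating costs are assumptions, not the study's data): discount seven years of labor cost for a task, then find the purchase price at which owning and operating the technology has the same present value.

```python
# NPV sketch for the maximum justifiable technology price. All figures are
# illustrative assumptions.
def npv(cashflows, rate):
    """Present value of a list of end-of-year cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

rate = 0.06
annual_labor_cost = [12000 * 1.03 ** t for t in range(7)]   # 3% yearly wage growth
annual_operating_cost = 1500                                 # tech running cost/yr

npv_labor = npv(annual_labor_cost, rate)
npv_operating = npv([annual_operating_cost] * 7, rate)

# Maximum up-front price: NPV(labor saved) = price + NPV(operating costs)
max_price = npv_labor - npv_operating
print(f"NPV of labor cost over 7 years: {npv_labor:,.0f}")
print(f"maximum justifiable technology price: {max_price:,.0f}")
```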

Keywords: net present value, robotic technology, stochastic simulation, wine and table grapes

Procedia PDF Downloads 231
384 Pricing, Production and Inventory Policies Manufacturing under Stochastic Demand and Continuous Prices

Authors: Masoud Rabbani, Majede Smizadeh, Hamed Farrokhi-Asl

Abstract:

We study the joint determination of prices and production over a multiple-period horizon under general non-stationary stochastic demand with continuous prices. In some periods, we need to increase production capacity to satisfy demand. This paper presents a model to aid multi-period production capacity planning by quantifying the trade-off between product quality and production cost. Product quality is estimated as the statistical variation from the target performances, obtained from the output tolerances of the production machines that manufacture the components. We consider different tolerances for the different machines used to increase capacity. The production cost is estimated as the total cost of owning and operating a production facility during the planning horizon, so capacity planning has a cost that impacts the price. Pricing products is often difficult because customers have a reservation price they are willing to pay, which affects both price and demand. We determine prices and production for the periods after capacity is enhanced and take the reservation price into account when setting prices. First, we use an algorithm based on a fuzzy set of the optimal objective function values to determine the capacity plan, by maximizing the interval from the upper bound of the minimized objectives and defining weights for the objectives. Then we determine the inventory and pricing policies. We use a lemma to solve the problem in MATLAB and find the exact answer.

Keywords: price policy, inventory policy, capacity planning, product quality, epsilon-constraint

Procedia PDF Downloads 535
383 Efficient Wind Fragility Analysis of Concrete Chimney under Stochastic Extreme Wind Incorporating Temperature Effects

Authors: Soumya Bhattacharjya, Avinandan Sahoo, Gaurav Datta

Abstract:

Wind fragility analysis of chimneys is often carried out disregarding the temperature effect. However, the combined effect of wind and temperature is the most critical limit state for chimney design. Hence, in the present paper, an efficient fragility analysis for a concrete chimney is explored under the combined wind and temperature effect. Wind time histories are generated from Davenport's power spectral density function using the weighted amplitude wave superposition technique. Fragility analysis is often carried out in a full Monte Carlo simulation framework, which requires extensive computational time. Thus, in the present paper, an efficient adaptive metamodelling technique is adopted to judiciously approximate the limit state function, which is subsequently used in the simulation framework. This saves substantial computational time and makes the approach computationally efficient. Uncertainty in wind speed, wind-load-related parameters, and resistance-related parameters is considered. The results of the full simulation approach, the conventional metamodelling approach, and the proposed adaptive metamodelling approach are compared. The effect of disregarding temperature in wind fragility analysis is highlighted.
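
The wind time-history generation step mentioned above is a standard spectral (wave superposition) synthesis, sketched below: harmonic components are drawn from the Davenport spectrum with amplitudes weighted by the spectral ordinates and assigned random phases. The mean wind speed, drag coefficient, and frequency band are illustrative assumptions.

```python
# Sketch of fluctuating-wind time-history generation by wave superposition
# from the Davenport spectrum. Parameter values are illustrative assumptions.
import numpy as np

V10 = 30.0        # mean wind speed at 10 m (m/s)
kappa = 0.005     # surface drag coefficient (illustrative)

def davenport_psd(n):
    """One-sided Davenport spectrum S(n) of the along-wind fluctuation."""
    x = 1200.0 * n / V10
    return 4.0 * kappa * V10**2 * x**2 / (n * (1.0 + x**2) ** (4.0 / 3.0))

rng = np.random.default_rng(5)
freqs = np.linspace(0.005, 2.0, 400)                 # frequency band (Hz)
dn = freqs[1] - freqs[0]
amps = np.sqrt(2.0 * davenport_psd(freqs) * dn)      # weighted amplitudes
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)   # random phases per run

t = np.arange(0.0, 600.0, 0.1)                       # 10-minute record, dt = 0.1 s
u = amps @ np.cos(2.0 * np.pi * np.outer(freqs, t) + phases[:, None])

print("target variance :", (davenport_psd(freqs) * dn).sum().round(2))
print("sample variance :", u.var().round(2))
```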

Keywords: adaptive metamodelling technique, concrete chimney, fragility analysis, stochastic extreme wind load, temperature effect

Procedia PDF Downloads 191
382 Peak Frequencies in the Collective Membrane Potential of a Hindmarsh-Rose Small-World Neural Network

Authors: Sun Zhe, Ruggero Micheletto

Abstract:

As discussed extensively in many studies, noise in neural networks has an important role in the functioning and time evolution of the system. The mechanism by which noise induces stochastic resonance, enhancing and influencing certain operations, is not clarified, nor is the mechanism of information storage and coding. With the present research, we study the role of noise, focusing especially on the frequency peaks in a three-variable Hindmarsh-Rose small-world network. We investigated the response of the network to external noise. We demonstrate that a variation of the signal-to-noise ratio of about 10 dB induces an increase in the membrane potential signal of about 15%, averaged over the whole network. We also considered the integral of the whole membrane potential as a paradigm of internal noise, the noise generated by the brain network itself. We showed that this internal noise is attenuated with the size of the network or with the number of random connections. By means of Fourier analysis, we found that it has distinct frequency peaks. Moreover, we showed that increasing the size of the network by introducing more neurons reduced the maximum frequencies generated by the network, whereas increasing the number of random connections (determined by the small-world probability p) led to a trend toward higher frequencies. This study may give clues on how networks utilize noise to alter the collective behaviour of the system in their operations.
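
The Fourier peak-picking step can be illustrated in a few lines: take a network-averaged potential signal, compute its spectrum, and report the dominant frequency peaks. The synthetic two-component signal below is only a stand-in for Hindmarsh-Rose network output.

```python
# Sketch of spectral peak detection on a network-averaged signal.
# The synthetic signal and its frequencies are illustrative.
import numpy as np
from numpy.fft import rfft, rfftfreq
from scipy.signal import find_peaks

fs = 1000.0                          # sampling rate (Hz), illustrative
t = np.arange(0.0, 10.0, 1.0 / fs)
rng = np.random.default_rng(6)
signal = (np.sin(2 * np.pi * 4.0 * t)           # slow bursting component
          + 0.5 * np.sin(2 * np.pi * 23.0 * t)  # faster spiking component
          + 0.3 * rng.normal(size=t.size))      # noise

spectrum = np.abs(rfft(signal - signal.mean()))
freqs = rfftfreq(t.size, d=1.0 / fs)

peaks, _ = find_peaks(spectrum, height=0.2 * spectrum.max())
print("dominant frequency peaks (Hz):", np.round(freqs[peaks], 1))
```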

Keywords: neural networks, stochastic processes, small-world networks, discrete Fourier analysis

Procedia PDF Downloads 262
381 Joint Optimal Pricing and Lot-Sizing Decisions for an Advance Sales System under Stochastic Conditions

Authors: Maryam Ghoreishi, Christian Larsen

Abstract:

In this paper, we investigate the effect of stochastic inputs on the problem of joint optimal pricing and lot-sizing decisions, where the inventory cycle is divided into advance and spot sales periods. During the advance sales period, customers can make reservations, and customers with reservations can cancel their orders. During the spot sales period, however, customers receive the order as soon as it is placed, but they cannot make any reservation or cancellation during that period. We assume that the inter-arrival times during the advance and spot sales periods are exponentially distributed, where the arrival rate is a decreasing function of price. Moreover, we assume that the number of cancelled reservations is binomially distributed. In addition, we assume that the deterioration process follows an exponential distribution. We investigate two cases. First, we consider the two-state case, where we find the optimal price during the spot sales period and the optimal price during the advance sales period. Next, we develop a generalized case, where we extend the two-state case to also allow dynamic prices during the spot sales period. We apply Markov decision theory in order to find the optimal solutions. In addition, for the generalized case, we apply the policy iteration algorithm in order to find the optimal prices, the optimal lot size, and the maximum advance sales amount.
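
For readers unfamiliar with the solution machinery mentioned above, here is a generic policy-iteration sketch on a tiny discounted MDP. The states, actions, and rewards are invented toys, not the paper's pricing and lot-sizing model.

```python
# Generic policy iteration on a small randomly generated MDP (illustrative only).
import numpy as np

n_states, n_actions, gamma = 3, 2, 0.95
rng = np.random.default_rng(7)
# P[a, s, s'] = transition probability, R[s, a] = expected one-step reward
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
R = rng.uniform(0.0, 1.0, size=(n_states, n_actions))

policy = np.zeros(n_states, dtype=int)
while True:
    # Policy evaluation: solve (I - gamma * P_pi) V = R_pi
    P_pi = P[policy, np.arange(n_states), :]
    R_pi = R[np.arange(n_states), policy]
    V = np.linalg.solve(np.eye(n_states) - gamma * P_pi, R_pi)

    # Policy improvement: greedy action w.r.t. the current value function
    Q = R + gamma * np.einsum("asj,j->sa", P, V)
    new_policy = Q.argmax(axis=1)
    if np.array_equal(new_policy, policy):
        break
    policy = new_policy

print("optimal policy:", policy, "values:", V.round(3))
```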

Keywords: inventory control, pricing, Markov decision theory, advance sales system

Procedia PDF Downloads 292
380 The Determinants of Enterprise Risk Management: Literature Review, and Future Research

Authors: Sylvester S. Horvey, Jones Mensah

Abstract:

The growing complexities and dynamics of the business environment have led to a new approach to risk management, known as enterprise risk management (ERM). ERM is a system and an approach for managing the risks of an organization in an integrated manner to achieve the corporate goals and strategic objectives. Regardless of the diversity of the business environment, ERM has become an essential factor in managing individual and business risks because ERM is believed to enhance shareholder value and firm growth. Despite the growing body of literature on ERM, research on the question of what factors drive ERM remains limited. This study provides a comprehensive literature review of the main factors that contribute to ERM implementation. Google Scholar was the leading search engine used to identify empirical literature, and the review spanned 2000 to 2020. Articles published in Scimago-ranked and Scopus-indexed journals were examined. Thirteen firm characteristics and sixteen articles were considered for the empirical review. Most empirical studies agreed that firm size, institutional ownership, industry type, auditor type, industrial diversification, earnings volatility, stock price volatility, and internal audit had a positive relationship with ERM adoption, whereas firm size, institutional ownership, auditor type, and type of industry were mostly found to be statistically significant. Other factors, such as financial leverage, profitability, asset opacity, international diversification, and firm complexity, showed inconclusive results. The growing literature on ERM is not without limitations; hence, this study suggests that further research should examine ERM determinants within a new geographical context while considering a new and robust way of measuring ERM rather than relying on a simple proxy (dummy) for ERM measurement. Other firm characteristics, such as organizational culture and context, corporate scandals and losses, and governance, could be considered determinants of ERM adoption.

Keywords: enterprise risk management, determinants, ERM adoption, literature review

Procedia PDF Downloads 138
379 Frequency Domain Decomposition, Stochastic Subspace Identification and Continuous Wavelet Transform for Operational Modal Analysis of Three Story Steel Frame

Authors: Ardalan Sabamehr, Ashutosh Bagchi

Abstract:

Recently, Structural Health Monitoring (SHM) based on the vibration of structures has attracted the attention of researchers in different fields such as civil, aeronautical, and mechanical engineering. Operational Modal Analysis (OMA) has been developed to identify the modal properties of infrastructure such as bridges and buildings. Frequency Domain Decomposition (FDD), Stochastic Subspace Identification (SSI), and the Continuous Wavelet Transform (CWT) are the three most common methods in output-only modal identification. FDD, SSI, and CWT operate in the frequency domain, the time domain, and the time-frequency plane, respectively. Thus, FDD and SSI are not able to display time and frequency information at the same time, and they have some difficulties in noisy environments and in identifying closely spaced modes. The CWT technique, which has been developed more recently, works on the time-frequency plane and shows reasonable performance in such conditions. Another advantage of the wavelet transform over other current techniques is that it can also be applied to non-stationary signals. The aim of this paper is to compare the three most common modal identification techniques for finding the modal properties (such as natural frequencies, mode shapes, and damping ratios) of a three-story steel frame built in the Concordia University lab, using ambient vibration. The frame is made of galvanized steel, 60 cm long, 27 cm wide, and 133 cm high, with no bracing along the long or short span. Three wired uniaxial accelerometers (MicroStrain, 100 mV/g sensitivity) were attached to the middle of each floor, and a gateway receives the data and sends them to the PC using the Node Commander software. Real-time monitoring was performed for 20 seconds at a 512 Hz sampling rate. The test was repeated 5 times in each direction using hand shaking and an impact hammer. CWT is able to detect the instantaneous frequency by means of a ridge detection method. In this paper, a partial-derivative ridge detection technique is applied to the local maxima of the time-frequency plane to detect the instantaneous frequency. The results extracted from all three methods were compared, demonstrating that CWT has better performance in terms of accuracy in a noisy environment. Modal parameters such as natural frequencies, damping ratios, and mode shapes are identified by all three methods.
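
A minimal FDD sketch is shown below: estimate the cross power spectral density (CPSD) matrix of the multi-channel acceleration data, take its singular value decomposition at each frequency line, and pick peaks of the first singular value as candidate natural frequencies. The two-channel synthetic signal is only a stand-in for the measured floor accelerations.

```python
# Minimal Frequency Domain Decomposition (FDD) sketch on synthetic two-channel data.
import numpy as np
from scipy.signal import csd, find_peaks

fs = 512.0                                   # sampling rate (Hz), as in the test
t = np.arange(0.0, 20.0, 1.0 / fs)
rng = np.random.default_rng(8)
# Two synthetic "floor" channels sharing modes near 3.2 Hz and 9.8 Hz
mode1 = np.sin(2 * np.pi * 3.2 * t)
mode2 = np.sin(2 * np.pi * 9.8 * t)
y = np.vstack([mode1 + 0.4 * mode2, 0.8 * mode1 - 0.6 * mode2])
y += 0.2 * rng.normal(size=y.shape)

n_ch = y.shape[0]
freqs, _ = csd(y[0], y[0], fs=fs, nperseg=1024)
G = np.zeros((freqs.size, n_ch, n_ch), dtype=complex)   # CPSD matrix per frequency
for i in range(n_ch):
    for j in range(n_ch):
        _, G[:, i, j] = csd(y[i], y[j], fs=fs, nperseg=1024)

# First singular value of the CPSD matrix at every frequency line
s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(freqs.size)])
peaks, _ = find_peaks(s1, height=0.1 * s1.max())
print("candidate natural frequencies (Hz):", np.round(freqs[peaks], 2))
```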

Keywords: ambient vibration, frequency domain decomposition, stochastic subspace identification, continuous wavelet transform

Procedia PDF Downloads 263
378 Optimal Portfolio of Multi-service Provision based on Stochastic Model Predictive Control

Authors: Yifu Ding, Vijay Avinash, Malcolm McCulloch

Abstract:

With the proliferation of decentralized energy systems, the UK power system allows small-scale entities such as microgrids (MGs) to tender multiple energy services, including energy arbitrage and frequency response (FR). However, such an entity must balance uncertain renewable generation and loads in real time and fulfill the provision requirements of contracted services continuously during the agreed time window; otherwise, it is penalized for under-delivered provision. To hedge against risks due to uncertainties and maximize the economic benefits, we propose a stochastic model predictive control (SMPC) framework to optimize its operation for multi-service provision. In contrast to previous works, we include a detailed economic degradation model of the lithium-ion battery to quantify the costs of different service provisions and to accurately describe the changing dynamics of the battery. Considering a set of load and generation scenarios and battery aging, we formulate a risk-averse cost function using conditional value at risk (CVaR). It aims to achieve the maximum expected net revenue while avoiding severe losses. The framework is demonstrated on a case study of a grid-tied PV-battery microgrid in the UK with real-life data. To highlight its performance, the framework is compared with the case without the degradation model and with the deterministic formulation.
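
The CVaR term used in such a risk-averse objective is straightforward to compute from scenario costs, as the sketch below shows: CVaR at level alpha is the expected cost of the worst (1 - alpha) fraction of scenarios. The scenario costs and the blending weight are invented for illustration, not taken from the paper's case study.

```python
# Sketch of a CVaR term in a risk-averse objective (illustrative numbers only).
import numpy as np

def cvar(costs, alpha=0.95):
    """Value at risk and conditional value at risk of sampled scenario costs."""
    costs = np.asarray(costs)
    var = np.quantile(costs, alpha)            # value at risk (alpha-quantile)
    tail = costs[costs >= var]                 # worst (1 - alpha) tail
    return var, tail.mean()

rng = np.random.default_rng(9)
scenario_costs = rng.gamma(shape=2.0, scale=50.0, size=5000)   # cost per window

var95, cvar95 = cvar(scenario_costs, alpha=0.95)
expected = scenario_costs.mean()
# A risk-averse objective can blend expectation and tail risk, e.g.:
beta = 0.3
risk_averse_cost = (1 - beta) * expected + beta * cvar95
print(f"E[cost] = {expected:.1f}, VaR95 = {var95:.1f}, CVaR95 = {cvar95:.1f}")
print(f"risk-averse objective (beta = {beta}): {risk_averse_cost:.1f}")
```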

Keywords: model predictive control (MPC), battery degradation, frequency response, microgrids

Procedia PDF Downloads 95
377 Analysis of Urban Rail Transit Station's Accessibility Reliability: A Case Study of Hangzhou Metro, China

Authors: Jin-Qu Chen, Jie Liu, Yong Yin, Zi-Qi Ju, Yu-Yao Wu

Abstract:

Increases in travel fare and station failures have a huge impact on passengers' travel. The accessibility reliability of Urban Rail Transit (URT) stations under increasing travel fare and station failure is analyzed in this paper. Firstly, passengers' travel paths are reconstructed based on stochastic user equilibrium and Automatic Fare Collection (AFC) data. Secondly, station importance is calculated by combining the LeaderRank algorithm and the Ratio of Station Affected Passenger Volume (RSAPV), and station accessibility evaluation indicators are proposed based on the analysis of passengers' travel characteristics. Thirdly, station accessibility under different scenarios is measured, and the rate of accessibility change is proposed as the station accessibility reliability indicator. Finally, the accessibility of Hangzhou metro stations is analyzed by the formulated models. The results show that Jinjiang station and Liangzhu station are the most important and the most convenient stations in the Hangzhou metro, respectively. Station failure has a huge impact on station accessibility, whereas the increase in travel fare does not. Stations on Hangzhou metro Line 1 have relatively poor accessibility reliability, and Fengqi Road station's accessibility reliability is the weakest. For the Hangzhou metro operating department, constructing new metro lines around Line 1 and preferentially protecting Line 1's stations can effectively improve the accessibility reliability of the Hangzhou metro.
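
The LeaderRank step used for station importance can be sketched on a toy directed network: a ground node is linked bidirectionally to every station, scores spread by a simple random walk, and the ground node's final score is redistributed evenly. The five-station network below is invented for illustration and is not the Hangzhou metro graph.

```python
# LeaderRank sketch for ranking node importance on a small directed graph.
import numpy as np

edges = [(0, 1), (1, 0), (1, 2), (2, 1), (2, 3), (3, 2), (3, 4), (4, 3), (1, 3)]
n = 5
g = n                                   # index of the ground node
adj = np.zeros((n + 1, n + 1))
for i, j in edges:
    adj[i, j] = 1.0
adj[g, :n] = adj[:n, g] = 1.0           # ground node <-> every station

out_deg = adj.sum(axis=1)
P = adj / out_deg[:, None]              # row-stochastic transition matrix

s = np.ones(n + 1)
s[g] = 0.0                              # stations start with score 1, ground with 0
for _ in range(1000):
    s_new = P.T @ s                     # s_i <- sum_j P[j, i] * s_j
    if np.allclose(s_new, s, atol=1e-10):
        break
    s = s_new

leader_rank = s[:n] + s[g] / n          # redistribute the ground node's score
order = np.argsort(-leader_rank)
print("station ranking by LeaderRank:", order, leader_rank.round(3))
```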

Keywords: automatic fare collection data, AFC, station’s accessibility reliability, stochastic user equilibrium, urban rail transit, URT

Procedia PDF Downloads 100