Search results for: time series data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 37389

37269 The Analogue of a Property of Pisot Numbers in Fields of Formal Power Series

Authors: Wiem Gadri

Abstract:

This study delves into the intriguing properties of Pisot and Salem numbers within the framework of formal Laurent series over finite fields, a domain where these numbers’ spectral characteristics, Λm(β) and lm(β), have yet to be fully explored. Utilizing a methodological approach that combines algebraic number theory with the analysis of power series, we extend the foundational work of Erdos, Joo, and Komornik to this new setting. Our research uncovers bounds for lm(β), revealing how these depend on the degree of the minimal polynomial of β and thus offering a novel characterization of Pisot and Salem formal power series. The findings significantly contribute to our understanding of these numbers, highlighting their distribution and properties in the context of formal power series. This investigation not only bridges number theory with formal power series analysis but also sets the stage for further interdisciplinary research in these areas.

Keywords: Pisot numbers, Salem numbers, formal power series over a finite field

Procedia PDF Downloads 25
37268 Impact of Workers’ Remittances on Poverty in Pakistan: A Time Series Analysis by ARDL

Authors: Syed Aziz Rasool, Ayesha Zaman

Abstract:

Poverty is one of the most important problems for any developing nation. Workers’ remittances and investment play a crucial role in the development of a country by reducing its poverty level. This research studies the relationship between workers’ remittances and poverty alleviation in Pakistan, focusing on whether remittances have a significant effect on poverty reduction. The study uses time series data for the period 1972-2013. An Autoregressive Distributed Lag (ARDL) model and an Error Correction Model (ECM) have been used to find the long-run and short-run relationships, respectively, between workers’ remittances and the poverty level. The inflow of remittances shows a significant and negative impact on the poverty level. Moreover, the coefficient of the error correction model, which captures the adjustment towards convergence, is highly significant and negative. According to this research, policy makers should focus on effective policies to attract more remittances. JEL Classification: J61
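
A minimal sketch of the kind of ARDL/ECM estimation described above, assuming statsmodels >= 0.13 and a hypothetical annual CSV with "poverty" and "remittances" columns for 1972-2013; this is illustrative, not the authors' code.

```python
import pandas as pd
from statsmodels.tsa.ardl import ardl_select_order, UECM

df = pd.read_csv("pakistan_poverty.csv", index_col="year")  # hypothetical file

# Choose ARDL lag orders by AIC (the abstract lists AIC/SC as criteria).
sel = ardl_select_order(df["poverty"], maxlag=4,
                        exog=df[["remittances"]], maxorder=4,
                        trend="c", ic="aic")
ardl_res = sel.model.fit()
print(ardl_res.summary())            # long-run (levels) relationship

# Unrestricted error-correction form: a significantly negative
# speed-of-adjustment coefficient indicates convergence to the long run.
uecm_res = UECM(df["poverty"], lags=2,
                exog=df[["remittances"]], order=2, trend="c").fit()
print(uecm_res.summary())
```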

Keywords: ECM, ARDL, AIC, SC

Procedia PDF Downloads 257
37267 Effects of Financial Development on Economic Growth in South Asia

Authors: Anupam Das

Abstract:

Although financial liberalization has been one of the most important policy prescriptions of international organizations like the World Bank and the IMF, the effect of financial liberalization on economic growth in developing countries is far from unanimous. Since the '80s, South Asian countries have made significant progress in liberalizing the financial sector. However, due to the unavailability of a sufficient number of time series observations, the relationship between economic growth and financial development has not been investigated adequately. We aim to fill this gap by examining time series data of five developing countries from the South Asian region: Bangladesh, India, Pakistan, Sri Lanka, and Nepal. Applying cointegration tests and Granger causality within the vector error correction model (VECM), we do not find unanimous evidence that financial development has a positive effect on economic growth. These results are helpful for developing countries which have been trying to liberalize the financial sector in recent decades.
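
A brief sketch of the cointegration and Granger-causality workflow described above, assuming a hypothetical per-country DataFrame with "gdp_growth" and "financial_development" columns; not the authors' code.

```python
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import select_coint_rank, VECM

df = pd.read_csv("bangladesh.csv", index_col="year")    # hypothetical file
data = df[["gdp_growth", "financial_development"]]

# Johansen trace test to determine the cointegration rank.
rank = select_coint_rank(data, det_order=0, k_ar_diff=2, method="trace")
print(rank.summary())

# Fit the VECM and test Granger causality within it.
vecm = VECM(data, k_ar_diff=2, coint_rank=rank.rank, deterministic="co").fit()
print(vecm.test_granger_causality(caused="gdp_growth",
                                  causing="financial_development").summary())
```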

Keywords: economic growth, financial development, Granger causality, South Asia

Procedia PDF Downloads 348
37266 Mediation of the Middle Eastern Crises and Economic Growth: An Application of Time Series Analysis

Authors: Gokhan Erkal, Gulsen Aydin, Muge Yuce, Lokman Sahin

Abstract:

This study aims to analyze the impacts of involvement in the mediation of conflicts in the Middle East from the perspective of the economic growth of the mediators. The Middle East is a highly volatile region of the world with rampant crises whose effects spill beyond its borders. Therefore, management and resolution of the conflicts in the region are of great significance. Mediation is an instrument used for abating violence and settling disputes, and recourse to mediation has grown considerably in recent years. However, for mediators, involvement in the mediation of the deadlocks in the Middle East is a daunting task. This study tries to shed light on the positive correlation between the economic growth of the mediator and the successful outcome of the mediation process, to provide motivation for mediators. To this end, first, it briefly introduces the conflicts ongoing in the region and their negative impacts. Second, the methodology, time series analysis, and the data to be used, the International Crisis Behavior Project data, are presented. Third, the empirical test is carried out and the findings are evaluated. The conclusion highlights the benefits of successful mediation for the economic growth of the mediators of Middle Eastern crises.

Keywords: international crises, mediation, Middle East, time series analysis

Procedia PDF Downloads 155
37265 Retail Strategy to Reduce Waste Keeping High Profit Utilizing Taylor's Law in Point-of-Sales Data

Authors: Gen Sakoda, Hideki Takayasu, Misako Takayasu

Abstract:

Waste reduction is a fundamental problem for sustainability. Methods for waste reduction with point-of-sales (POS) data are proposed, utilizing the knowledge of a recent econophysics study on a statistical property of POS data. Concretely, a non-stationary time series analysis method based on the particle filter is developed, which accounts for the anomalous fluctuation scaling known as Taylor's law. The method is extended to handle sales data that are incomplete because of stock-outs by introducing maximum likelihood estimation for censored data. A method for optimal stock determination that prices the cost of waste reduction is also proposed. This study focuses on examining the methods for large sales numbers, where Taylor's law is evident. Numerical analysis using aggregated POS data shows the effectiveness of the methods in reducing food waste while maintaining a high profit for large sales numbers. Moreover, pricing the cost of waste reduction reveals that a small profit loss realizes substantial waste reduction, especially when the proportionality constant of Taylor’s law is small. Specifically, around 1% profit loss realizes a halving of disposal when this constant equals 0.12, the actual value for the processed food items used in this research. The methods provide practical and effective solutions for waste reduction while keeping a high profit, especially for large sales numbers.
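
A small sketch of checking the Taylor's law (variance-mean) scaling that the method above relies on; the file and column names are hypothetical, not the authors' data.

```python
import numpy as np
import pandas as pd

pos = pd.read_csv("pos_daily_sales.csv")          # rows: item, date, units_sold
grp = pos.groupby("item")["units_sold"]
mean, var = grp.mean(), grp.var()

# Taylor's law: Var ~ c * Mean**beta; fit c and beta on a log-log scale.
mask = (mean > 0) & (var > 0)
beta, log_c = np.polyfit(np.log(mean[mask]), np.log(var[mask]), 1)
print(f"Taylor exponent beta = {beta:.2f}, prefactor c = {np.exp(log_c):.3f}")
```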

Keywords: food waste reduction, particle filter, point-of-sales, sustainable development goals, Taylor's law, time series analysis

Procedia PDF Downloads 105
37264 L1-Convergence of Modified Trigonometric Sums

Authors: Sandeep Kaur Chouhan, Jatinderdeep Kaur, S. S. Bhatia

Abstract:

The existence of sine and cosine series as a Fourier series, their L1-convergence seems to be one of the difficult question in theory of convergence of trigonometric series in L1-metric norm. In the literature so far available, various authors have studied the L1-convergence of cosine and sine trigonometric series with special coefficients. In this paper, we present a modified cosine and sine sums and criterion for L1-convergence of these modified sums is obtained. Also, a necessary and sufficient condition for the L1-convergence of the cosine and sine series is deduced as corollaries.

Keywords: conjugate Dirichlet kernel, Dirichlet kernel, L1-convergence, modified sums

Procedia PDF Downloads 326
37263 The Usage of Bridge Estimator for HEGY Seasonal Unit Root Tests

Authors: Huseyin Guler, Cigdem Kosar

Abstract:

The aim of this study is to propose the Bridge estimator for seasonal unit root tests. Seasonality is an important factor for many economic time series: some variables contain seasonal patterns, and forecasts that ignore important seasonal patterns have a high variance. Therefore, it is very important to eliminate seasonality in seasonal macroeconomic data. There are several methods to eliminate the impacts of seasonality in time series. One of them is filtering the data. However, this method leads to undesired consequences in unit root tests, especially if the data are generated by a stochastic seasonal process. Another method to eliminate seasonality is using seasonal dummy variables. Some seasonal patterns may result from stationary seasonal processes, which are modelled using seasonal dummies; but if the seasonal pattern varies and changes over time, so that the seasonal process is non-stationary, deterministic seasonal dummies are inadequate to capture it. It is not suitable to use seasonal dummies for modeling such seasonally non-stationary series. Instead, it is necessary to take seasonal differences if there are seasonal unit roots in the series. Different methods have been proposed in the literature to test for seasonal unit roots, such as the Dickey, Hasza, Fuller (DHF) and Hylleberg, Engle, Granger, Yoo (HEGY) tests. The HEGY test can also be used to test for seasonal unit roots at different frequencies (monthly, quarterly, and semiannual). Another issue in unit root tests is lag selection. Lagged dependent variables are added to the model in seasonal unit root tests, as in ordinary unit root tests, to overcome the autocorrelation problem. In this case, it is necessary to choose the lag length and determine any deterministic components (i.e., a constant and trend) first, and then use the proper model to test for seasonal unit roots. However, this two-step procedure might lead to size distortions and a lack of power in seasonal unit root tests. Recent studies show that Bridge estimators are good at selecting the optimal lag length while differentiating non-stationary from stationary models for non-seasonal data. The advantage of this estimator is the elimination of the two-step nature of conventional unit root tests, and this leads to a gain in size and power. In this paper, the Bridge estimator is proposed to test for seasonal unit roots in a HEGY model. A Monte Carlo experiment is conducted to determine the efficiency of this approach and to compare the size and power of this method with the HEGY test. Since the Bridge estimator performs well in model selection, our approach may lead to some gain in terms of size and power over the HEGY test.
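
A minimal sketch of the HEGY auxiliary regression for quarterly data, the construction the proposed test builds on; the lag length here is fixed for illustration rather than selected by a Bridge-type estimator, and the resulting statistics must be compared with HEGY (non-standard) critical values.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

def hegy_quarterly(y, p=1):
    y = pd.Series(y, dtype=float)
    y1 = y + y.shift(1) + y.shift(2) + y.shift(3)        # zero frequency
    y2 = -(y - y.shift(1) + y.shift(2) - y.shift(3))      # semiannual frequency
    y3 = -(y - y.shift(2))                                # annual frequency pair
    d4 = y - y.shift(4)                                   # seasonal difference
    X = pd.concat({"y1": y1.shift(1), "y2": y2.shift(1),
                   "y3_1": y3.shift(1), "y3_2": y3.shift(2)}, axis=1)
    for k in range(1, p + 1):                             # augmentation lags
        X[f"d4_lag{k}"] = d4.shift(k)
    X = sm.add_constant(X)
    data = pd.concat([d4.rename("d4"), X], axis=1).dropna()
    return sm.OLS(data["d4"], data.drop(columns="d4")).fit()

rng = np.random.default_rng(0)
res = hegy_quarterly(rng.standard_normal(200).cumsum(), p=2)
print(res.tvalues[["y1", "y2"]])          # t-statistics for the unit roots at 0 and pi
print(res.f_test("y3_1 = 0, y3_2 = 0"))   # joint F for the annual-frequency roots
```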

Keywords: bridge estimators, HEGY test, model selection, seasonal unit root

Procedia PDF Downloads 302
37262 Data Recording for Remote Monitoring of Autonomous Vehicles

Authors: Rong-Terng Juang

Abstract:

Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars might not appear in the near future. To speed the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks for remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, controller area network (CAN) data are mapped onto a time-data two-dimensional space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data.
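
A small sketch of the "differential sampling plus entropy coding" idea above: delta-encode a CAN-like integer signal and Huffman-code the residuals. This is an illustrative reconstruction with a made-up signal, not the authors' implementation.

```python
import heapq
from collections import Counter

def delta_encode(values):
    return [values[0]] + [b - a for a, b in zip(values, values[1:])]

def huffman_code(symbols):
    # Build a Huffman code book {symbol: bitstring} from symbol frequencies.
    heap = [[freq, i, {sym: ""}] for i, (sym, freq) in enumerate(Counter(symbols).items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, [f1 + f2, tiebreak, merged])
        tiebreak += 1
    return heap[0][2]

speeds = [1200, 1201, 1201, 1203, 1202, 1202, 1204, 1204]   # hypothetical CAN signal
deltas = delta_encode(speeds)
book = huffman_code(deltas)
bits = "".join(book[d] for d in deltas)
print(deltas, book, f"{len(bits)} bits vs {len(speeds) * 16} bits raw")
```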

Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar

Procedia PDF Downloads 133
37261 Localization of Geospatial Events and Hoax Prediction in the UFO Database

Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi

Abstract:

Unidentified Flying Objects (UFOs) have been an interesting topic for many enthusiasts, and people all over the United States report such sightings online at the National UFO Report Center (NUFORC). Some of these reports are hoaxes. Among those that seem legitimate, our task is not to establish that these events are indeed related to flying objects from aliens in outer space; rather, we intend to identify whether a report was a hoax, as labelled by the UFO database team with their existing curation criteria. The database provides a wealth of information that can be exploited for various analyses and insights, such as social reporting, identifying real-time spatial events, and much more. We perform analysis to localize these time-series geospatial events and correlate them with known real-time events. This paper does not confirm any legitimacy of alien activity, but rather attempts to gather information from likely legitimate reports of UFOs by studying the online reports. These events happen in geospatial clusters and are also time-based. We look at cluster density and data visualization to search the space of cluster realizations and decide on the most probable clusters, which provide us information about the proximity of such activity. A random forest classifier is also presented that is used to identify true events and hoax events, using the best features available, such as region, week, time period and duration. Lastly, we show the performance of the scheme on various days and correlate with real-time events, where one of the UFO reports strongly correlates with a missile test conducted in the United States.
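
A brief sketch of the hoax classifier described above, assuming a hypothetical table of reports with the features named in the abstract (region, week, time period, duration) and a curated "hoax" label.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

reports = pd.read_csv("nuforc_reports.csv")        # hypothetical file
X = pd.get_dummies(reports[["region", "week", "time_period", "duration"]])
y = reports["hoax"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=42, stratify=y)
clf = RandomForestClassifier(n_estimators=300, random_state=42).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```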

Keywords: time-series clustering, feature extraction, hoax prediction, geospatial events

Procedia PDF Downloads 348
37260 Review of Friction Stir Welding of Dissimilar 5000 and 6000 Series Aluminum Alloy Plates

Authors: K. Subbaiah

Abstract:

Friction stir welding is a solid state welding process that eliminates the defects found in fusion welding processes. It is an environmentally friendly process. 5000 and 6000 series aluminum alloys are widely used in the transportation industries. The Al-Mg-Mn (5000 series) and Al-Mg-Si (6000 series) alloys offer the best combination of properties for marine construction. The medium-strength, highly corrosion-resistant 5000 series alloys are among the most widely used aluminum alloys in the world. In this review, the tool pin profile, process parameters, mechanical properties such as hardness, yield strength and tensile strength, and microstructural evolution of friction stir welding of 5000 series (Al-Mg) and 6000 series (Al-Mg-Si) alloys are discussed.

Keywords: 5000 series and 6000 series Al alloys, friction stir welding, tool pin profile, microstructure and properties

Procedia PDF Downloads 434
37259 Forecasting Model to Predict Dengue Incidence in Malaysia

Authors: W. H. Wan Zakiyatussariroh, A. A. Nasuhar, W. Y. Wan Fairos, Z. A. Nazatul Shahreen

Abstract:

Forecasting dengue incidence in a population can provide useful information to facilitate the planning of public health interventions. Many studies on dengue cases in Malaysia have been conducted, but few model the outbreak and forecast incidence. This article attempts to propose the most appropriate time series model to explain the behavior of dengue incidence in Malaysia for the purpose of forecasting future dengue outbreaks. Several seasonal auto-regressive integrated moving average (SARIMA) models were developed to model Malaysia's weekly dengue incidence using data collected from January 2001 to December 2011. The SARIMA(2,1,1)(1,1,1)52 model was found to be the most suitable model for Malaysia's dengue incidence, with the lowest Akaike information criterion (AIC) and Bayesian information criterion (BIC) for in-sample fitting. The models were further evaluated for out-of-sample forecast accuracy using four different accuracy measures. The results indicate that SARIMA(2,1,1)(1,1,1)52 performed well for both in-sample fitting and out-of-sample evaluation.
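
A minimal sketch of fitting the reported SARIMA(2,1,1)(1,1,1)52 specification with statsmodels, assuming a hypothetical weekly series of dengue cases; not the authors' code or data.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

cases = pd.read_csv("dengue_weekly.csv", index_col="week",
                    parse_dates=True)["cases"]          # hypothetical file

model = SARIMAX(cases, order=(2, 1, 1), seasonal_order=(1, 1, 1, 52),
                enforce_stationarity=False, enforce_invertibility=False)
fit = model.fit(disp=False)
print(fit.aic, fit.bic)                  # in-sample criteria used in the study
print(fit.forecast(steps=12))            # 12-week out-of-sample forecast
```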

Keywords: time series modeling, Box-Jenkins, SARIMA, forecasting

Procedia PDF Downloads 451
37258 A Long Short-Term Memory Based Deep Learning Model for Corporate Bond Price Predictions

Authors: Vikrant Gupta, Amrit Goswami

Abstract:

The fixed income market forms the basis of the modern financial market; all other assets in financial markets derive their value from the bond market. Owing to its over-the-counter nature, the corporate bond market has relatively little data publicly available and is thus researched far less than equities. Bond price prediction is a complex financial time series forecasting problem and is considered very crucial in the domain of finance. Bond prices are highly volatile and full of noise, which makes it very difficult for traditional statistical time-series models to capture the complexity of the series patterns, leading to inefficient forecasts. To overcome the inefficiencies of statistical models, various machine learning techniques were initially used in the literature for more accurate forecasting of time series. However, simple machine learning methods such as linear regression, support vector machines and random forests fail to provide efficient results when tested on highly complex sequences such as stock prices and bond prices. Hence, to capture these intricate sequence patterns, various deep learning-based methodologies have been discussed in the literature. In this study, a recurrent neural network-based deep learning model using long short-term memory (LSTM) networks for the prediction of corporate bond prices is discussed. LSTM networks have been widely used in the literature for sequence learning tasks in domains such as machine translation and speech recognition. In recent years, various studies have discussed the effectiveness of LSTMs in forecasting complex time-series sequences and have shown promising results compared to other methodologies. LSTMs are a special kind of recurrent neural network capable of learning long-term dependencies through their memory cells, dependencies which traditional neural networks fail to capture. In this study, a simple LSTM, a stacked LSTM and a masked LSTM based model are discussed with respect to varying input sequences (three days, seven days and 14 days). In order to facilitate faster learning and to gradually decompose the complexity of the bond price sequence, an Empirical Mode Decomposition (EMD) has been used, which improves the accuracy of the standalone LSTM model. With a variety of technical indicators and the EMD-decomposed time series, the masked LSTM outperformed the other two counterparts in terms of prediction accuracy. To benchmark the proposed model, the results have been compared with traditional time series models (ARIMA), shallow neural networks and the three LSTM models discussed above. In summary, our results show that LSTM models provide more accurate results and should be explored more within the asset management industry.
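
A compact sketch of a single-layer LSTM forecaster of the kind discussed above (sliding windows of past prices predicting the next price), assuming TensorFlow/Keras is available; the data file and window length are hypothetical, and the EMD and technical-indicator preprocessing are omitted.

```python
import numpy as np
import pandas as pd
from tensorflow import keras

prices = pd.read_csv("bond_prices.csv")["price"].to_numpy(dtype="float32")
window = 14                                     # 14-day input sequence
X = np.stack([prices[i:i + window] for i in range(len(prices) - window)])
y = prices[window:]
X = X[..., None]                                # (samples, timesteps, features)

model = keras.Sequential([
    keras.layers.LSTM(64, input_shape=(window, 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=20, batch_size=32, validation_split=0.2, verbose=0)
print(model.predict(X[-1:]).ravel())            # next-day price estimate
```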

Keywords: bond prices, long short-term memory, time series forecasting, empirical mode decomposition

Procedia PDF Downloads 109
37257 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a real-time visualization technique and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the used data structure and data management, which enables real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer without loss of quality. The filtering process is done in two steps and is entirely executed on the GPU and implemented using programmable shaders.

Keywords: filtering, graphics, level-of-detail, LiDAR, real-time visualization

Procedia PDF Downloads 276
37256 Determining Abnormal Behaviors in UAV Robots for Trajectory Control in Teleoperation

Authors: Kiwon Yeom

Abstract:

Change points are abrupt variations in a data sequence. Detection of change points is useful in modeling, analyzing, and predicting time series in application areas such as robotics and teleoperation. In this paper, a change point is defined to be a discontinuity in one of its derivatives. This paper presents a reliable method for detecting discontinuities within a three-dimensional trajectory data. The problem of determining one or more discontinuities is considered in regular and irregular trajectory data from teleoperation. We examine the geometric detection algorithm and illustrate the use of the method on real data examples.
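
A small sketch of the idea above: flag change points as discontinuities in the first derivative of a 3-D trajectory. The threshold and sampling step are illustrative choices, not the values used in the paper.

```python
import numpy as np

def derivative_change_points(traj, dt=0.01, thresh=5.0):
    """traj: (N, 3) array of x, y, z samples; returns indices of abrupt jumps."""
    vel = np.gradient(traj, dt, axis=0)               # first derivative
    jump = np.linalg.norm(np.diff(vel, axis=0), axis=1)
    return np.where(jump > thresh * np.median(jump))[0] + 1

t = np.linspace(0, 1, 200)
traj = np.column_stack([t, t**2, np.zeros_like(t)])
traj[120:, 1] += 0.05                                  # injected discontinuity
print(derivative_change_points(traj))
```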

Keywords: change point, discontinuity, teleoperation, abrupt variation

Procedia PDF Downloads 142
37255 Analysing Time Series for a Forecasting Model of the Dynamics of Aedes aegypti Population Size

Authors: Flavia Cordeiro, Fabio Silva, Alvaro Eiras, Jose Luiz Acebal

Abstract:

Aedes aegypti is present in the tropical and subtropical regions of the world and is a vector of several diseases such as dengue fever, yellow fever, chikungunya, and Zika. The growth in the number of arbovirus cases in the last decades has become a matter of great concern worldwide. Meteorological factors like mean temperature and precipitation are known to influence infestation by the species through effects on physiology and ecology, altering the fecundity, mortality, lifespan, dispersion behaviour and abundance of the vector. Models able to describe the dynamics of the vector population size should therefore take into account the meteorological variables. The relationship between meteorological factors and the population dynamics of Ae. aegypti adult females is studied to provide a good set of predictors to model the dynamics of the mosquito population size. Time-series data on captures of adult females from a public health surveillance program in the city of Lavras, MG, Brazil, were analysed for their association with precipitation, humidity and temperature through a set of statistical methods for time series analysis commonly adopted in signal processing, information theory and neuroscience. Cross-correlation, a multicollinearity test and whitened cross-correlation were applied to determine at which time lags the meteorological variables influence the dynamics of mosquito abundance. Among the findings, the studied case indicated strong collinearity between humidity and precipitation, and precipitation was selected to form a pair of descriptors together with temperature. In the techniques used, significant associations were observed between infestation indicators and both temperature and precipitation in the short, mid and long terms, evincing that those variables should be considered in entomological models and as public health indicators. A descriptive model used to test the results exhibits a strong correlation to the data.
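
A brief sketch of the lag analysis described above: variance inflation factors to detect collinearity among the meteorological predictors, and a lagged cross-correlation between a predictor and the weekly capture counts. The file and column names are hypothetical.

```python
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

df = pd.read_csv("lavras_weekly.csv")   # columns: captures, temp, precip, humidity
X = df[["temp", "precip", "humidity"]]
vif = {c: variance_inflation_factor(X.values, i) for i, c in enumerate(X.columns)}
print("VIF:", vif)                      # large values flag collinear predictors

def lagged_corr(x, y, max_lag=12):
    # correlation between x at time t and y at time t - k, for k = 0..max_lag
    return {k: x.corr(y.shift(k)) for k in range(max_lag + 1)}

print("captures vs lagged precipitation:",
      lagged_corr(df["captures"], df["precip"]))
```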

Keywords: Aedes aegypti, cross-correlation, multicollinearity, meteorological variables

Procedia PDF Downloads 151
37254 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area

Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim

Abstract:

In intelligent transportation systems (ITS), information on link characteristics is an essential factor for planning and operation. In practice, however, not every link has sensors installed on it. A link that does not have data on it is called a 'missing link'. The purpose of this study is to impute the data of these missing links. To obtain these data, this study applies machine learning, in particular deep learning, so that missing link data can be estimated from the data of present links. For the deep learning process, this study uses a recurrent neural network to handle time-series road data. As input data, Dedicated Short-Range Communications (DSRC) data from Dalgubul-daero in the Daegu Metropolitan Area were fed into the learning process. The neural network takes 17 links with present data as input and, through 2 hidden layers, estimates the data of 1 missing link. As a result, the estimated data for the target link show about 94% accuracy compared with the actual data.

Keywords: data estimation, link data, machine learning, road network

Procedia PDF Downloads 487
37253 Coefficients of Some Double Trigonometric Cosine and Sine Series

Authors: Jatinderdeep Kaur

Abstract:

In this paper, the results of Kano for one-dimensional cosine and sine series are extended to two-dimensional cosine and sine series. To extend these results, some classes of coefficient sequences, such as the class of semi-convexity and class R, are extended from one dimension to two dimensions. Under these extended classes, it is checked whether the function f(x,y) is a two-dimensional Fourier cosine or sine series, or, equivalently, whether it represents an integrable function. Further, some results are obtained which generalize Moricz's results.

Keywords: conjugate dirichlet kernel, conjugate fejer kernel, fourier series, semi-convexity

Procedia PDF Downloads 410
37252 Closed Forms of Trigonometric Series in Terms of Riemann’s ζ Function and Dirichlet η, λ, β Functions or the Hurwitz Zeta Function and Harmonic Numbers

Authors: Slobodan B. Tričković

Abstract:

We present results concerning trigonometric series that include sine and cosine functions with a parameter appearing in the denominator. We derive two types of closed-form formulas for such trigonometric series. First, for some integer values of the parameter, since Riemann’s ζ function and Dirichlet’s η and λ functions equal zero at negative even integers, whereas Dirichlet’s β function equals zero at negative odd integers, the series terminates after a certain number of members. Thus, a trigonometric series becomes a polynomial with coefficients involving Riemann’s ζ function and the Dirichlet η, λ, β functions. On the other hand, in some cases one cannot immediately replace the parameter with a positive integer because singularities are encountered. It is then necessary to take a limit; in the process, we apply L’Hospital’s rule and, after a series of rearrangements, bring the trigonometric series to a form suitable for the application of Choi-Srivastava’s theorem dealing with Hurwitz’s zeta function and harmonic numbers. In this way, we express a trigonometric series as a polynomial over the derivative of Hurwitz’s zeta function.

Keywords: Dirichlet eta lambda beta functions, Riemann's zeta function, Hurwitz zeta function, Harmonic numbers

Procedia PDF Downloads 69
37251 A Bayesian Population Model to Estimate Reference Points of Bombay-Duck (Harpadon nehereus) in the Bay of Bengal, Bangladesh, Using CMSY and BSM

Authors: Ahmad Rabby

Abstract:

This study presents demographic trend analyses of Bombay-duck from time series catch data using CMSY and BSM for the first time in Bangladesh. During 2000-2018, CMSY indicates the lowest production in 2000 and the highest in 2018; this has been used in the estimation of prior biomass by the default rules. The CMSY analysis found 31,030 viable trajectories for 3,422 r-k pairs, and the final estimate for the intrinsic rate of population increase (r) was 1.19 year⁻¹ (95% CL = 0.957-1.48 year⁻¹). The carrying capacity (k) of Bombay-duck was 283×10³ tons (95% CL = 173×10³-464×10³ tons) and MSY was 84.3×10³ tons year⁻¹ (95% CL = 49.1×10³-145×10³ tons year⁻¹). Results from the Bayesian state-space implementation of the Schaefer production model (BSM), using catch and CPUE data, found that the catchability coefficient (q) was 1.63×10⁻⁶ (lcl = 1.27×10⁻⁶ to ucl = 2.10×10⁻⁶), r = 1.06 year⁻¹ (95% CL = 0.727-1.55 year⁻¹), k was 226×10³ tons (95% CL = 170×10³-301×10³ tons) and MSY was 60×10³ tons year⁻¹ (95% CL = 49.9×10³-72.2×10³ tons year⁻¹). For Bombay-duck fishery management, the BSM assessment based on the time series catch data gave Fmsy = 0.531 (95% CL = 0.364-0.775; Fmsy = 0.5r if B > 1/2 Bmsy, with r and Fmsy linearly reduced if B < 1/2 Bmsy). Biomass in 2018 was 110×10³ tons (2.5th to 97.5th percentile = 82.3-155×10³ tons). Relative biomass (B/Bmsy) in the last year was 0.972 (2.5th to 97.5th percentile = 0.728-1.37). Fishing mortality in the last year was 0.738 (2.5th to 97.5th percentile = 0.525-1.37), and exploitation F/Fmsy was 1.39 (2.5th to 97.5th percentile = 0.988-1.86). The biological reference point B/Bmsy was smaller than 1.0 while F/Fmsy was higher than 1.0, revealing over-exploitation of the fishery and indicating that more conservative management strategies are required for the Bombay-duck fishery.
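
A small sketch of the Schaefer surplus-production dynamics underlying CMSY/BSM, projected with the point estimates reported above (r = 1.06 year⁻¹, k = 226×10³ tons); the catch series is illustrative, not the assessment data.

```python
import numpy as np

def schaefer_project(b0, r, k, catches):
    """Biomass trajectory under B[t+1] = B[t] + r*B[t]*(1 - B[t]/k) - C[t]."""
    b = [b0]
    for c in catches:
        b.append(max(b[-1] + r * b[-1] * (1 - b[-1] / k) - c, 1e-6))
    return np.array(b)

r, k = 1.06, 226e3
msy, bmsy, fmsy = r * k / 4, k / 2, r / 2        # Schaefer reference points
print(f"MSY = {msy:,.0f} t, Bmsy = {bmsy:,.0f} t, Fmsy = {fmsy:.2f}")

biomass = schaefer_project(b0=110e3, r=r, k=k, catches=[80e3] * 10)
print("B/Bmsy after 10 years of 80,000 t catches:", biomass[-1] / bmsy)
```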

Keywords: biological reference points, catchability coefficient, carrying capacity, intrinsic rate of population increase

Procedia PDF Downloads 104
37250 Forecasting Container Throughput: Using Aggregate or Terminal-Specific Data?

Authors: Gu Pang, Bartosz Gebka

Abstract:

We forecast the demand for total container throughput at Indonesia’s largest seaport, Tanjung Priok Port. We propose four univariate forecasting models: SARIMA, the additive Seasonal Holt-Winters, the multiplicative Seasonal Holt-Winters and the Vector Error Correction Model. Our aim is to provide insights into whether forecasting total container throughput from the historical aggregated port throughput time series is superior to forecasting the total throughput by summing up the best individual terminal forecasts. We test the monthly port and individual terminal container throughput time series between 2003 and 2013. The performance of the forecasting models is evaluated based on Mean Absolute Error and Root Mean Squared Error. Our results show that the multiplicative Seasonal Holt-Winters model produces the most accurate forecasts of total container throughput, whereas SARIMA generates the worst in-sample model fit. The Vector Error Correction Model provides the best model fits and forecasts for individual terminals. Our results show that the total container throughput forecasts based on modelling the total throughput time series are consistently better than those obtained by combining the forecasts generated by terminal-specific models. The forecasts of total throughput until the end of 2018 provide an essential insight for strategic decision-making on the expansion of the port's capacity and the construction of new container terminals at Tanjung Priok Port.
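
A minimal sketch of the multiplicative Seasonal Holt-Winters forecast that performed best for total throughput, assuming a hypothetical monthly series; terminal-level models would be fitted analogously and their forecasts summed for the comparison described above.

```python
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing
from sklearn.metrics import mean_absolute_error

throughput = pd.read_csv("tanjung_priok_monthly.csv", index_col="month",
                         parse_dates=True)["teu"]       # hypothetical file
train, test = throughput[:"2011"], throughput["2012":]

hw = ExponentialSmoothing(train, trend="add", seasonal="mul",
                          seasonal_periods=12).fit()
forecast = hw.forecast(len(test))
print("MAE:", mean_absolute_error(test, forecast))
print("RMSE:", ((test - forecast) ** 2).mean() ** 0.5)
```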

Keywords: SARIMA, Seasonal Holt-Winters, Vector Error Correction Model, container throughput

Procedia PDF Downloads 477
37249 Comparative Analysis of the Third Generation of Reanalysis Data for Evaluation of Solar Energy Potential

Authors: Claudineia Brazil, Elison Eduardo Jardim Bierhals, Luciane Teresa Salvi, Rafael Haag

Abstract:

Renewable energy sources are dependent on climatic variability, so adequate energy planning requires observations of the meteorological variables, preferably as long-period series. Despite the scientific and technological advances that meteorological measurement systems have undergone in the last decades, there is still a considerable lack of meteorological observations forming long-period series. Reanalysis is a data assimilation system built on general atmospheric circulation models that combines data collected at surface stations, ocean buoys, satellites and radiosondes, allowing the production of long-period data for a wide range of variables. The third generation of reanalysis data emerged in 2010; among them is the Climate Forecast System Reanalysis (CFSR) developed by the National Centers for Environmental Prediction (NCEP), whose data have a spatial resolution of 0.5° x 0.5°. This study aims to evaluate the performance of solar radiation estimation from alternative databases, such as reanalysis data and meteorological satellite data, which can satisfactorily compensate for the absence of solar radiation observations at the global and/or regional level. The analysis of the solar radiation data indicated that the reanalysis data of the CFSR model performed well relative to the observed data, with a coefficient of determination around 0.90. Therefore, it is concluded that these data have the potential to be used as an alternative source in locations without stations or long series of solar radiation observations, which is important for the evaluation of solar energy potential.

Keywords: climate, reanalysis, renewable energy, solar radiation

Procedia PDF Downloads 189
37248 Fuzzy Time Series-Markov Chain Method for Corn and Soybean Price Forecasting in North Carolina Markets

Authors: Selin Guney, Andres Riquelme

Abstract:

One of the main purposes of optimal and efficient forecasts of agricultural commodity prices is to guide firms in economic decision making, such as planning business operations and marketing decisions. Governments are also beneficiaries and suppliers of agricultural price forecasts; they use this information to establish proper agricultural policy, hence the forecasts affect social welfare, and systematic errors in forecasts could lead to a misallocation of scarce resources. Various empirical approaches with different methodologies have been applied to forecast commodity prices. The most commonly used approaches to forecast commodity sectors depend on classical time series models that assume the values of the response variables are precise, which is quite often not true in reality. Recently, this literature has largely evolved towards fuzzy time series models, which provide more flexibility with respect to the assumptions of classical time series models, such as stationarity and large sample size requirements. Besides, the fuzzy modeling approach allows decision making with estimated values under incomplete information or uncertainty. A number of fuzzy time series models have been developed and implemented over the last decades; however, most of them are not appropriate for forecasting repeated and non-consecutive transitions in the data. The modeling scheme used in this paper eliminates this problem by introducing a Markov modeling approach that takes into account both repeated and non-consecutive transitions. Also, the determination of the interval length is crucial for the accuracy of the forecasts. The problem of determining the interval length arbitrarily is overcome, and a methodology is proposed to determine the proper interval length based on the distribution or mean of the first differences of the series in order to improve forecast accuracy. The specific purpose of this paper is to propose and investigate the potential of a new forecasting model that integrates this methodology for determining the proper interval length with a Fuzzy Time Series-Markov Chain model. Moreover, the forecasting accuracy of the proposed integrated model is compared to different univariate time series models, and the superiority of the proposed method over competing methods is demonstrated on the basis of forecast evaluation criteria. The application is to daily corn and soybean prices observed at three commercially important North Carolina markets: Candor, Cofield and Roaring River for corn, and Fayetteville, Cofield and Greenville City for soybeans, respectively. One main conclusion from this paper is that using fuzzy logic improves forecast performance and accuracy; the effectiveness and potential benefits of the proposed model are confirmed by small selection criteria values such as MAPE. The paper concludes with a discussion of the implications of integrating fuzzy logic and non-arbitrary determination of the interval length for the reliability and accuracy of price forecasts. The empirical results represent a significant contribution to our understanding of the applicability of fuzzy modeling in commodity price forecasts.
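
A compact sketch in the spirit of the fuzzy time series-Markov chain approach described above: the interval length is derived from the mean absolute first difference, observations are fuzzified to intervals, and one-step forecasts use the Markov transition matrix between intervals. The price series is made up and the scheme is a simplified illustration, not the authors' exact model.

```python
import numpy as np

def fts_markov_forecast(prices):
    prices = np.asarray(prices, dtype=float)
    width = np.abs(np.diff(prices)).mean()               # data-driven interval length
    edges = np.arange(prices.min(), prices.max() + width, width)
    mids = (edges[:-1] + edges[1:]) / 2
    states = np.clip(np.digitize(prices, edges) - 1, 0, len(mids) - 1)

    # Transition counts over fuzzy intervals (handles repeated transitions).
    counts = np.zeros((len(mids), len(mids)))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1
    P = counts / row_sums

    # One-step-ahead forecast from the last state: probability-weighted midpoints.
    last = states[-1]
    return mids @ P[last] if counts[last].sum() > 0 else mids[last]

corn = [6.10, 6.15, 6.08, 6.20, 6.25, 6.22, 6.30, 6.28, 6.35]   # hypothetical $/bu
print("next-day forecast:", round(fts_markov_forecast(corn), 3))
```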

Keywords: commodity, forecast, fuzzy, Markov

Procedia PDF Downloads 198
37247 Time Series Modelling for Forecasting Wheat Production and Consumption of South Africa in Time of War

Authors: Yiseyon Hosu, Joseph Akande

Abstract:

Wheat has been one of the most important staple food grains for humans for centuries and is widely consumed in South Africa. It has a special place in the South African economy because of its significance for food security, trade, and industry. This paper models and forecasts the production and consumption of wheat in South Africa in the time of COVID-19 and the ongoing Russia-Ukraine war, using annual time series data from 1940-2021 and ARIMA models. Both the averaging forecast and the selected models' forecasts indicate a possible increase in production. The minimum and maximum growth in production is projected to be between 3 million and 10 million tons, respectively. However, the models also forecast a possible decline in consumption in South Africa. Although COVID-19 and the war between Ukraine and Russia, two major producers and exporters of global wheat, are currently affecting price volatility, wheat production in South Africa is expected to increase, meet consumption demand, and provide an opportunity for increased exports relative to domestic consumption. Forecasting the production and consumption behaviour of major crops plays an important role in food and nutrition security; these findings can assist policymakers and provide them with insights into the production and pricing policy of wheat in South Africa.

Keywords: ARIMA, food security, price volatility, staple food, South Africa

Procedia PDF Downloads 71
37246 Volatility and Stylized Facts

Authors: Kalai Lamia, Jilani Faouzi

Abstract:

Measuring and controlling risk is one of the most important issues in finance. With the persistence of uncontrolled and erratic stock movements, volatility is perceived as a barometer of daily fluctuations, so an objective measure of this variable is needed to control risks and cover those considered most important. Non-linear autoregressive modeling is our first evaluation approach. In particular, we test for the presence of 'persistence' in the conditional variance and for a degree of leverage effect. In order to address the 'asymmetry' in volatility, the retained specifications point to the importance of stock reactions in response to news. The effects of shocks on volatility also highlight the need to study the long-term behaviour of the conditional variance of stock returns and to account for the presence of long memory and dependence of the time series in the long run. We note that the fractionally integrated autoregressive model allows for representing time series that show long-term conditional variance thanks to fractional integration parameters. In order to capture the dynamics that govern the time series, a comparative study of the results of the different models allows for a better understanding of the volatility structure of the Tunisian stock market, with the aim of accurately predicting fluctuation risks.
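
A brief sketch of testing volatility persistence and the leverage effect with an asymmetric GARCH specification, assuming the `arch` package and a hypothetical return series for a Tunisian stock index; the asymmetric model here (GJR-GARCH) is one common choice, not necessarily the exact specification used in the paper.

```python
import pandas as pd
from arch import arch_model

returns = 100 * pd.read_csv("tunindex.csv", index_col="date",
                            parse_dates=True)["close"].pct_change().dropna()

# GJR-GARCH(1,1): the o=1 term captures asymmetry, i.e. a leverage effect if
# its coefficient (gamma) is significantly positive.
res = arch_model(returns, vol="GARCH", p=1, o=1, q=1, dist="t").fit(disp="off")
print(res.summary())
print("persistence ~ alpha + beta + gamma/2 =",
      res.params["alpha[1]"] + res.params["beta[1]"] + res.params["gamma[1]"] / 2)
```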

Keywords: asymmetry volatility, clustering, stylised facts, leverage effect

Procedia PDF Downloads 276
37245 Entropy-Based Multichannel Stationary Measure for Characterization of Non-Stationary Patterns

Authors: J. D. Martínez-Vargas, C. Castro-Hoyos, G. Castellanos-Dominguez

Abstract:

In this work, we propose a novel approach for measuring the stationarity level of a multichannel time series. This measure is based on a definition of stationarity over the time-varying spectrum, and it aims to quantify the relation between local stationarity (single-channel) and global dynamic behavior (multichannel dynamics). To assess the validity of the proposed approach, we use a well-known EEG-BCI database constructed to separate motor/imagery tasks. Based on the premise that imagining movements implies an increase in EEG dynamics, we use as discriminant features the proposed measure computed over an estimate of the non-stationary components of the input time series. As a measure of separability we use a Student's t-test, and the obtained results show that the measure is able to accurately detect the brain areas, projected on the scalp, where motor tasks are realized.

Keywords: stationary measure, entropy, sub-space projection, multichannel dynamics

Procedia PDF Downloads 377
37244 Implementation in Python of a Method to Transform One-Dimensional Signals into Graphs

Authors: Luis Andrey Fajardo Fajardo

Abstract:

We are immersed in complex systems. The human brain, the galaxies, and snowflakes are examples of complex systems. An area of interest in complex systems is chaos theory, a revolutionary field of science that presents different ways of study than determinism and reductionism. Here, in conjunction with nonlinear DSP, chaos theory offers valuable techniques that establish a link between time series and complexity theory in terms of complex networks, so that signals can be studied from the perspective of graph theory. Recently, a method to transform time series into graphs has been proposed, but no suitable implementation in Python had been developed for signals extracted from chaotic or complex systems. That is why we propose an implementation in Python of an existing method to transform one-dimensional chaotic signals from the time domain to the graph domain, together with some measures that may reveal information not extracted in the time domain.
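
One widely used mapping from a one-dimensional signal to a graph is the natural visibility graph; the abstract does not name the exact method it implements, so the sketch below (using networkx and a logistic-map signal) is only illustrative of the time-domain-to-graph-domain idea.

```python
import networkx as nx
import numpy as np

def natural_visibility_graph(x):
    """Nodes are samples; i and j are linked if the straight segment between
    them stays above every intermediate sample (the visibility criterion)."""
    n, g = len(x), nx.Graph()
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                g.add_edge(i, j)
    return g

# Chaotic test signal: logistic map at r = 4.
x = [0.4]
for _ in range(199):
    x.append(4 * x[-1] * (1 - x[-1]))

g = natural_visibility_graph(np.array(x))
degrees = [d for _, d in g.degree()]
print("mean degree:", np.mean(degrees), "max degree:", max(degrees))
```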

Keywords: Python, complex systems, graph theory, dynamical systems

Procedia PDF Downloads 484
37243 Comparison of Rainfall Trends in the Western Ghats and Coastal Region of Karnataka, India

Authors: Vinay C. Doranalu, Amba Shetty

Abstract:

In recent years, due to climate change, there has been large variation in the spatial distribution of daily rainfall, even within small regions. Rainfall is one of the main climatic variables affecting the spatio-temporal patterns of water availability. The real task posed by climate change is the identification, estimation and understanding of the uncertainty of rainfall. This study analyzes the spatial variations and temporal trends of daily precipitation using high resolution (0.25º x 0.25º) gridded data of the Indian Meteorological Department (IMD). For the study, 38 grid points were selected in the study area, and their daily precipitation time series (113 years) were analyzed over the period 1901-2013. Grid points were divided into two zones based on elevation and location: Low Land (exposed to the sea, low elevation, coastal region) and High Land (interior from the sea, high elevation, Western Ghats). The time series at each grid point were examined for temporal trends using the non-parametric Mann-Kendall test and the Theil-Sen estimator, to determine the direction of the trend and the magnitude of its slope. The Pettitt-Mann-Whitney test was applied to detect the most probable change point in the trends over the time period. The results reveal remarkable monotonic trends in each grid point's daily precipitation series. In general, the regional cluster analysis found an increasing precipitation trend in the shoreline region and a decreasing trend in the Western Ghats in recent years. The spatial distribution of rainfall can be partly explained by the heterogeneity in the temporal trends of rainfall revealed by the change point analysis. The Mann-Kendall test shows significant variation, with weaker rainfall over the eastern parts of the Western Ghats region of Karnataka.
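
A small sketch of the trend tests named above applied to one grid point's annual rainfall series: a Mann-Kendall-style trend test via Kendall's tau, the Theil-Sen slope, and a simple Pettitt change-point statistic. The series is synthetic, for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
years = np.arange(1901, 2014)
rain = 3000 + 2.0 * (years - 1901) + rng.normal(0, 150, len(years))  # mm/year

tau, p_mk = stats.kendalltau(years, rain)            # trend test (Kendall's tau)
slope, intercept, lo, hi = stats.theilslopes(rain, years)
print(f"tau = {tau:.2f} (p = {p_mk:.3f}), Theil-Sen slope = {slope:.2f} mm/yr")

def pettitt_change_point(x):
    """Index maximising |U_t|, with U_t = sum of sign(x_j - x_i) over i<=t<j."""
    n = len(x)
    u = [np.sign(x[t + 1:, None] - x[:t + 1]).sum() for t in range(n - 1)]
    return int(np.argmax(np.abs(u))) + 1

print("most probable change-point year:", years[pettitt_change_point(rain)])
```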

Keywords: change point analysis, coastal region India, gridded rainfall data, non-parametric

Procedia PDF Downloads 268
37242 Development and Evaluation of Virtual Basketball Game Using Motion Capture Technology

Authors: Shunsuke Aoki, Taku Ri, Tatsuya Yamazaki

Abstract:

These days, along with the development of e-sports, video games as a competitive sport are attracting attention. In many cases, however, the action on the screen does not match the real motion of the player. Incorporating player motion is needed to increase the realism and excitement of sports games. Therefore, in this study, the authors propose a method to recognize player motion by using motion capture technology and develop a virtual basketball game. The virtual basketball game consists of a screen with nine targets, players, depth sensors, and no ball. The players mimic a two-handed basketball shot without a ball, aiming at one of the nine targets on the screen. Time-series data of the three-dimensional coordinates of the player's joints are captured by the depth sensor; 20 joints are measured for each player to estimate the shooting motion in real time. The trajectory of the thrown virtual ball is calculated based on the time-series data, and hitting the target is judged as success or failure. The virtual basketball game can be played by 2 to 4 players as a competitive game among the players. The developed game was exhibited to the public for evaluation on the authors' university open campus days. 339 visitors participated in the exhibition and enjoyed the virtual basketball game over the two days. A questionnaire survey on the developed game was conducted with the visitors who experienced the game. As a result of the survey, about 97.3% of the players found the game interesting, regardless of whether they had experienced actual basketball before or not. In addition, the shooting motion was found to be comfortable to perform, particularly for women. The virtual game with motion capture technology has the potential to become a form of universal entertainment bridging e-sports and actual sports.

Keywords: basketball, motion capture, questionnaire survey, video game

Procedia PDF Downloads 100
37241 Modelling Structural Breaks in Stock Price Time Series Using Stochastic Differential Equations

Authors: Daniil Karzanov

Abstract:

This paper studies the effect of quarterly earnings reports on stock prices. The profitability of the stock is modeled by geometric Brownian diffusion and the Constant Elasticity of Variance model. We fit several variations of stochastic differential equations to the pre- and post-report periods using maximum likelihood estimation and a grid search of parameters. By examining the change in the model parameters after the reports' publication, the study finds sufficient evidence that the reports constitute a structural breakpoint, meaning that the fitted forecast models are not applicable across the break and should be refitted shortly after publication.
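
A short sketch of the structural-break check described above: fit geometric Brownian motion by maximum likelihood to the pre- and post-report windows and compare the drift and volatility estimates; the price file and report date are hypothetical.

```python
import numpy as np
import pandas as pd

def gbm_mle(prices, dt=1 / 252):
    """MLE for GBM: annualised drift mu and volatility sigma from log returns,
    using log-return mean (mu - sigma**2/2)*dt and variance sigma**2*dt."""
    r = np.diff(np.log(prices))
    sigma = r.std(ddof=0) / np.sqrt(dt)
    mu = r.mean() / dt + 0.5 * sigma**2
    return mu, sigma

px = pd.read_csv("stock.csv", index_col="date", parse_dates=True)["close"]
report_date = "2021-04-28"                      # hypothetical earnings date
pre, post = px[:report_date], px[report_date:]

print("pre-report  mu, sigma:", gbm_mle(pre.to_numpy()))
print("post-report mu, sigma:", gbm_mle(post.to_numpy()))
# A large shift in the fitted parameters supports treating the report as a
# structural breakpoint and refitting forecasting models after publication.
```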

Keywords: stock market, earnings reports, financial time series, structural breaks, stochastic differential equations

Procedia PDF Downloads 168
37240 Environmental Impact Assessment in Mining Regions with Remote Sensing

Authors: Carla Palencia-Aguilar

Abstract:

Calculations of the net carbon balance can be obtained by means of Net Biome Productivity (NBP), Net Ecosystem Productivity (NEP), and Net Primary Production (NPP). The latter is an important component of the biosphere carbon cycle and is easily obtained from MODIS MOD17A3HGF data; however, the results are only available yearly. To overcome this limitation in data availability, bands 33 to 36 from MODIS MYD021KM (obtained on a daily basis) were analyzed and compared with NPP data from the years 2000 to 2021 at 7 sites where surface mining takes place in the Colombian territory. Coal, gold, iron, and limestone were the minerals of interest. Scales and units, as well as thermal anomalies, were considered for the net carbon balance per location. The NPP time series from the satellite images were filtered using two Matlab filters: first order and discrete transfer. After filtering the NPP time series, comparing the graph results with the satellite image values, and running a linear regression, the results showed R² from 0.72 to 0.85. To establish comparable units between NPP and bands 33 to 36, the Greenhouse Gas Equivalencies Calculator by the EPA was used. The comparison was established in two ways: one by the sum of all the data per point per year, and the other by the average of 46 weeks, finding the percentage that the value represented with respect to NPP. The former underestimated the total CO2 emissions. The results also showed that coal and gold mining in the last 22 years had lower CO2 emissions than limestone, with an average per year of 143 kton CO2 eq for gold, 152 kton CO2 eq for coal, and 287 kton CO2 eq for iron. Limestone emissions varied from 206 to 441 kton CO2 eq. The maximum emission values from unfiltered data correspond to 165 kton CO2 eq for gold, 188 kton CO2 eq for coal, 310 kton CO2 eq for iron, and, for limestone, values varying from 231 to 490 kton CO2 eq. If the most polluting limestone site improves its production technology, limestone could reach a maximum of 318 kton CO2 eq emissions per year, a value very similar to that of iron. The importance of gathering such data is to establish benchmarks in order to attain the 2050 zero emissions goal.

Keywords: carbon dioxide, NPP, MODIS, mining

Procedia PDF Downloads 69