Search results for: methodical series
2641 Co-Integration Model for Predicting Inflation Movement in Nigeria
Authors: Salako Rotimi, Oshungade Stephen, Ojewoye Opeyemi
Abstract:
The maintenance of price stability is one of the macroeconomic challenges facing Nigeria as a nation. This paper attempts to build a co-integration multivariate time series model for inflation movement in Nigeria using data extracted from the abstract of statistics of the Central Bank of Nigeria (CBN) from 2008 to 2017. The Johansen cointegration test suggests at least one co-integration vector describing the long run relationship between the Consumer Price Index (CPI), Food Price Index (FPI), and Non-Food Price Index (NFPI). All three series show an increasing pattern, which indicates non-stationarity in each of the series. Furthermore, model predictability was established with the root-mean-square error, mean absolute error, mean absolute percentage error, and Theil's U statistic for n-step forecasting. The results show that the consumer price index (CPI) has a positive long-run relationship with the food price index (FPI) and the non-food price index (NFPI).
Keywords: economic, inflation, model, series
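As a minimal illustration of the Johansen testing step described in this abstract, the sketch below runs the test on a three-column price-index table with statsmodels; the file name, column names, and lag order are assumptions for illustration, not the authors' setup.

```python
# Minimal sketch: Johansen cointegration test on CPI, FPI and NFPI series
# (illustrative only; file name, column names and lag order are assumptions).
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen

# hypothetical monthly data frame with the three price indices
df = pd.read_csv("cbn_price_indices.csv", parse_dates=["date"], index_col="date")
series = df[["cpi", "fpi", "nfpi"]]

# det_order=0: constant term; k_ar_diff=1: one lagged difference
result = coint_johansen(series, det_order=0, k_ar_diff=1)

for r, (trace, crit) in enumerate(zip(result.lr1, result.cvt)):
    print(f"rank <= {r}: trace stat = {trace:.2f}, 5% critical value = {crit[1]:.2f}")
```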
Procedia PDF Downloads 244
2640 Design and Implementation of Partial Denoising Boundary Image Matching Using Indexing Techniques
Authors: Bum-Soo Kim, Jin-Uk Kim
Abstract:
In this paper, we design and implement a partial denoising boundary image matching system using indexing techniques. Converting boundary images to time-series makes it feasible to perform fast search using indexes, even on a very large image database. Using this conversion method, we develop a client-server system based on previous partial denoising research in a GUI (graphical user interface) environment. The client first converts a query image given by a user to a time-series and sends the denoising parameters and the tolerance along with this time-series to the server. The server identifies similar images from the index by evaluating a range query, which is constructed using the inputs given by the client, and sends the resulting images to the client. Experimental results show that our system provides highly intuitive and accurate matching results.
Keywords: boundary image matching, indexing, partial denoising, time-series matching
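For readers unfamiliar with the boundary-to-time-series step, the sketch below shows one common conversion (the centroid-distance function) in plain Python; the authors' exact conversion and index structure are not specified here, so this is illustrative only.

```python
# Sketch of one common boundary-to-time-series conversion (centroid distance),
# given a boundary as an ordered list of (x, y) points; the authors' exact
# conversion and index structure are not reproduced here.
import numpy as np

def boundary_to_time_series(boundary_xy, n_samples=360):
    pts = np.asarray(boundary_xy, dtype=float)
    centroid = pts.mean(axis=0)
    dist = np.linalg.norm(pts - centroid, axis=1)          # centroid distance per point
    # resample to a fixed length so all images map to comparable time-series
    idx = np.linspace(0, len(dist) - 1, n_samples)
    ts = np.interp(idx, np.arange(len(dist)), dist)
    return ts / ts.max()                                    # scale-invariant normalization
```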
Procedia PDF Downloads 137
2639 Determination of Surface Deformations with Global Navigation Satellite System Time Series
Authors: Ibrahim Tiryakioglu, Mehmet Ali Ugur, Caglar Ozkaymak
Abstract:
The development of GNSS technology has led to increasingly widespread and successful applications of GNSS surveys for monitoring crustal movements. However, multi-period GPS survey solutions have not been applied in monitoring vertical surface deformation. This study uses the long-term GNSS time series that are required to determine vertical deformations. In recent years, surface deformations that are parallel and semi-parallel to the Bolvadin fault have occurred in Western Anatolia. These surface deformations have continued to occur in the Bolvadin settlement area, which is located mostly on alluvial ground. Due to these surface deformations, a number of cracks in buildings located in the residential areas and breaks in underground water and sewage systems have been observed. In order to determine the amount of vertical surface deformation, two continuous GNSS stations have been established in the region. The stations have been operating since 2015 and 2017, respectively. In this study, GNSS observations from these two GNSS stations were processed with the GAMIT/GLOBK (GNSS Analysis Massachusetts Institute of Technology/GLOBal Kalman) program package to create coordinate time series. With the time series analyses, the GNSS stations' behavior models (linear, periodic, etc.), the causes of these behaviors, and mathematical models were determined. The results of the time series analysis of these two GNSS stations show approximately 50-80 mm/yr of vertical movement.
Keywords: Bolvadin fault, GAMIT, GNSS time series, surface deformations
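A simplified stand-in for the station-behavior modeling described above is a least-squares fit of a linear trend plus annual and semi-annual terms to a coordinate time series; the sketch below assumes a daily height series in millimetres and is not the GAMIT/GLOBK processing itself.

```python
# Hedged sketch: fitting a linear trend plus annual/semi-annual terms to a daily
# GNSS height time series with ordinary least squares (a simplified stand-in for
# the GAMIT/GLOBK-based analysis described above).
import numpy as np

def fit_gnss_series(t_years, height_mm):
    w = 2 * np.pi  # one cycle per year
    A = np.column_stack([
        np.ones_like(t_years), t_years,                    # offset + linear velocity
        np.cos(w * t_years), np.sin(w * t_years),          # annual signal
        np.cos(2 * w * t_years), np.sin(2 * w * t_years),  # semi-annual signal
    ])
    coef, *_ = np.linalg.lstsq(A, height_mm, rcond=None)
    return coef  # coef[1] is the vertical rate in mm/yr
```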
Procedia PDF Downloads 165
2638 Greyscale: A Tree-Based Taxonomy for Grey Literature Published by Fisheries Agencies
Authors: Tatiana Tunon, Gottfried Pestal
Abstract:
Government agencies responsible for the management of fisheries resources publish many types of grey literature, and these materials are increasingly accessible to the public on agency websites. However, scope and quality vary considerably, and end-users need meta-data about the report series when deciding whether to use the information (e.g. apply the methods, include the results in a systematic review), or when prioritizing materials for archiving (e.g. library holdings, reference databases). A proposed taxonomy for these report series was developed based on a review of 41 report series from 6 government agencies in 4 countries (Canada, New Zealand, Scotland, and United States). Each report series was categorized according to multiple criteria describing peer-review process, content, and purpose. A robust classification tree was then fitted to these descriptions, and the resulting taxonomic groups were used to compare agency output from 4 countries using reports available in their online repositories.
Keywords: classification tree, fisheries, government, grey literature
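A hedged sketch of the classification-tree step is given below using scikit-learn; the descriptor file, feature names, and labels are hypothetical placeholders rather than the authors' coding scheme.

```python
# Illustrative sketch only: fitting a classification tree to report-series
# descriptors (peer review, content, purpose); the file, feature names and
# labels here are hypothetical, not the authors' coding scheme.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

reports = pd.read_csv("report_series_descriptors.csv")       # hypothetical file
X = pd.get_dummies(reports[["peer_review", "content_type", "purpose"]])
y = reports["taxonomy_group"]

tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=2, random_state=0)
tree.fit(X, y)
print(export_text(tree, feature_names=list(X.columns)))       # inspect the fitted rules
```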
Procedia PDF Downloads 282
2637 An Approach for Pattern Recognition and Prediction of Information Diffusion Model on Twitter
Authors: Amartya Hatua, Trung Nguyen, Andrew Sung
Abstract:
In this paper, we study the information diffusion process on Twitter as a multivariate time series problem. Our model concerns three measures (volume, network influence, and sentiment of tweets) based on 10 features, and we collected 27 million tweets to build our information diffusion time series dataset for analysis. Then, different time series clustering techniques with Dynamic Time Warping (DTW) distance were used to identify different patterns of information diffusion. Finally, we built information diffusion prediction models for new hashtags, which comprise two phases: the first phase is recognizing the pattern using k-NN with DTW distance; the second phase is building the forecasting model using the traditional Autoregressive Integrated Moving Average (ARIMA) model and the non-linear recurrent neural network of Long Short-Term Memory (LSTM). Preliminary results of the performance evaluation between different forecasting models show that LSTM with clustering information notably outperforms other models. Therefore, our approach can be applied in real-world applications to analyze and predict the information diffusion characteristics of selected topics or memes (hashtags) on Twitter.
Keywords: ARIMA, DTW, information diffusion, LSTM, RNN, time series clustering, time series forecasting, Twitter
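The pattern-recognition phase can be illustrated with a plain DTW distance and a 1-nearest-neighbour lookup against cluster prototypes, as in the sketch below; the prototype series and the new hashtag series are placeholders, and the authors' feature construction is not reproduced.

```python
# Minimal sketch of the pattern-recognition phase: a plain DTW distance and a
# 1-nearest-neighbour lookup against cluster prototypes (the prototype series
# and the new hashtag series are placeholders).
import numpy as np

def dtw_distance(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def nearest_pattern(new_series, prototypes):
    # prototypes: dict mapping cluster label -> representative time series
    return min(prototypes, key=lambda k: dtw_distance(new_series, prototypes[k]))
```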
Procedia PDF Downloads 391
2636 Time Series Analysis of Radon Concentration at Different Depths in an Underground Goldmine
Authors: Theophilus Adjirackor, Frederic Sam, Irene Opoku-Ntim, David Okoh Kpeglo, Prince K. Gyekye, Frank K. Quashie, Kofi Ofori
Abstract:
Indoor radon concentrations were collected monthly over a period of one year at 10 different levels in an underground goldmine, and the data were analyzed using a four-point moving average time series to determine the relationship between the depths of the underground mine and the indoor radon concentration. The detectors were installed in batches within four quarters. The measurements were carried out using LR115 solid-state nuclear track detectors. Statistical models are applied in the prediction and analysis of the radon concentration at various depths. The time series model predicted a positive relationship between the depth of the underground mine and the indoor radon concentration. Thus, elevated radon concentrations are expected at deeper levels of the underground mine, but the relationship was insignificant at the 5% level of significance with a negative adjusted R² (R² = –0.021), due to appropriate engineering controls and an adequate ventilation rate in the underground mine.
Keywords: LR115, radon concentration, time series, underground goldmine
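The two calculations named in the abstract, a four-point moving average and a regression of concentration against depth, can be sketched as follows; the depths and concentrations are placeholder numbers, not the measured values.

```python
# Hedged sketch: a four-point moving average of monthly radon readings and a
# simple linear regression of mean concentration against mine depth; the
# depths and concentrations below are placeholders, not the measured values.
import numpy as np

def moving_average(x, window=4):
    return np.convolve(x, np.ones(window) / window, mode="valid")

depth_m = np.array([100, 200, 300, 400, 500, 600, 700, 800, 900, 1000])
mean_radon = np.array([55, 60, 58, 72, 70, 80, 78, 85, 90, 95])  # Bq/m^3, illustrative

slope, intercept = np.polyfit(depth_m, mean_radon, deg=1)
r2 = np.corrcoef(depth_m, mean_radon)[0, 1] ** 2
print(f"radon ≈ {intercept:.1f} + {slope:.3f} * depth,  R² = {r2:.3f}")
```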
Procedia PDF Downloads 45
2635 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models
Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah
Abstract:
In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices from developed and emerging stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via Maximum Likelihood Estimation. Using the technique, we observe the long-memory behavior of the data sets and the one-step-ahead predicted log-volatility with ±2 standard errors, despite the observed noise varying from a Normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, which makes our estimation algorithm suitable for large data sets, since the algorithm has good convergence properties.
Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model
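A minimal sketch of maximum likelihood estimation for an Ornstein-Uhlenbeck process, using its exact AR(1) transition density, is shown below; it assumes an evenly sampled series and is not the authors' estimation algorithm.

```python
# Sketch (not the authors' code): maximum likelihood estimation of an
# Ornstein-Uhlenbeck process dX = theta*(mu - X)dt + sigma*dW, using its exact
# AR(1) transition density on an evenly sampled log-volatility proxy series.
import numpy as np
from scipy.optimize import minimize

def ou_neg_loglik(params, x, dt):
    theta, mu, sigma = params
    a = np.exp(-theta * dt)
    mean = mu + (x[:-1] - mu) * a
    var = sigma**2 * (1 - a**2) / (2 * theta)
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

def fit_ou(x, dt=1.0):
    x = np.asarray(x, dtype=float)
    start = np.array([0.5, x.mean(), x.std()])
    res = minimize(ou_neg_loglik, start, args=(x, dt),
                   bounds=[(1e-6, None), (None, None), (1e-6, None)])
    return res.x  # (theta, mu, sigma)
```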
Procedia PDF Downloads 241
2634 Performance Evaluation of the Classic seq2seq Model versus a Proposed Semi-supervised Long Short-Term Memory Autoencoder for Time Series Data Forecasting
Authors: Aswathi Thrivikraman, S. Advaith
Abstract:
The study is aimed at designing encoders for deciphering intricacies in time series data by redescribing the dynamics operating on a lower-dimensional manifold. A semi-supervised LSTM autoencoder is devised and investigated to see if the latent representation of the time series data can better forecast the data. End-to-end training of the LSTM autoencoder, together with another LSTM network that is connected to the latent space, forces the hidden states of the encoder to represent the most meaningful latent variables relevant for forecasting. Furthermore, the study compares the predictions with those of a traditional seq2seq model.
Keywords: LSTM, autoencoder, forecasting, seq2seq model
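A hedged Keras sketch of a semi-supervised LSTM autoencoder with a forecasting head attached to the latent space follows; the layer sizes, window length, and losses are illustrative choices, not the configuration reported in the study.

```python
# Hedged sketch of a semi-supervised LSTM autoencoder with a forecasting head
# attached to the latent space (layer sizes, window length and losses are
# illustrative choices, not the configuration reported by the authors).
from tensorflow.keras import layers, Model

def build_lstm_autoencoder(window=30, horizon=5, n_features=1, latent_dim=16):
    inp = layers.Input(shape=(window, n_features))
    latent = layers.LSTM(latent_dim, name="encoder")(inp)              # latent representation

    # reconstruction branch
    dec = layers.RepeatVector(window)(latent)
    dec = layers.LSTM(latent_dim, return_sequences=True)(dec)
    recon = layers.TimeDistributed(layers.Dense(n_features), name="recon")(dec)

    # forecasting branch trained jointly, forcing a forecast-relevant latent space
    fc = layers.RepeatVector(horizon)(latent)
    fc = layers.LSTM(latent_dim, return_sequences=True)(fc)
    forecast = layers.TimeDistributed(layers.Dense(n_features), name="forecast")(fc)

    model = Model(inp, [recon, forecast])
    model.compile(optimizer="adam", loss={"recon": "mse", "forecast": "mse"})
    return model
```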
Procedia PDF Downloads 155
2633 Forecasting the Volatility of Geophysical Time Series with Stochastic Volatility Models
Authors: Maria C. Mariani, Md Al Masum Bhuiyan, Osei K. Tweneboah, Hector G. Huizar
Abstract:
This work is devoted to the study of modeling geophysical time series. A stochastic technique with time-varying parameters is used to forecast the volatility of data arising in geophysics. In this study, the volatility is defined as a logarithmic first-order autoregressive process. We observe that the inclusion of log-volatility into the time-varying parameter estimation significantly improves forecasting, which is facilitated via maximum likelihood estimation. This allows us to conclude that the estimation algorithm for the corresponding one-step-ahead predicted volatility (with ±2 standard prediction errors) is very feasible since it possesses good convergence properties.
Keywords: Augmented Dickey-Fuller test, geophysical time series, maximum likelihood estimation, stochastic volatility model
Procedia PDF Downloads 315
2632 Rescaled Range Analysis of Seismic Time-Series: Example of the Recent Seismic Crisis of Alhoceima
Authors: Marina Benito-Parejo, Raul Perez-Lopez, Miguel Herraiz, Carolina Guardiola-Albert, Cesar Martinez
Abstract:
Persistency, long-term memory, and randomness are intrinsic properties of time series of earthquakes. Rescaled Range Analysis (RS-Analysis) was introduced by Hurst in 1956 and modified by Mandelbrot and Wallis in 1964. This method is a simple and elegant analysis which determines the range of variation of one natural property (the seismic energy released, in this case) in a time interval. Despite its simplicity, there is complexity inherent in the property measured. The cumulative curve of the energy released in time has the well-known fractal geometry of a devil's staircase. This geometry is used for determining the maximum and minimum values of the range, which is normalized by the standard deviation. The rescaled range obtained obeys a power law in time, and the exponent is the Hurst value. Depending on this value, time series can be classified as having long-term or short-term memory. Hence, an algorithm has been developed for compiling the RS-Analysis for daily time series of earthquakes. Completeness of the time distribution and local stationarity of the time series are required. The interest of this analysis is its application to a complex seismic crisis where different earthquakes take place in clusters in a short period. Therefore, the Hurst exponent has been obtained for the seismic crisis of Alhoceima (Mediterranean Sea) of January-March 2016, where at least five medium-sized earthquakes were triggered. According to the values obtained from the Hurst exponent for each cluster, a different mechanical origin can be detected, corroborated by the focal mechanisms calculated by the official institutions. Therefore, this type of analysis not only allows an approach to a greater understanding of a seismic series but also makes it possible to discern different types of seismic origins.
Keywords: Alhoceima crisis, earthquake time series, Hurst exponent, rescaled range analysis
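A minimal rescaled-range sketch, estimating the Hurst exponent by regressing log(R/S) on log(window size), is shown below; the window sizes are an illustrative choice, and catalog completeness is assumed to have been handled beforehand.

```python
# Minimal rescaled range (R/S) sketch: estimate the Hurst exponent of a daily
# released-energy series by regressing log(R/S) on log(window size); the
# window sizes are an illustrative choice.
import numpy as np

def rescaled_range(x):
    y = np.cumsum(x - x.mean())          # cumulative deviations ("devil's staircase")
    r = y.max() - y.min()                # range of the cumulative curve
    s = x.std()
    return r / s if s > 0 else np.nan

def hurst_exponent(series, window_sizes=(8, 16, 32, 64, 128)):
    series = np.asarray(series, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        chunks = [series[i:i + n] for i in range(0, len(series) - n + 1, n)]
        rs = np.nanmean([rescaled_range(c) for c in chunks])
        log_n.append(np.log(n))
        log_rs.append(np.log(rs))
    hurst, _ = np.polyfit(log_n, log_rs, 1)   # slope of the power law
    return hurst
```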
Procedia PDF Downloads 321
2631 Time Series Analysis on the Production of Fruit Juice: A Case Study of National Horticultural Research Institute (Nihort) Ibadan, Oyo State
Authors: Abiodun Ayodele Sanyaolu
Abstract:
The research was carried out to investigate the time series analysis of the quarterly production of fruit juice at the National Horticultural Research Institute, Ibadan, from 2010 to 2018. The documentary method of data collection was used, and the methods of least squares and moving averages were used in the analysis. From the calculations and the graph, it was clear that there were increasing, decreasing, and uniform movements in both the graph of the original data and the tabulated quarterly values of the original data. Time series analysis was used to detect the trend in fruit juice production, which appears to be good over a period of time, and the methods used to forecast are additive and multiplicative models. Since it was observed that the production of fruit juice is usually high in January of every year, it is strongly advised that the National Horticultural Research Institute make more provision for fruit juice storage outside this period of the year.
Keywords: fruit juice, least squares, multiplicative models, time series
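The two procedures named above, a least-squares trend line and a four-quarter moving average, are sketched below on a placeholder quarterly series (the figures are not NIHORT data).

```python
# Hedged sketch of the two procedures named above on a quarterly series:
# a least-squares trend line and a centred four-quarter moving average
# (the production figures here are placeholders, not NIHORT data).
import numpy as np

production = np.array([120, 95, 80, 110, 135, 100, 85, 118,
                       150, 110, 92, 125, 160, 118, 98, 130], dtype=float)
t = np.arange(1, len(production) + 1)

slope, intercept = np.polyfit(t, production, deg=1)          # least-squares trend
trend = intercept + slope * t

ma4 = np.convolve(production, np.ones(4) / 4, mode="valid")  # 4-quarter moving average
centred_ma = (ma4[:-1] + ma4[1:]) / 2                        # centre between quarters

print(f"trend: y = {intercept:.1f} + {slope:.2f} t")
```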
Procedia PDF Downloads 142
2630 Enhancing Learners' Metacognitive, Cultural and Linguistic Proficiency through Egyptian Series
Authors: Hanan Eltayeb, Reem Al Refaie
Abstract:
To be able to connect and relate to shows spoken in a foreign language, advanced learners must understand not only linguistic inferences but also the cultural, metacognitive, and pragmatic connotations in colloquial Egyptian TV series. These connotations are needed to understand the different facets of the dramas put before them, and they are also consistently developed and reinforced through watching these shows. The inferences have become a staple of Egyptian colloquial culture over the years, making their way into day-to-day conversations as Egyptians use them to speak, relate, joke, and connect with each other, even without having known one another previously. As for advanced learners, they need to understand these inferences not only to watch these shows, but also to be able to converse with Egyptians on a level that surpasses the formal or standard. When faced with some of the recent shows on Egyptian screens, learners faced challenges in understanding the pragmatic, cultural, and religious background of the target language and were consequently not able to interact effectively with a native speaker in real-life situations. This study aims to enhance the linguistic and cultural proficiency of learners through studying two genres of colloquial Egyptian TV series. The study samples are derived from two recent comedy and social Egyptian series ('The Seventh Neighbor' سابع جار, and 'Nelly and Sherihan' نيللي و شريهان). When learners watch such series, they are usually faced with the problem of understanding inferences that have to do with the social, religious, and political events addressed in the series. Using discourse analysis of the semiotic, semantic, pragmatic, cultural, and linguistic characteristics of the target language, some major deductions were highlighted and repeated, showing a pattern in both. The research paper concludes that there are many sets of lingual and para-lingual phrases, idioms, and proverbs to be acquired and used effectively by teaching these series. The strategies adopted in the study can be applied to different types of media, like movies, TV shows, and even cartoons, to enhance student proficiency.
Keywords: Egyptian series, culture, linguistic competence, pragmatics, semantics, social
Procedia PDF Downloads 143
2629 Representation of Agamben's Concept of 'Homo Sacer': Interpretative Analysis in Turkish TV Series Based on Turkey's 1980 Military Coup
Authors: Oyku Yenen
Abstract:
The notion of biopolitics, as studied by such intellectuals as Foucault, Agamben, and Negri, is an important guide for comprehending the current understanding of politics. While Foucault evaluates biopolitics as a survival policy, Giorgio Agamben, the Italian philosopher, identifies the theory with death. Agamben claims that we can all be considered homo sacer: those who are abandoned by the law, left in the field of exception, and whose killing does not require punishment. Agamben defines the person who is tried by the public for committing a crime but is not allowed to be sacrificed, and whose killing is not considered a crime, as 'homo sacer'. This study analyzes how the concept of 'homo sacer' is made visible in TV series such as Çemberimde Gül Oya (Cagan Irmak, 2005-2005), Hatırla Sevgili (Ummu Burhan, 2006-2008), and Bu Kalp Seni Unutur Mu? (Aydin Bulut, 2009-2010), all of which portray the period of Turkey's 1980 military coup, within the framework of Agamben's thoughts and notions about biopolitics. When the main plots of these TV series, which constitute the universe of this study, are scrutinized closely, they lay out the understanding of politics that has existed throughout history and prevails today. Although there is a large number of TV series on the coup of 1980, these three series are the only main productions that specifically focused on the event itself. Our final analysis will reveal that the concepts of homo sacer, bare life, exception, and camp have been embodied in different ways in these three series. In these three series, which all deal with similar subjects from differing perspectives, the dominant understanding of politics is clearly conveyed to the audience. In all three series, the reigning power always decides on the exceptions, those who will live, those who will die, and those who will be ignored by the law. Such characters as Mehmet, Sinan, Yıldız, Deniz, and Defne, all of whom we come across in these series, are put on trial as criminals of thought and are subjected to various forms of torture while isolated in an area where they are virtually deprived of law. Their citizenship rights are revoked. All of them are left alone with their bare lives (zoe).
Keywords: bare life, biopolitics, homo sacer, sovereign power, state of exception
Procedia PDF Downloads 131
2628 Elucidation of the Sequential Transcriptional Activity in Escherichia coli Using Time-Series RNA-Seq Data
Authors: Pui Shan Wong, Kosuke Tashiro, Satoru Kuhara, Sachiyo Aburatani
Abstract:
Functional genomics and gene regulation inference have readily expanded our knowledge and understanding of gene interactions with regard to expression regulation. With the advancement of transcriptome sequencing in time-series comes the ability to study the sequential changes of the transcriptome. The method presented here works to augment existing regulation networks accumulated in the literature with transcriptome data gathered from time-series experiments to construct a sequential representation of transcription factor activity. This method is applied to a time-series RNA-Seq data set from Escherichia coli as it transitions from growth to stationary phase over five hours. Investigations are conducted on the various metabolic activities in gene regulation processes by taking advantage of the correlation between regulatory gene pairs to examine their activity on a dynamic network. In particular, the changes in metabolic activity during the phase transition are analyzed with a focus on the pagP gene as well as other associated transcription factors. The visualization of the sequential transcriptional activity is used to describe the change in metabolic pathway activity originating from the pagP transcription factor, phoP. The results show a shift from amino acid and nucleic acid metabolism to energy metabolism during the transition to stationary phase in E. coli.
Keywords: Escherichia coli, gene regulation, network, time-series
Procedia PDF Downloads 372
2627 Classification of Generative Adversarial Network Generated Multivariate Time Series Data Featuring Transformer-Based Deep Learning Architecture
Authors: Thrivikraman Aswathi, S. Advaith
Abstract:
As there can be cases where the use of real data is limited, such as when it is hard to get access to a large volume of real data, we need to resort to synthetic data generation. This produces high-quality synthetic data while maintaining the statistical properties of a specific dataset. In the present work, a generative adversarial network (GAN) is trained to produce multivariate time series (MTS) data, since MTS data is now being gathered more often in various real-world systems. Furthermore, the GAN-generated MTS data is fed into a transformer-based deep learning architecture that carries out the data categorization into predefined classes. Further, the model is evaluated across various distinct domains by generating corresponding MTS data.
Keywords: GAN, transformer, classification, multivariate time series
Procedia PDF Downloads 130
2626 Series-Parallel Systems Reliability Optimization Using Genetic Algorithm and Statistical Analysis
Authors: Essa Abrahim Abdulgader Saleem, Thien-My Dao
Abstract:
The main objective of this paper is to optimize series-parallel system reliability using a Genetic Algorithm (GA) and statistical analysis, considering system reliability constraints which involve the redundancy levels of selected components, total cost, and total weight. To perform this work, firstly, the mathematical model which maximizes system reliability subject to maximum system cost and maximum system weight constraints is presented; secondly, a statistical analysis is used to optimize the GA parameters; and thirdly, the GA is used to optimize series-parallel system reliability. The objective is to determine the strategy for choosing the redundancy level for each subsystem that maximizes the overall system reliability subject to total cost and total weight constraints. Finally, the results of the series-parallel system case study reliability optimization are shown, and comparisons with previous results are presented to demonstrate the performance of our GA.
Keywords: reliability, optimization, meta-heuristic, genetic algorithm, redundancy
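A generic penalty-based GA sketch for choosing subsystem redundancy levels in a series-parallel system is given below; the component reliabilities, costs, weights, limits, and GA operators are placeholder choices, not the authors' case study.

```python
# Illustrative GA sketch for choosing subsystem redundancy levels in a
# series-parallel system; reliabilities, costs, weights, limits and the GA
# operators are generic placeholders, not the authors' case study.
import numpy as np

r = np.array([0.90, 0.85, 0.80])      # component reliability per subsystem
c = np.array([5.0, 4.0, 6.0])         # component cost
w = np.array([3.0, 2.5, 4.0])         # component weight
C_MAX, W_MAX, MAX_RED = 60.0, 40.0, 5

def fitness(n):
    rel = np.prod(1 - (1 - r) ** n)                        # series of parallel groups
    penalty = max(0, n @ c - C_MAX) + max(0, n @ w - W_MAX)
    return rel - 0.01 * penalty

rng = np.random.default_rng(0)
pop = rng.integers(1, MAX_RED + 1, size=(40, 3))
for _ in range(200):
    fit = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(fit)][-20:]                   # truncation selection
    children = parents[rng.integers(0, 20, size=(20, 3)), np.arange(3)]  # uniform crossover
    mutate = rng.random(children.shape) < 0.1
    children[mutate] = rng.integers(1, MAX_RED + 1, size=mutate.sum())
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("best redundancy levels:", best)
```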
Procedia PDF Downloads 337
2625 An IM-COH Algorithm Neural Network Optimization with Cuckoo Search Algorithm for Time Series Samples
Authors: Wullapa Wongsinlatam
Abstract:
The back propagation algorithm (BP) is a widely used technique in artificial neural networks and has been used as a tool for solving time series problems, with goals such as decreasing training time, reducing the tendency to fall into local minima, and optimizing the sensitivity of the initial weights and bias. This paper proposes an improvement of the BP technique which is called the improved control output hidden layer algorithm (IM-COH). By combining the IM-COH algorithm with the cuckoo search algorithm (CS), the result is the cuckoo search improved control output hidden layer algorithm (CS-IM-COH). This new algorithm has a better ability to optimize the sensitivity of the initial weights and bias than the original BP algorithm. In this research, the CS-IM-COH algorithm is compared with the original BP, the IM-COH, and the original BP with CS (CS-BP). Furthermore, the selected benchmarks, four time series samples, are shown in this research for illustration. The research shows that the CS-IM-COH algorithm gives the best forecasting results compared with the selected samples.
Keywords: artificial neural networks, back propagation algorithm, time series, local minima problem, metaheuristic optimization
Procedia PDF Downloads 152
2624 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. One approach to complex systems, where the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series model (QTS). Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches have emerged. The quantum probabilistic technique is used to motivate the construction of our QTS model. The QTS model resembles the quantum dynamic model which was applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo have been used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between the quantum statistical machine and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour. This model may reveal further insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
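One of the estimation tools mentioned, the Kalman filter, can be sketched generically for a latent AR(1) state observed with noise; the parameter values below are placeholders, and the sketch does not implement the quantum formalism itself.

```python
# Generic sketch of one of the estimation tools mentioned above: a Kalman
# filter for a latent AR(1) state observed with noise,
# x_t = phi*x_{t-1} + w_t,  y_t = x_t + v_t  (parameter values are placeholders).
import numpy as np

def kalman_ar1(y, phi=0.8, q=0.1, r=0.5):
    n = len(y)
    x_hat = np.zeros(n)        # filtered state estimates
    p = 1.0                    # state estimate variance
    x = 0.0
    for t in range(n):
        # predict
        x = phi * x
        p = phi**2 * p + q
        # update with observation y[t]
        k = p / (p + r)                  # Kalman gain
        x = x + k * (y[t] - x)
        p = (1 - k) * p
        x_hat[t] = x
    return x_hat
```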
Procedia PDF Downloads 469
2623 Synthesis and Antiproliferative Activity of 5-Phenyl-N3-(4-fluorophenyl)-4H-1,2,4-triazole-3,4-diamine Derivatives
Authors: L. Mallesha, P. Mallu, B. Veeresh
Abstract:
In the present study, 2,6-difluorobenzohydrazide and 4-fluorophenyl isothiocyanate were used as the starting materials to synthesize 5-phenyl-N3-(4-fluorophenyl)-4H-1,2,4-triazole-3,4-diamine. Further, the compound 5-phenyl-N3-(4-fluorophenyl)-4H-1,2,4-triazole-3,4-diamine was reacted with fluoro-substituted benzaldehydes to yield a series of Schiff bases. All the final compounds were characterized using IR, 1H NMR, 13C NMR, MS, and elemental analyses. The new compounds were evaluated for their antiproliferative effect using the MTT assay method against four human cancer cell lines (K562, COLO-205, MDA-MB231, and IMR-32) over a time period of 24 h. Among the series, a few compounds showed good activity against all cell lines, whereas the other compounds in the series exhibited moderate activity.
Keywords: Schiff bases, MTT assay, antiproliferative activity, human cancer cell lines, 1,2,4-triazoles
Procedia PDF Downloads 372
2622 Time Series Analysis the Case of China and USA Trade Examining during Covid-19 Trade Enormity of Abnormal Pricing with the Exchange rate
Authors: Md. Mahadi Hasan Sany, Mumenunnessa Keya, Sharun Khushbu, Sheikh Abujar
Abstract:
Since the beginning of China's economic reform, trade between the U.S. and China has grown rapidly, and it has increased further since China's accession to the World Trade Organization in 2001. The U.S. imports more from China than it exports to it; the trade war between China and the U.S. reduced the trade deficit in 2019, but in 2020 the opposite happened. In international and U.S. trade, Washington launched a full-scale trade war against China in March 2016, which was followed by a catastrophic epidemic. The main goal of our study is to measure and predict trade relations between China and the U.S. before and after the arrival of the COVID epidemic. A standard ML model uses different data as input but has no time dimension, which is present in time series models, and is only able to predict the future from previously observed data. The LSTM (a well-known Recurrent Neural Network) model is applied as the best time series model for trade forecasting. We have been able to create a sustainable forecasting system for trade between China and the U.S. by closely monitoring a dataset published by the state website NZ Tatauranga Aotearoa from January 1, 2015, to April 30, 2021. Throughout the survey, we provided a 180-day forecast that outlined what would happen to trade between China and the U.S. during COVID-19. In addition, we have illustrated that the LSTM model provides outstanding outcomes in time series data analysis compared to RFR and SVR (both ML models). The study looks at how the current Covid outbreak affects China-U.S. trade. As a comparative study, the RMSE is calculated for LSTM, RFR, and SVR. From our time series analysis, it can be said that the LSTM model gives very favorable results regarding the future export situation in China-U.S. trade.
Keywords: RFR, China-U.S. trade war, SVR, LSTM, deep learning, Covid-19, export value, forecasting, time series analysis
Procedia PDF Downloads 198
2621 Multi-scale Spatial and Unified Temporal Feature-fusion Network for Multivariate Time Series Anomaly Detection
Authors: Hang Yang, Jichao Li, Kewei Yang, Tianyang Lei
Abstract:
Multivariate time series anomaly detection is a significant research topic in the field of data mining, encompassing a wide range of applications across various industrial sectors such as traffic roads, financial logistics, and corporate production. The inherent spatial dependencies and temporal characteristics present in multivariate time series introduce challenges to the anomaly detection task. Previous studies have typically been based on the assumption that all variables belong to the same spatial hierarchy, neglecting the multi-level spatial relationships. To address this challenge, this paper proposes a multi-scale spatial and unified temporal feature fusion network, denoted as MSUT-Net, for multivariate time series anomaly detection. The proposed model employs a multi-level modeling approach, incorporating both temporal and spatial modules. The spatial module is designed to capture the spatial characteristics of multivariate time series data, utilizing an adaptive graph structure learning model to identify the multi-level spatial relationships between data variables and their attributes. The temporal module consists of a unified temporal processing module, which is tasked with capturing the temporal features of multivariate time series. This module is capable of simultaneously identifying temporal dependencies among different variables. Extensive testing on multiple publicly available datasets confirms that MSUT-Net achieves superior performance on the majority of datasets. Our method is able to model and accurately detect systems data with multi-level spatial relationships from a spatial-temporal perspective, providing a novel perspective for anomaly detection analysis.
Keywords: data mining, industrial system, multivariate time series, anomaly detection
Procedia PDF Downloads 15
2620 Copula Autoregressive Methodology for Simulation of Solar Irradiance and Air Temperature Time Series for Solar Energy Forecasting
Authors: Andres F. Ramirez, Carlos F. Valencia
Abstract:
The increasing interest in renewable energy strategies and the path toward diminishing the use of carbon-related energy sources have encouraged the development of novel strategies for the integration of solar energy into the electricity network. A correct inclusion of the fluctuating energy output of a photovoltaic (PV) energy system into an electric grid requires improvements in the forecasting and simulation methodologies for solar energy potential, and an understanding not only of the mean value of the series but also of the associated underlying stochastic process. We present a methodology for the synthetic generation of solar irradiance (shortwave flux) and air temperature bivariate time series based on copula functions to represent the cross-dependence and temporal structure of the data. We explore the advantages of using this nonlinear time series method over traditional approaches that use a transformation of the data to normal distributions as an intermediate step. The use of copulas gives flexibility to represent the serial variability of the real data in the simulation and allows more control over the desired properties of the data. We use discrete zero-mass density distributions to assess the nature of solar irradiance, alongside vector generalized linear models for the time-dependent distributions of the bivariate time series. We found that the copula autoregressive methodology used, including the zero-mass characteristics of the solar irradiance time series, generates a significant improvement over state-of-the-art strategies. These results will help to better understand the fluctuating nature of solar energy forecasting and the underlying stochastic process, and to quantify the potential of integrating a photovoltaic (PV) energy generating system into a country's electricity network. Experimental analysis and real data application substantiate the usage and convenience of the proposed methodology to forecast solar irradiance time series and solar energy across northern hemisphere, southern hemisphere, and equatorial zones.
Keywords: copula autoregressive, solar irradiance forecasting, solar energy forecasting, time series generation
Procedia PDF Downloads 323
2619 Design Approach for the Development of Format-Flexible Packaging Machines
Authors: G. Götz, P. Stich, J. Backhaus, G. Reinhart
Abstract:
The rising demand for format-flexible packaging machines is caused by current market changes. Increasing the format-flexibility is a new goal for the packaging machine manufacturers' product development process. There are no methodical or design-orientated tools for a comprehensive consideration of this target. This paper defines the term format-flexibility in the context of packaging machines and shows the state-of-the-art for improving the changeover of production machines. The requirements for a new approach and the concept itself will be introduced, and the method elements will be explained. Finally, the use of the concept and the result of the development of a format-flexible packaging machine will be shown.
Keywords: packaging machine, format-flexibility, changeover, design method
Procedia PDF Downloads 434
2618 Fractal-Wavelet Based Techniques for Improving the Artificial Neural Network Models
Authors: Reza Bazargan lari, Mohammad H. Fattahi
Abstract:
Natural resources management, including water resources, requires reliable estimations of time-variant environmental parameters. Small improvements in the estimation of environmental parameters would have great effects on management decisions. Noise reduction using wavelet techniques is an effective approach for the pre-processing of practical data sets. The predictability enhancement of river flow time series is assessed using fractal approaches before and after applying wavelet-based pre-processing. Time series correlation and persistency, the minimum sufficient length for training the prediction model, and the maximum valid length of predictions were also investigated through a fractal assessment.
Keywords: wavelet, de-noising, predictability, time series fractal analysis, valid length, ANN
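A hedged sketch of the wavelet pre-processing step for a river-flow series follows; the wavelet, decomposition level, and threshold rule are illustrative assumptions rather than the settings used in the study.

```python
# Hedged sketch of wavelet pre-processing for a river-flow series: a db4
# decomposition with soft thresholding of the detail coefficients (wavelet,
# level and threshold rule are illustrative assumptions).
import numpy as np
import pywt

def wavelet_denoise(flow, wavelet="db4", level=3):
    coeffs = pywt.wavedec(flow, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise estimate from finest scale
    thresh = sigma * np.sqrt(2 * np.log(len(flow)))          # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[:len(flow)]
```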
Procedia PDF Downloads 368
2617 Islamic Research Methodology (I-Restmo): Eight Series Research Module with Islamic Value Concept
Authors: Noraizah Abu Bakar, Norhayati Alais, Nurdiana Azizan, Fatimah Alwi, Muhammad Zaky Razaly
Abstract:
This is a concise research module with the Islamic values concept, proposed to a group of researchers, potential researchers, and PhD and Master's scholars to prepare themselves for their studies. The intention of designing this module is to help and guide Malaysian citizens through their postgraduate studies. This is aligned with the 10th Malaysian Plan – MyBrain15. MyBrain15 is a financial aid for Malaysian citizens to pursue PhD and Master's programs. The program is part of the Ministry of Education's Strategic Plan to ensure that by the year 2013 there will be 60,000 PhD scholars in Malaysia. This module is suitable for social science researchers; however, it can also be a useful tool for science and technology researchers, such as those in the Engineering and Information Technology disciplines. The module consists of eight (8) series that provide a proper flow of information for doing research, with the Islamic values application provided in each of the series. This module is designed to produce future researchers with a comprehensive knowledge of humankind and the hereafter. The uniqueness of this research module is that it is designed based on the Islamic values concept. Researchers are able to understand the proper research process and simultaneously open their minds to understand Islam more closely. The application of Islamic values in each series could trigger broader ideas for researchers to examine in greater depth knowledge related to the humanities.
Keywords: Eight Series Research Module, Islamic Values concept, Teaching Methodology, Flow of Information, Epistemology of research
Procedia PDF Downloads 399
2616 Impact of Series Reactive Compensation on Increasing a Distribution Network Distributed Generation Hosting Capacity
Authors: Moataz Ammar, Ahdab Elmorshedy
Abstract:
The distributed generation hosting capacity of a distribution network is typically limited at a given connection point by the upper voltage limit that can be violated due to the injection of active power into the distribution network. The upper voltage limit violation concern becomes more important as the network equivalent resistance increases with respect to its equivalent reactance. This paper investigates the impact of modifying the distribution network equivalent reactance at the point of connection such that the upper voltage limit is violated at a higher distributed generation penetration, than it would without the addition of series reactive compensation. The results show that series reactive compensation proves efficient in certain situations (based on the ratio of equivalent network reactance to equivalent network resistance at the point of connection). As opposed to the conventional case of capacitive compensation of a distribution network to reduce voltage drop, inductive compensation is seen to be more appropriate for alleviation of distributed-generation-induced voltage rise.
Keywords: distributed generation, distribution networks, series compensation, voltage rise
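The mechanism can be illustrated with the usual approximation for the voltage change at the connection point, ΔV ≈ (P·R − Q·X)/V; the sketch below evaluates it with and without added series reactance, using placeholder numbers and assuming the DG absorbs reactive power.

```python
# Back-of-the-envelope sketch of the mechanism described above: approximate
# voltage rise at the connection point, dV ≈ (P*R - Q*X) / V, evaluated with
# and without extra series reactance; all numbers are placeholders and the
# DG is assumed to absorb reactive power.
R, X = 0.5, 0.25          # equivalent network resistance/reactance at the PCC (ohm)
V = 11e3                  # nominal line voltage (V)
P = 2e6                   # DG active power injection (W)
Q = 0.3e6                 # reactive power absorbed by the DG (var)

for x_added in (0.0, 0.25, 0.5):          # series reactive compensation added (ohm)
    dv = (P * R - Q * (X + x_added)) / V
    print(f"added X = {x_added:.2f} ohm -> voltage rise ≈ {dv:.1f} V "
          f"({100 * dv / V:.2f} % of nominal)")
```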
Procedia PDF Downloads 395
2615 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method
Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri
Abstract:
Recent research in neural network science and neuroscience for modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios for one- and two-dimensional data series with different distributions are investigated through simulation; noise is then added to the data distribution to create differently disordered time series and to evaluate the algorithm's locality prediction of nonlinearity. The performance of the algorithm is then simulated, and its sensitivity to the data distribution, as well as the influence of the important local-validity parameter under different data distributions, is explained for cases where the data dimension is high or the number of data points is small.
Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method
Procedia PDF Downloads 502
2614 Power Quality Audit Using Fluke Analyzer
Authors: N. Ravikumar, S. Krishnan, B. Yokeshkumar
Abstract:
Nowadays, power quality issues are increasing due to non-linear loads like fridges, air conditioners, washing machines, induction motors, etc. These power quality issues affect the output voltage, output current, and output power, and thus the total performance of the generator. This paper explains how to test a generator using the Fluke 435 II series power quality analyser. The Fluke 435 II series power quality analyser is used to measure the voltage, current, power, energy, total harmonic distortion (THD), current harmonics, voltage harmonics, power factor, and frequency. The Fluke 435 II series power quality analyser has several advantages: i) it records output in analog and digital format; ii) it records a measurement every 0.25 s; iii) it also measures all the electrical parameters at the same time.
Keywords: THD, harmonics, power quality, TNEB, Fluke 435
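One of the quantities the analyser reports, total harmonic distortion, can be computed from a sampled waveform as in the sketch below; the synthetic waveform and sampling settings are placeholders, not Fluke measurements.

```python
# Illustrative calculation of one quantity the analyser reports, total harmonic
# distortion (THD), from a sampled voltage waveform (the synthetic waveform and
# sampling settings are placeholders, not Fluke measurements).
import numpy as np

fs, f0 = 10_000, 50                         # sampling rate (Hz) and fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)
v = (230 * np.sin(2 * np.pi * f0 * t)
     + 20 * np.sin(2 * np.pi * 3 * f0 * t)
     + 10 * np.sin(2 * np.pi * 5 * f0 * t))  # fundamental plus 3rd and 5th harmonics

spectrum = np.abs(np.fft.rfft(v)) / len(v)
freqs = np.fft.rfftfreq(len(v), 1 / fs)
fund = spectrum[np.argmin(np.abs(freqs - f0))]
harmonics = [spectrum[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 26)]
thd = np.sqrt(sum(h**2 for h in harmonics)) / fund
print(f"THD ≈ {100 * thd:.1f} %")
```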
Procedia PDF Downloads 177
2613 Video on Demand (VOD) Industry in Iran: Study of Reasons of Increasing Film and Series Platforms
Authors: Narges Hamidipour
Abstract:
VOD, which stands for "video on demand", is a way of watching movies and series on web platforms through which individuals can access a large amount of video content by paying a subscription. The first platform in Iran was founded in 2014, and in the last 10 years, such platforms have become a main part of the movie and series industry. There are 374 VOD platforms in Iran, but just three of them are in the mainstream. However, over these years, they have developed and gained fame in different ways. This article focuses on the reasons for this development in the past years. For the framework, "digital economy", "media industries", and "political economy" have been used together with the interview method. In this research, some experts at SATRA (the regulatory organization of inclusive audio and video media in Iran), owners or managers of VODs, and some others who have been directly involved in the system conveyed their opinions. In addition, some documents and statistical analyses are used to reach more complete results.
Keywords: digital economy, political economy, VOD, interview, Iran
Procedia PDF Downloads 66
2612 A Review of Different Studies on Hidden Markov Models for Multi-Temporal Satellite Images: Stationarity and Non-Stationarity Issues
Authors: Ali Ben Abbes, Imed Riadh Farah
Abstract:
Due to the considerable advances in Multi-Temporal Satellite Images (MTSI), remote sensing applications have become more accurate. Recently, many advances in modeling MTSI have been developed using various models. The purpose of this article is to present an overview of studies using the Hidden Markov Model (HMM). First of all, we provide a background on the use of HMMs and their applications in this context. A comparison of the different works is discussed, and possible areas and challenges are highlighted. Secondly, we discuss the differences with respect to vegetation monitoring as well as urban growth. Nevertheless, most research efforts have used only stationary data. From another point of view, in this paper, we describe a new non-stationary HMM that is defined with a set of parts of the time series, e.g., seasonal, trend, and random components. In addition, a new approach is proposed that gives more accurate results and improves the applicability of the HMM in modeling non-stationary data series. In order to assess the performance of the HMM, different experiments are carried out using Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI time series of the northwestern region of Tunisia and Landsat time series of Tres Cantos, Madrid, in Spain.
Keywords: multi-temporal satellite image, HMM, non-stationarity, vegetation, urban
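For comparison with the stationary baseline discussed above, a hedged sketch of fitting a Gaussian HMM to an NDVI series with hmmlearn follows; the number of states and the input file are assumptions, and the non-stationary variant would additionally condition on seasonal and trend components.

```python
# Hedged sketch of fitting a stationary Gaussian HMM to an NDVI time series
# with hmmlearn (number of states and input file are assumptions); the
# non-stationary variant discussed above would additionally condition on
# seasonal/trend parts of the series.
import numpy as np
from hmmlearn.hmm import GaussianHMM

ndvi = np.loadtxt("modis_ndvi_series.txt")          # hypothetical 1-D NDVI series
X = ndvi.reshape(-1, 1)

model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=200, random_state=0)
model.fit(X)
states = model.predict(X)                           # most likely state sequence
print("state means:", model.means_.ravel())
```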
Procedia PDF Downloads 354