Search results for: empirical distributions
1001 Unscented Grid Filtering and Smoothing for Nonlinear Time Series Analysis
Authors: Nikolay Nikolaev, Evgueni Smirnov
Abstract:
This paper develops an unscented grid-based filter and a smoother for accurate nonlinear modeling and analysis of time series. The filter uses unscented deterministic sampling during both the time and measurement updating phases to approximate directly the distributions of the latent state variable. A complementary grid smoother is also developed to enable computation of the likelihood. This allows us to formulate an expectation maximisation algorithm for maximum likelihood estimation of the state noise and the observation noise. Empirical investigations show that the proposed unscented grid filter/smoother compares favourably to other similar filters on nonlinear estimation tasks.
1000 Investigation on Fischer-Tropsch Synthesis over Cobalt-Gadolinium Catalyst
Authors: Jian Huang, Weixin Qian, Haitao Zhang, Weiyong Ying
Abstract:
A cobalt-gadolinium catalyst for Fischer-Tropsch synthesis was prepared by the impregnation method with commercial silica gel, and its texture properties were characterized by BET, XRD, and TPR. The catalytic performance of the catalyst was tested in a fixed bed reactor. The results showed that the addition of gadolinium to the cobalt catalyst might decrease the size of the cobalt particles and increase the dispersion of the catalytically active cobalt phases. The carbon number distributions for the catalysts were calculated by the ASF equation.
Keywords: Fischer-Tropsch synthesis, cobalt-based catalysts, gadolinium, carbon number distributions.
999 A Brief Study about Nonparametric Adherence Tests
Authors: Vinicius R. Domingues, Luan C. S. M. Ozelim
Abstract:
Statistical study has become indispensable in various fields of knowledge. Geotechnics is no different: there, the study of probabilistic and statistical methods has gained ground, considering their use in characterizing the uncertainties inherent in soil properties. One situation engineers constantly face is the definition of a probability distribution that significantly represents the sampled data. To be able to discard poorly fitting distributions, goodness-of-fit tests are necessary. In this paper, three non-parametric goodness-of-fit tests are applied to a computationally generated data set in order to test their fit to a series of known distributions. It is shown that the use of the normal distribution does not always provide satisfactory results regarding the physical and behavioral representation of the modeled parameters.
Keywords: Kolmogorov-Smirnov, Anderson-Darling, Cramer-von Mises, nonparametric adherence tests.
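As a rough illustration of such adherence tests (a minimal sketch, not the authors' code; the lognormal sample standing in for a soil property is an assumption), the three statistics can be computed with SciPy:

```python
# Minimal sketch of the three adherence tests applied to synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
sample = rng.lognormal(mean=2.0, sigma=0.5, size=500)  # e.g. a soil property

# Kolmogorov-Smirnov against a normal with parameters fitted from the data
mu, sigma = sample.mean(), sample.std(ddof=1)
ks = stats.kstest(sample, "norm", args=(mu, sigma))
print("KS : D=%.4f, p=%.4f" % (ks.statistic, ks.pvalue))

# Anderson-Darling for normality (returns critical values, not a p-value)
ad = stats.anderson(sample, dist="norm")
print("AD : A2=%.4f, crit(5%%)=%.4f" % (ad.statistic, ad.critical_values[2]))

# Cramer-von Mises against the same fitted normal (SciPy >= 1.6)
cvm = stats.cramervonmises(sample, "norm", args=(mu, sigma))
print("CvM: W2=%.4f, p=%.4f" % (cvm.statistic, cvm.pvalue))
```

Note that estimating the reference distribution's parameters from the same sample, as above, makes the tabulated p-values only approximate.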
998 Experimental Study of Thermal Environment in a Room with Mixing Ventilation
Authors: Dong-Mei Pan, Liang XIA, Ming-Yin Chan
Abstract:
This paper reports an experimental study on a sleeping thermal manikin in a room equipped with a mixing ventilation system. In the experimental work, the heat loss from the sleeping thermal manikin was measured under different conditions, with the supply air temperature ranging from 17°C to 27°C. Apart from the heat loss of the sleeping thermal manikin, the velocity and temperature distributions were also measured in the experiments for subsequent analysis.
Keywords: Sleeping Environment, Mixing Ventilation System
997 The Effects of Misspecification of Stochastic Processes on Investment Appraisal
Authors: George Yungchih Wang
Abstract:
For decades, financial economists have attempted to determine the optimal investment policy by recognizing the option value embedded in irreversible investment whose project value evolves as a geometric Brownian motion (GBM). This paper aims to examine the effects of the optimal investment trigger and of the misspecification of stochastic processes on investment in real options applications. Specifically, the former explores the consequence of adopting optimal investment rules on the distributions of corporate value under the correct assumption of the stochastic process, while the latter analyzes the influence on the distributions of corporate value of the misspecification of stochastic processes, i.e., mistaking an alternative process for a GBM. It is found that adopting the correct optimal investment policy may increase corporate value by shifting the value distribution rightward, and the misspecification effect may decrease corporate value by shifting the value distribution leftward. The adoption of the optimal investment trigger has a major impact on investment, to such an extent that the downside risk of investment is truncated at a project value of zero, thereby moving the value distributions rightward. The analytical framework is also extended to situations where collection lags are in place, and the result indicates that collection lags reduce the effects of the investment trigger and of misspecification on investment, acting in the opposite direction.
Keywords: GBM, real options, investment trigger, misspecification, collection lags
996 Constrained Particle Swarm Optimization of Supply Chains
Authors: András Király, Tamás Varga, János Abonyi
Abstract:
Since supply chains strongly affect the financial performance of companies, it is important to optimize and analyze their Key Performance Indicators (KPIs). The synergistic combination of Particle Swarm Optimization (PSO) and Monte Carlo simulation is applied to determine the optimal reorder points of warehouses in supply chains. The goal of the optimization is the minimization of an objective function calculated as the linear combination of holding and order costs. The required service levels of the warehouses represent non-linear constraints in the PSO. The results illustrate that the developed stochastic simulator and optimization tool is flexible enough to handle complex situations.
Keywords: stochastic processes, empirical distributions, Monte Carlo simulation, PSO, supply chain management
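As an illustration of this combination (the Poisson demand model, toy cost weights, and crude fill-rate proxy below are assumptions, not the paper's simulator), a global-best PSO can search over reorder points with the service-level constraint handled as a penalty:

```python
# Sketch: PSO over warehouse reorder points with Monte Carlo demand and a
# penalty term enforcing the required service levels (soft constraint).
import numpy as np

rng = np.random.default_rng(0)
N_WH, SWARM, ITERS = 3, 20, 100               # warehouses, particles, iterations
H_COST, O_COST, TARGET_SL = 1.0, 50.0, 0.95   # assumed cost weights / target

def objective(reorder_pts):
    """Monte Carlo estimate of holding + order cost, penalised when the
    simulated service level falls below the target."""
    demand = rng.poisson(lam=20.0, size=(500, N_WH))   # assumed demand model
    served = (demand <= reorder_pts).mean(axis=0)      # crude fill-rate proxy
    cost = H_COST * reorder_pts.sum() + O_COST * (demand > reorder_pts).mean()
    penalty = 1e4 * np.clip(TARGET_SL - served, 0.0, None).sum()
    return cost + penalty

# Standard global-best PSO loop
pos = rng.uniform(0, 60, size=(SWARM, N_WH))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((2, SWARM, N_WH))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 100)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("near-optimal reorder points ~", np.round(gbest, 1))
```

The penalty formulation is one common way to fold non-linear service-level constraints into an unconstrained PSO objective.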
995 A Methodology for Characterising the Tail Behaviour of a Distribution
Authors: Serge Provost, Yishan Zang
Abstract:
Following a review of various approaches utilized for classifying the tail behavior of a distribution, an easily implementable methodology that relies on an arctangent transformation is presented. The classification criterion is based on the difference between two specific quantiles of the transformed distribution. The resulting categories enable one to classify distributional tails as distinctly short, short, nearly medium, medium, extended medium and somewhat long, provided that at least two moments exist. Distributions possessing a single moment are said to be long-tailed, while those failing to have any finite moments are classified as having an extremely long tail. Several illustrative examples are presented.
Keywords: Arctangent transformation, change of variables, heavy-tailed distributions, tail classification.
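A minimal sketch of the underlying idea (the quantile levels are illustrative assumptions, not the paper's calibrated criterion): the arctangent maps the real line into (-π/2, π/2), and heavier tails push the transformed upper quantiles closer to π/2, so a quantile difference discriminates tail weight:

```python
# Compare the spread between two upper quantiles of arctan-transformed samples.
import numpy as np

def tail_spread(sample, q1=0.99, q2=0.999):
    t = np.arctan(np.asarray(sample))        # maps R onto (-pi/2, pi/2)
    return np.quantile(t, q2) - np.quantile(t, q1)

rng = np.random.default_rng(1)
for name, s in [("normal", rng.normal(size=100_000)),
                ("exponential", rng.exponential(size=100_000)),
                ("Cauchy", rng.standard_cauchy(size=100_000))]:
    print(f"{name:12s} spread = {tail_spread(s):.5f}")
# The spread grows with tail weight: smallest for the normal, largest for Cauchy.
```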
994 Development of Maximum Entropy Method for Prediction of Droplet-size Distribution in Primary Breakup Region of Spray
Authors: E. Movahednejad, F. Ommi
Abstract:
Droplet size distributions in the cold spray of a fuel are important to the observed combustion behavior. Specification of droplet size and velocity distributions immediately downstream of injectors is also essential as boundary conditions for advanced computational fluid dynamics (CFD) and two-phase spray transport calculations. This paper describes the development of a new model to be incorporated into the maximum entropy principle (MEP) formalism for prediction of the droplet size distribution in the droplet formation region. The MEP approach can predict the most likely droplet size and velocity distributions under a set of constraints expressing the available information related to the distribution. By considering the mechanisms of turbulence generation inside the nozzle and wave growth on the jet surface, this article attempts to provide a logical framework coupling the flow inside the nozzle to the resulting atomization process, with the sub-models coupled together using source terms of momentum and energy. Comparison between the model predictions and experimental data for a gas turbine swirling nozzle and an annular spray indicates good agreement between model and experiment.
Keywords: Droplet, instability, Size Distribution, Turbulence, Maximum Entropy
993 Regionalization of IDF Curves with L-Moments for Storm Events
Authors: Noratiqah Mohd Ariff, Abdul Aziz Jemain, Mohd Aftar Abu Bakar
Abstract:
The construction of Intensity-Duration-Frequency (IDF) curves is one of the most common and useful tools for designing hydraulic structures and for providing a mathematical relationship between rainfall characteristics. IDF curves, especially those in Peninsular Malaysia, are often built using moving windows of rainfall. However, these windows do not represent actual rainfall events, since the duration of rainfall is prefixed. Hence, instead of using moving windows, this study aims to find regionalized distributions for IDF curves of extreme rainfalls based on storm events. A homogeneity test is performed on the annual maxima of storm intensities to identify homogeneous regions of storms in Peninsular Malaysia. The L-moment method is then used to regionalize the Generalized Extreme Value (GEV) distribution of these annual maxima, and subsequently, IDF curves are constructed using the regional distributions. The differences between the IDF curves obtained and those found using at-site GEV distributions are assessed through the coefficient of variation of the root mean square error, the mean percentage difference and the coefficient of determination. The small differences imply that the construction of IDF curves could be simplified by finding a general probability distribution for each region. This will also help in constructing IDF curves for sites with no rainfall station.
Keywords: IDF curves, L-moments, regionalization, storm events.
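For the at-site building block, a sketch of GEV fitting by sample L-moments using Hosking's (1985) approximation is given below; the regional pooling step over a homogeneous region is omitted, and the synthetic Gumbel sample is an assumption:

```python
# At-site GEV parameter estimation from the first three sample L-moments.
import numpy as np
from scipy.special import gamma

def sample_lmoments(x):
    """First three sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x))
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2               # lambda1, lambda2, L-skewness t3

def gev_from_lmoments(l1, l2, t3):
    """GEV (location, scale, shape k) via Hosking's approximation; k near 0
    corresponds to the Gumbel limit."""
    c = 2.0 / (3.0 + t3) - np.log(2) / np.log(3)
    k = 7.8590 * c + 2.9554 * c ** 2
    alpha = l2 * k / ((1 - 2.0 ** (-k)) * gamma(1 + k))
    xi = l1 - alpha * (1 - gamma(1 + k)) / k
    return xi, alpha, k

rng = np.random.default_rng(7)
annual_max = rng.gumbel(loc=50, scale=12, size=60)   # synthetic storm maxima
print(gev_from_lmoments(*sample_lmoments(annual_max)))  # expect k close to 0
```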
992 Empirical Evaluation of Performance Optimization Techniques Used in Mobile Applications
Authors: Nathar Shah, Bu Kiat Seng
Abstract:
Mobile application development differs from regular application development due to the hardware resource limitations of mobile platforms. In the mobile environment, the application needs to be optimized by the developer to produce optimal software with the least overhead. This study discusses performance optimization techniques employed in general application development and how such techniques perform on mobile platforms, through empirical evaluations on a mobile emulator and on Nokia X3-02 and Nokia C5-03 devices. The scope of the work is confined to mobile platforms based on the Java Mobile Edition architecture. The empirical results showed that techniques such as loop unrolling, dependency chains, and linearized getters and setters performed better by a factor of 3 to 7, whereas declaration and initialization on the same line versus separate lines did not improve performance.
Keywords: Optimization Techniques, Mobile Applications, Performance Evaluation, J2ME, Empirical Experiments
991 A Novel Instantaneous Frequency Computation Approach for Empirical Mode Decomposition
Authors: Liming Zhang
Abstract:
This paper introduces a new instantaneous frequency computation approach, Counting Instantaneous Frequency, for a general class of signals called simple waves. The class of simple waves contains a wide range of continuous signals for which the concept of instantaneous frequency has a perfect physical sense. The concept of Counting Instantaneous Frequency also applies to all discrete data. For all simple wave signals and discrete data, Counting Instantaneous Frequency can be computed directly, without a signal decomposition process. The intrinsic mode functions obtained through empirical mode decomposition belong to the class of simple waves, so Counting Instantaneous Frequency can be used together with empirical mode decomposition.
Keywords: Instantaneous frequency, empirical mode decomposition, intrinsic mode function.
990 Computer Verification in Cryptography
Authors: Markus Kaiser, Johannes Buchmann
Abstract:
In this paper we explore the application of a formal proof system to verification problems in cryptography. Cryptographic properties concerning the correctness or security of cryptographic algorithms are of great interest. Besides some basic lemmata, we explore an implementation of a complex function that is used in cryptography. More precisely, we describe formal properties of this implementation that we prove with the computer. We describe formalized probability distributions (σ-algebras, probability spaces and conditional probabilities). These are given in the formal language of the formal proof system Isabelle/HOL. Moreover, we computer-prove Bayes' Formula, and we describe an application of the presented formalized probability distributions to cryptography. Furthermore, this paper shows that computer proofs of complex cryptographic functions are possible, by presenting an implementation of the Miller-Rabin primality test that admits formal verification. Our achievements are a step towards computer verification of cryptographic primitives. They describe a basis for computer verification in cryptography. Computer verification can be applied to further problems in cryptographic research if the corresponding basic mathematical knowledge is available in a database.
Keywords: prime numbers, primality tests, (conditional) probability distributions, formal proof system, higher-order logic, formal verification, Bayes' Formula, Miller-Rabin primality test.
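For reference, a plain (unverified) Python rendering of the Miller-Rabin test that the paper formally verifies might look as follows:

```python
# Standard probabilistic Miller-Rabin primality test.
import random

def miller_rabin(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):           # quick trial division
        if n % p == 0:
            return n == p
    # write n - 1 = 2^s * d with d odd
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                      # a witnesses compositeness
    return True                                # probably prime

print(miller_rabin(2**61 - 1))  # True: a Mersenne prime
print(miller_rabin(2**61 + 1))  # False: divisible by 3
```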
989 On the Analysis of IP Traffic Distribution in the Network of Suranaree University of Technology
Authors: Paramet Nualmuenwai, Chutima Prommak
Abstract:
This paper presents an analysis of IP traffic collected from the network of Suranaree University of Technology using software based on the Simple Network Management Protocol (SNMP). In particular, we analyze the distribution of the aggregated traffic during the hours of peak load and light load. Traffic profiles, including the parameters describing the traffic distributions, were derived. From a statistical analysis applying three different methods, the Kolmogorov-Smirnov test, the Anderson-Darling test, and the Chi-squared test, we found that the IP traffic distribution is non-normal and that the distributions during peak load and light load are different. The experimental study and analysis show high uncertainty in the IP traffic.
Keywords: IP traffic analysis, IP traffic distribution, Traffic uncertainty
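The peak-load versus light-load comparison can be sketched with a two-sample Kolmogorov-Smirnov test; the lognormal byte counts below merely stand in for the SNMP-collected traffic:

```python
# Normality check and peak-vs-light comparison on synthetic traffic volumes.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
peak = rng.lognormal(mean=10.0, sigma=1.2, size=2000)   # assumed traffic model
light = rng.lognormal(mean=8.5, sigma=0.9, size=2000)

# Normality is rejected for standardized lognormal data
print(stats.kstest((peak - peak.mean()) / peak.std(), "norm"))
# A small p-value here indicates the two load regimes differ in distribution
print(stats.ks_2samp(peak, light))
```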
988 The Dividend Payments for General Claim Size Distributions under Interest Rate
Authors: Li-Li Li, Jinghai Feng, Lixin Song
Abstract:
This paper evaluates the dividend payments for general claim size distributions in the presence of a dividend barrier. The surplus of a company is modeled using the classical risk process perturbed by diffusion and, in addition, is assumed to accrue interest at a constant rate. After presenting the integro-differential equation with initial conditions that the dividend payments satisfy, the paper derives a useful expression for the dividend payments by employing the theory of Volterra equations. Furthermore, the optimal value of the dividend barrier is found. Finally, numerical examples illustrate the optimality of the optimal dividend barrier and the effects of the parameters on the dividend payments.
Keywords: Dividend payout, Integro-differential equation, Jump-diffusion model, Volterra equation
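As a generic illustration of the Volterra machinery invoked above (a toy kernel and forcing term, not the paper's dividend model), a Volterra integral equation of the second kind can be solved numerically with the trapezoidal rule:

```python
# Solve f(x) = g(x) + integral_0^x K(x,t) f(t) dt by marching with trapezoids.
import numpy as np

def solve_volterra(g, K, x):
    h = x[1] - x[0]
    f = np.empty_like(x)
    f[0] = g(x[0])
    for i in range(1, len(x)):
        acc = 0.5 * K(x[i], x[0]) * f[0]
        acc += sum(K(x[i], x[j]) * f[j] for j in range(1, i))
        # implicit endpoint term solved for f[i]
        f[i] = (g(x[i]) + h * acc) / (1.0 - 0.5 * h * K(x[i], x[i]))
    return f

# Toy check: f(x) = 1 + integral_0^x f(t) dt has exact solution e^x.
x = np.linspace(0.0, 1.0, 201)
f = solve_volterra(lambda s: 1.0, lambda s, t: 1.0, x)
print(np.abs(f - np.exp(x)).max())   # small discretization error
```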
987 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing
Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig
Abstract:
Empirical mode decomposition (EMD), a new data-driven method of time-series decomposition, has the advantage of not supposing that a time series is linear or stationary, as is implicitly assumed in Fourier decomposition. However, EMD suffers from the mode mixing problem in some cases. The aim of this paper is to present a solution for a common type of signal that causes the EMD mode mixing problem, namely a signal suffering from intermittency. On an artificial example, the solution shows superior performance in coping with the EMD mode mixing problem compared with conventional EMD and Ensemble Empirical Mode Decomposition (EEMD). Furthermore, the over-sifting problem is completely avoided, and the computational load is reduced roughly six-fold compared with EEMD with an ensemble number of 50.
Keywords: Empirical mode decomposition, mode mixing, sifting process, over-sifting.
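For context, one sifting iteration, the operation whose repetition produces intrinsic mode functions and whose behavior under intermittency causes mode mixing, can be sketched as follows (the crude endpoint handling is an assumption):

```python
# One EMD sifting step: spline envelopes through extrema, subtract their mean.
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def sift_once(t, x):
    imax = argrelextrema(x, np.greater)[0]
    imin = argrelextrema(x, np.less)[0]
    # crude end handling: pin both envelopes to the signal's endpoints
    imax = np.r_[0, imax, len(x) - 1]
    imin = np.r_[0, imin, len(x) - 1]
    upper = CubicSpline(t[imax], x[imax])(t)
    lower = CubicSpline(t[imin], x[imin])(t)
    return x - 0.5 * (upper + lower)      # candidate intrinsic mode function

t = np.linspace(0, 1, 2000)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)
h = sift_once(t, x)
# Repeating this until a stopping criterion holds yields the first IMF; an
# intermittent high-frequency component in x is what triggers mode mixing.
```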
986 Ranking - Convex Risk Minimization
Authors: Wojciech Rejchel
Abstract:
The problem of ranking (rank regression) has become popular in the machine learning community. This theory relates to problems in which one has to predict (guess) the order between objects on the basis of vectors describing their observed features. In many ranking algorithms a convex loss function is used instead of the 0-1 loss, which makes these procedures computationally efficient. Hence, convex risk minimizers and their statistical properties are investigated in this paper. Fast rates of convergence are obtained under conditions that look similar to the ones from classification theory. The methods used in this paper come from the theory of U-processes as well as empirical processes.
Keywords: Convex loss function, empirical risk minimization, empirical process, U-process, boosting, Euclidean family.
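A minimal sketch of ranking by convex risk minimization (a logistic surrogate loss on pairwise score differences; the linear scorer and synthetic data model are assumptions):

```python
# Pairwise ranking: replace the 0-1 loss on pairs by the convex logistic loss.
import numpy as np
from scipy.special import expit

rng = np.random.default_rng(5)
n, d = 200, 3
X = rng.normal(size=(n, d))
scores = X @ np.array([2.0, -1.0, 0.5]) + 0.3 * rng.normal(size=n)

# all ordered pairs (i, j) with scores[i] > scores[j]
i_idx, j_idx = np.where(scores[:, None] > scores[None, :])
D = X[i_idx] - X[j_idx]                   # want w such that D @ w > 0

w = np.zeros(d)
for _ in range(500):
    m = D @ w                              # pairwise margins
    grad = -(D * expit(-m)[:, None]).mean(axis=0)  # grad of log(1 + e^{-m})
    w -= 0.5 * grad
print("learned ranking direction:", np.round(w / np.linalg.norm(w), 3))
```

Minimizing the convex surrogate over all pairs is exactly the empirical-risk-minimization setup whose U-process structure the paper analyzes.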
985 A Numerical Simulation of the Indoor Air Flow
Authors: Karel Frana, Jianshun S. Zhang, Milos Muller
Abstract:
The indoor airflow with mixed natural/forced convection was numerically calculated using both laminar and turbulent approaches. The Boussinesq approximation was adopted to simplify the mathematical model and the calculations. The results obtained, such as mean velocity fields, were successfully compared with experimental PIV flow visualizations. The effect of the distance between the cooled wall and the heat exchanger on the temperature and velocity distributions was calculated. In a room with a simple shape, the computational code OpenFOAM demonstrated an ability to numerically predict flow patterns. Furthermore, the numerical techniques, boundary-condition types and computational grid quality were examined. Using the k-omega turbulence model had a significant effect on the results, influencing the temperature and velocity distributions.
Keywords: natural and forced convections, numerical simulations, indoor airflows.
984 A Dose Distribution Approach Using Monte Carlo Simulation in Dosimetric Accuracy Calculation for Treating the Lung Tumor
Authors: Md Abdullah Al Mashud, M. Tariquzzaman, M. Jahangir Alam, Tapan Kumar Godder, M. Mahbubur Rahman
Abstract:
This paper presents Monte Carlo (MC) method-based dose distributions on a lung tumor for a 6 MV photon beam, to improve the dosimetric accuracy of cancer treatment. Polystyrene, a tissue-equivalent material matching the lung tumor density, is used in this research. In the empirical calculations, the TRS-398 formalism of the IAEA has been used, and the setup was made according to the ICRU recommendations. The research outcomes were compared with state-of-the-art experimental results. From the experimental results, it is observed that the proposed approach provides more accurate results than the existing approaches. The average percentage variation between measured and TPS-simulated values was 1.337±0.531, which shows a substantial improvement compared with the state-of-the-art technology.
Keywords: Lung tumor, Monte Carlo, polystyrene, Elekta Synergy, Monaco Planning System.
983 The Effect of Nonnormality on CB-SEM and PLS-SEM Path Estimates
Authors: Z. Jannoo, B. W. Yap, N. Auchoybur, M. A. Lazim
Abstract:
The two common approaches to Structural Equation Modeling (SEM) are Covariance-Based SEM (CB-SEM) and Partial Least Squares SEM (PLS-SEM). There is much debate on the performance of CB-SEM and PLS-SEM for small sample sizes and when distributions are nonnormal. This study evaluates the performance of CB-SEM and PLS-SEM under normality and nonnormality conditions via simulation. Monte Carlo simulation in the R programming language was employed to generate data based on a theoretical model with one endogenous and four exogenous variables, each latent variable having three indicators. For normal distributions, CB-SEM estimates were found to be inaccurate for small sample sizes, while PLS-SEM could still produce the path estimates. Meanwhile, for larger sample sizes, CB-SEM estimates have lower variability than those of PLS-SEM. Under nonnormality, CB-SEM path estimates were inaccurate for small sample sizes; however, CB-SEM estimates are more accurate than those of PLS-SEM for sample sizes of 50 and above. The PLS-SEM estimates are not accurate unless the sample size is very large.
Keywords: CB-SEM, Monte Carlo simulation, Normality conditions, Nonnormality, PLS-SEM.
982 Micromechanical Modeling of Fiber-Matrix Debonding in Unidirectional Composites
Authors: M. Palizvan, M. T. Abadi, M. H. Sadr
Abstract:
Due to variations in damage mechanisms at the microscale, the behavior of fiber-reinforced composites is nonlinear and difficult to model. To exploit computational advantages, the homogenization method is applied to the micro-scale model in order to minimize the cost at the expense of detail in the local microscale phenomena. In this paper, the effective stiffness is calculated through homogenization of the nonlinear behavior of a composite representative volume element (RVE) containing fiber-matrix debonding. The damage modes for the RVE are considered by using cohesive elements and contacts for the cohesive behavior of the interface between fiber and matrix. To predict more realistic responses of composite materials, different random distributions of fibers are proposed besides square and hexagonal arrays. It is shown that in some cases the damage behavior differs considerably between fiber distributions. A comprehensive comparison has been made between the resulting response graphs.
Keywords: Homogenization, cohesive zone model, fiber-matrix debonding, RVE.
981 The Effects of Quality of Web-Based Applications on Competitive Advantage: An Empirical Study in Commercial Banks in Jordan
Authors: Faisal Asad Aburub
Abstract:
Many organizations invest in web applications and technologies in order to be competitive, yet some of them fail to achieve their goals. The quality of web-based applications can play an important role in enabling organizations to compete. The aim of this study is therefore to investigate the impact of the quality of web-based applications on achieving a competitive advantage. A new model has been developed, and an empirical investigation was performed in the banking sector in Jordan to test it. The results show that the impact of web-based applications on competitive advantage is significant. Finally, further work is planned to validate and evaluate the proposed model in several other domains.
Keywords: Competitive advantage, web-based applications, empirical investigation.
980 Electromagnetic Field Modeling in Human Tissue
Authors: Iliana Marinova, Valentin Mateev
Abstract:
For investigations of electromagnetic field distributions in biological structures by the Finite Element Method (FEM), a method for automatic 3D model building of human anatomical objects is developed. Models are built from meshed structures with specific electromagnetic material properties for each tissue type. The mesh is built according to specific FEM criteria for achieving good solution accuracy. Several FEM models of anatomical objects are built. A formulation using the magnetic vector potential and the scalar electric potential (A-V, A) is used for modeling electromagnetic fields in human tissue objects. The developed models are suitable for investigations of electromagnetic field distributions in human tissues exposed to external fields during magnetic stimulation, defibrillation, impedance tomography, etc.
Keywords: electromagnetic field, finite element method, human tissue.
979 Density Estimation using Generalized Linear Model and a Linear Combination of Gaussians
Authors: Aly Farag, Ayman El-Baz, Refaat Mohamed
Abstract:
In this paper we present a novel approach to density estimation. The proposed approach is based on using the logistic regression model to obtain an initial estimate of the given empirical density. The empirical data do not exactly follow the logistic regression model, so there will be a deviation between the empirical density and the density estimated by the logistic regression model. This deviation may be positive and/or negative. We use a linear combination of Gaussians (LCG) with positive and negative components as a model for this deviation, and the expectation maximization (EM) algorithm to estimate the parameters of the LCG. Experiments on real images demonstrate the accuracy of our approach.
Keywords: Logistic regression model, Expectation maximization, Segmentation.
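A simplified sketch of the EM refinement step is given below; note that the paper's LCG admits negative component weights, whereas this illustration uses an ordinary positive-weight Gaussian mixture:

```python
# EM for a two-component 1-D Gaussian mixture (positive weights only).
import numpy as np

rng = np.random.default_rng(2)
data = np.r_[rng.normal(-2, 0.7, 600), rng.normal(1.5, 1.0, 400)]

K = 2
pi_, mu, var = np.full(K, 1 / K), np.array([-1.0, 1.0]), np.ones(K)
for _ in range(100):
    # E-step: responsibility of each component for each point
    dens = np.exp(-0.5 * (data[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    r = pi_ * dens
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and variances
    nk = r.sum(axis=0)
    pi_, mu = nk / len(data), (r * data[:, None]).sum(axis=0) / nk
    var = (r * (data[:, None] - mu) ** 2).sum(axis=0) / nk
print(np.round(pi_, 2), np.round(mu, 2), np.round(var, 2))
```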
978 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the original Grubbs and Beck test with the LP3 distribution. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L-moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.
Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.
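For orientation, a sketch of the original (single) Grubbs-Beck low-outlier screen in its Bulletin 17B form is shown below; the 10% critical-value approximation is the standard published one, while the synthetic annual maxima and injected low flows are assumptions. The multiple test essentially sweeps this procedure over the smallest observations rather than testing only the single lowest one.

```python
# Original Grubbs-Beck low-outlier threshold on log10-transformed flows.
import numpy as np

def grubbs_beck_low_threshold(flows):
    """Flows below the returned threshold are potential low outliers."""
    y = np.log10(np.asarray(flows))
    n = len(y)
    # Bulletin 17B approximation to the one-sided 10% critical value
    kn = -0.9043 + 3.345 * np.sqrt(np.log10(n)) - 0.4046 * np.log10(n)
    return 10 ** (y.mean() - kn * y.std(ddof=1))

rng = np.random.default_rng(11)
ams = rng.lognormal(mean=5.0, sigma=0.6, size=50)   # synthetic annual maxima
ams[:2] = [3.0, 5.0]                                # inject two low flows
thr = grubbs_beck_low_threshold(ams)
print(f"threshold = {thr:.1f}, flagged = {np.sum(ams < thr)}")
```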
977 Time Series Simulation by Conditional Generative Adversarial Net
Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto
Abstract:
Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as dependent structures of different time series. It also has the capability to generate conditional predictive distributions consistent with training data distributions. We also provide an in-depth discussion on the rationale behind GAN and the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis including a backtesting to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis to calculate VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we have included an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
Keywords: Conditional Generative Adversarial Net, market and credit risk management, neural network, time series.
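A minimal CGAN sketch in PyTorch is given below, illustrating the conditional setup and the VaR use case rather than the authors' architecture; the two-regime data model, layer sizes and training schedule are all assumptions:

```python
# Toy conditional GAN: generate returns conditional on a volatility regime.
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_batch(n):
    """Synthetic 'market' data: calm regime (c=0) vs volatile regime (c=1)."""
    c = torch.randint(0, 2, (n, 1)).float()
    x = torch.randn(n, 1) * (0.5 + 2.5 * c)        # wider spread when c=1
    return x, c

G = nn.Sequential(nn.Linear(1 + 4, 32), nn.ReLU(), nn.Linear(32, 1))
D = nn.Sequential(nn.Linear(1 + 1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    x, c = real_batch(128)
    z = torch.randn(128, 4)
    fake = G(torch.cat([c, z], dim=1))
    # discriminator: real (x, c) against fake (G(c, z), c)
    d_loss = bce(D(torch.cat([x, c], 1)), torch.ones(128, 1)) + \
             bce(D(torch.cat([fake.detach(), c], 1)), torch.zeros(128, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # generator: try to fool the discriminator
    g_loss = bce(D(torch.cat([fake, c], 1)), torch.ones(128, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Conditional scenarios for a crude VaR estimate in the volatile regime:
z = torch.randn(10000, 4)
scen = G(torch.cat([torch.ones(10000, 1), z], 1)).detach().squeeze()
print("1% quantile (volatile regime) ~", scen.quantile(0.01).item())
```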
976 The Performance of Predictive Classification Using Empirical Bayes
Authors: N. Deetae, S. Sukparungsee, Y. Areepong, K. Jampachaisri
Abstract:
This research compares the percentage of correct classification of the Empirical Bayes (EB) method with that of the Classical method when data are constructed as near-normal, short-tailed or long-tailed symmetric, or short-tailed or long-tailed asymmetric. The study is performed using a conjugate prior for the normal distribution with known mean and unknown variance. The estimated hyper-parameters obtained from the EB method are substituted into the posterior predictive probability, which is then used to predict new observations. Data are generated, consisting of a training set and a test set, with sample sizes of 100, 200 and 500 for the binary classification. The results show that the EB method exhibited improved performance over the Classical method in all situations under study.
Keywords: Classification, Empirical Bayes, Posterior predictive probability.
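A simplified sketch of predictive classification under the stated conjugate model (known mean, unknown variance with an inverse-gamma prior) follows; setting the prior scale from the training data is a crude stand-in for the paper's EB hyper-parameter estimation:

```python
# Classify by the higher posterior predictive density of each class.
import numpy as np
from scipy import stats

def fit_class(x, mu, a0=2.0, b0=None):
    """Posterior IG(a_n, b_n) for the variance given known mean mu; the
    posterior predictive is a Student-t centred at mu."""
    b0 = b0 if b0 is not None else np.var(x)    # empirical prior scale
    a_n = a0 + len(x) / 2.0
    b_n = b0 + 0.5 * np.sum((x - mu) ** 2)
    return stats.t(df=2 * a_n, loc=mu, scale=np.sqrt(b_n / a_n))

rng = np.random.default_rng(4)
train0, train1 = rng.normal(0, 1, 100), rng.normal(3, 2, 100)
pred0, pred1 = fit_class(train0, mu=0.0), fit_class(train1, mu=3.0)

test = np.r_[rng.normal(0, 1, 200), rng.normal(3, 2, 200)]
labels = np.r_[np.zeros(200), np.ones(200)]
guess = (pred1.pdf(test) > pred0.pdf(test)).astype(float)
print("correct classification: %.1f%%" % (100 * (guess == labels).mean()))
```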
975 An Optimal Unsupervised Satellite image Segmentation Approach Based on Pearson System and k-Means Clustering Algorithm Initialization
Authors: Ahmed Rekik, Mourad Zribi, Ahmed Ben Hamida, Mohamed Benjelloun
Abstract:
This paper presents an optimal and unsupervised satellite image segmentation approach based on the Pearson system and k-means clustering algorithm initialization. The method can be considered original in that it utilises the k-means clustering algorithm for an optimal initialisation of the number of image classes on one hand, and exploits the Pearson system for an optimal affectation of statistical distributions to each considered class on the other hand. Satellite image exploitation requires the use of different approaches, especially those founded on the unsupervised statistical segmentation principle. Such approaches necessitate the definition of several parameters, like the number of image classes, the estimation of class variables and generalised mixture distributions. The use of statistical image attributes gave convincing and promising results, under the condition of having an optimal initialisation step with an appropriate affectation of statistical distributions. The Pearson system, associated with a k-means clustering algorithm and the Stochastic Expectation-Maximization (SEM) algorithm, can be adapted to this problem. For each image class, the Pearson system attributes one distribution type according to different parameters, especially the skewness β1 and the kurtosis β2. The adapted algorithms, the k-means clustering algorithm, the SEM algorithm and the Pearson system algorithm, are then applied to the satellite image segmentation problem. The efficiency of these combined algorithms was validated firstly with the Mean Quadratic Error (MQE) evaluation, and secondly by visual inspection through several comparisons of the unsupervised image segmentations.
Keywords: Unsupervised classification, Pearson system, Satellite image, Segmentation.
974 Creating Maintenance Cost Model for University Buildings
Authors: AbdulLateef A. Olanrewaju, Arazi Idrus, Mohd F. Khamidi
Abstract:
Maintenance costs incurred on buildings differ. The differences can arise from the type, function, age, building health index, size, form, height, location and complexity of the building, all of which contribute to the difficulty of developing a deterministic maintenance cost model. This paper reports preliminary findings on the creation of building maintenance cost distributions for universities in Malaysia. The study is triggered by the need to provide guidance on maintenance cost distributions for decision making. For this purpose, a questionnaire survey was conducted to investigate the distribution of maintenance costs in the universities. Altogether, responses were received from twenty universities, comprising both privately and publicly owned institutions. The research found that engineering services, roofing and finishes were the elements contributing the largest share of the maintenance costs. Furthermore, the study indicates the significance of the maintenance cost distribution as a decision-making tool for maintenance management.
Keywords: Performance matrix, university buildings, cost model, Malaysia
973 Empirical Analytical Modelling of Average Bond Stress and Anchorage of Tensile Bars in Reinforced Concrete
Authors: Maruful H. Mazumder, Raymond I. Gilbert
Abstract:
The design specifications for calculating development and lapped splice lengths of reinforcement in concrete are derived from a conventional empirical modelling approach that correlates experimental test data using a single mathematical equation. This paper describes part of a recently completed experimental research program to assess the effects of different structural parameters on the development length requirements of modern high strength steel reinforcing bars, including the case of lapped splices in large-scale reinforced concrete members. The normalized average bond stresses for the different variations of anchorage lengths are assessed according to the general form of a typical empirical analytical model of bond and anchorage. Improved analytical modelling equations are developed in the paper that better correlate the normalized bond strength parameters with the structural parameters of an empirical model of bond and anchorage.
Keywords: Bond stress, Development length, Lapped splice length, Reinforced concrete.
972 Dempster-Shafer Evidence Theory for Image Segmentation: Application in Cells Images
Authors: S. Ben Chaabane, M. Sayadi, F. Fnaiech, E. Brassart
Abstract:
In this paper we propose a new knowledge model using Dempster-Shafer's evidence theory for image segmentation and fusion. The proposed method is composed essentially of two steps. First, mass distributions in Dempster-Shafer theory are obtained from the membership degrees of each pixel over the three image components (R, G and B). Each membership degree is determined by applying Fuzzy C-Means (FCM) clustering to the gray levels of the three images. Second, the fusion process consists in defining three discernment frames, which are associated with the three images to be fused, and then combining them to form a new frame of discernment. The strategy used to define mass distributions in the combined framework is discussed in detail. The proposed fusion method is illustrated in the context of image segmentation. Experimental investigations and comparative studies with previous methods are carried out, showing the robustness and superiority of the proposed method in terms of image segmentation.
Keywords: Fuzzy C-Means, color image, data fusion, Dempster-Shafer's evidence theory
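The combination step can be sketched for a two-class frame of discernment as follows; the rule for turning memberships into masses is a simple illustration, not the paper's scheme:

```python
# Dempster's rule over focal elements [A, B, A∪B] for one pixel, three sources.
import numpy as np

def masses_from_membership(u_a, u_b):
    """Assign residual belief (1 - u_a - u_b) to ignorance A∪B."""
    m = np.array([u_a, u_b, 1.0 - u_a - u_b])
    return m / m.sum()

def dempster_combine(m1, m2):
    """Combine two mass functions; conflict is A-vs-B disagreement."""
    a = m1[0] * (m2[0] + m2[2]) + m1[2] * m2[0]
    b = m1[1] * (m2[1] + m2[2]) + m1[2] * m2[1]
    theta = m1[2] * m2[2]
    conflict = m1[0] * m2[1] + m1[1] * m2[0]
    return np.array([a, b, theta]) / (1.0 - conflict)

# Toy FCM memberships of one pixel in the R, G and B components
m_r = masses_from_membership(0.70, 0.20)
m_g = masses_from_membership(0.60, 0.25)
m_b = masses_from_membership(0.55, 0.35)
fused = dempster_combine(dempster_combine(m_r, m_g), m_b)
print("fused masses [A, B, A∪B]:", np.round(fused, 3))  # decide by max mass
```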