Search results for: elemental graph data model
12597 A Competitive Replica Placement Methodology for Ad Hoc Networks
Authors: Samee Ullah Khan, C. Ardil
Abstract:
In this paper, a mathematical model for data object replication in ad hoc networks is formulated. The derived model is general, flexible and adaptable to cater for various applications in ad hoc networks. We propose a game theoretical technique in which players (mobile hosts) continuously compete in a non-cooperative environment to improve data accessibility by replicating data objects. The technique incorporates the access frequency from mobile hosts to each data object, the status of the network connectivity, and communication costs. The proposed technique is extensively evaluated against four well-known ad hoc network replica allocation methods. The experimental results reveal that the proposed approach outperforms the four techniques in both execution time and solution quality.
Keywords: Data replication, auctions, static allocation.
12596 A Comparison of Grey Model and Fuzzy Predictive Model for Time Series
Authors: A. I. Dounis, P. Tiropanis, D. Tseles, G. Nikolaou, G. P. Syrcos
Abstract:
The prediction of meteorological parameters at a meteorological station is an interesting and open problem. A first-order linear dynamic model, GM(1,1), is the main component of grey system theory. The grey model requires only a few previous data points in order to make a real-time forecast. In this paper, we consider the daily average ambient temperature as a time series and apply the grey model GM(1,1) to local (short-term) prediction of the temperature. In the same case study we use a fuzzy predictive model for global prediction. We conclude the paper with a comparison between the local and global prediction schemes.
Keywords: Fuzzy predictive model, grey model, local and global prediction, meteorological forecasting, time series.
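The GM(1,1) forecast described above can be reproduced in a few lines. The following is a minimal sketch of the standard GM(1,1) equations in plain NumPy; the variable names and the example temperature values are illustrative, not taken from the paper.

import numpy as np

def gm11_forecast(x0, n_ahead=1):
    """Fit a GM(1,1) grey model to series x0 and forecast n_ahead steps."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                         # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])              # mean sequence of consecutive x1 values
    B = np.column_stack((-z1, np.ones(len(z1))))
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]    # least-squares estimate of [a, b]
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=0.0)          # inverse AGO: back to the original scale
    return x0_hat[-n_ahead:]

# Example: short-term forecast from a few daily average temperatures (deg C)
print(gm11_forecast([21.4, 22.1, 21.8, 22.6, 23.0], n_ahead=2))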
12595 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data
Authors: Wanhyun Cho, Soonja Kang, Sangkyoon Kim, Soonyoung Park
Abstract:
We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method that can be used to obtain the approximate posteriors for the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate the process of inducing a posterior distribution for the various parameters and the latent function by using variational Bayesian approximations and an importance sampling method, and we then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
Keywords: Multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence.
12594 Seamless Flow of Voluminous Data in High Speed Network without Congestion Using Feedback Mechanism
Abstract:
Continuously growing needs for Internet applications that transmit massive amounts of data have led to the emergence of high speed networks. Data transfer must take place without congestion, and hence feedback parameters must be transferred from the receiver end to the sender end so as to restrict the sending rate and avoid congestion. Even though TCP tries to avoid congestion by restricting the sending rate and window size, it never informs the sender of the capacity of data that can be sent, and it halves the window size at the time of congestion, resulting in decreased throughput, low utilization of the bandwidth, and maximum delay. In this paper, the XCP protocol is used and feedback parameters are calculated from the arrival rate, service rate, traffic rate, and queue size; the receiver then informs the sender about the throughput, the capacity of data to be sent, and the window size adjustment. This avoids any drastic decrease in window size and allows a better increase in the sending rate, so that data flow continuously without congestion. As a result, there is a maximum increase in throughput, high utilization of the bandwidth, and minimum delay. The results of the proposed work are presented as graphs of throughput, delay, and window size. Thus, in this paper the XCP protocol is illustrated and the various parameters are thoroughly analyzed and presented.
Keywords: Bandwidth-Delay Product, Congestion Control, Congestion Window, TCP/IP
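As a rough illustration of the feedback computation, the sketch below follows the standard XCP efficiency controller, in which aggregate feedback is proportional to the spare bandwidth and the persistent queue. The constants (alpha, beta) and variable names are the textbook XCP values, not figures taken from this paper, and the exact feedback rule used here may differ.

def xcp_aggregate_feedback(capacity_bps, input_rate_bps, persistent_queue_bytes,
                           avg_rtt_s, alpha=0.4, beta=0.226):
    """Aggregate feedback (bytes per control interval) of the XCP efficiency controller.

    capacity_bps           -- service rate of the outgoing link (bits/s)
    input_rate_bps         -- measured arrival/traffic rate over the last interval (bits/s)
    persistent_queue_bytes -- standing queue size that never drains (bytes)
    avg_rtt_s              -- average round-trip time (s)
    """
    spare_bandwidth_bps = capacity_bps - input_rate_bps
    feedback_bits = alpha * avg_rtt_s * spare_bandwidth_bps - beta * persistent_queue_bytes * 8
    return feedback_bits / 8   # bytes to distribute among packets in this control interval

# Example: a 100 Mb/s link carrying 80 Mb/s with a 20 kB standing queue and 50 ms RTT
print(xcp_aggregate_feedback(100e6, 80e6, 20_000, 0.05))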
12593 Method of Estimating Absolute Entropy of Municipal Solid Waste
Authors: Francis Chinweuba Eboh, Peter Ahlström, Tobias Richards
Abstract:
Entropy, as an outcome of the second law of thermodynamics, measures the level of irreversibility associated with any process. The identification and reduction of irreversibility in the energy conversion process helps to improve the efficiency of the system. The entropy of a pure substance, known as absolute entropy, is determined at an absolute reference point and is useful in the thermodynamic analysis of chemical reactions; however, municipal solid waste (MSW) is a structurally complicated material with unknown absolute entropy. In this work, an empirical model to calculate the absolute entropy of MSW based on the content of carbon, hydrogen, oxygen, nitrogen, sulphur, and chlorine on a dry ash free (daf) basis is presented. The proposed model was derived by statistical analysis from 117 relevant organic substances with known standard entropies which represent the main constituents in MSW. The substances were divided into waste fractions, namely food, wood/paper, textiles/rubber, and plastics waste, and the standard entropies of each waste fraction and of the complete mixture were calculated. The correlation derived for the standard entropy of the complete waste mixture is s°MSW = 0.0101 C + 0.0630 H + 0.0106 O + 0.0108 N + 0.0155 S + 0.0084 Cl (kJ/(kg K)), and it can be used for estimating the absolute entropy of MSW from the elemental composition of the fuel within the ranges 10.3% ≤ C ≤ 95.1%, 0.0% ≤ H ≤ 14.3%, 0.0% ≤ O ≤ 71.1%, 0.0% ≤ N ≤ 66.7%, 0.0% ≤ S ≤ 42.1%, 0.0% ≤ Cl ≤ 89.7%. The model is also applicable to the efficient modelling of a combustion system in a waste-to-energy plant.
Keywords: Absolute entropy, irreversibility, municipal solid waste, waste-to-energy.
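The reported correlation is straightforward to apply. A minimal sketch follows; the coefficients are those quoted in the abstract, while the composition values in the example are invented for illustration.

def absolute_entropy_msw(C, H, O, N, S, Cl):
    """Absolute entropy of MSW in kJ/(kg K), from elemental composition in wt% (dry ash free)."""
    return (0.0101 * C + 0.0630 * H + 0.0106 * O +
            0.0108 * N + 0.0155 * S + 0.0084 * Cl)

# Hypothetical daf composition (wt%) of a mixed waste sample
print(absolute_entropy_msw(C=48.0, H=6.5, O=42.0, N=1.2, S=0.3, Cl=0.5))  # ~1.36 kJ/(kg K)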
12592 Multiphase Coexistence for Aqueous System with Hydrophilic Agent
Authors: G. B. Hong, H. W. Chen
Abstract:
Liquid-liquid equilibrium (LLE) data are measured for the ternary mixtures of water + 1-butanol + butyl acetate and quaternary mixtures of water + 1-butanol + butyl acetate + glycerol at atmospheric pressure and 313.15 K. In addition, isothermal vapor-liquid-liquid equilibrium (VLLE) data are determined experimentally at 333.15 K. The region of heterogeneity is found to increase as the hydrophilic agent (glycerol) is introduced into the aqueous mixtures. The experimental data are correlated with the NRTL model. The predicted results from the solution model, with the model parameters determined from the constituent binaries, are also compared with the experimental values.
Keywords: LLE, VLLE, hydrophilic agent, NRTL.
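For reference, the binary form of the NRTL activity-coefficient model used for this kind of correlation can be written compactly as below. The interaction parameters in the example are placeholders, not the values regressed in this work.

import numpy as np

def nrtl_binary(x1, tau12, tau21, alpha=0.3):
    """Activity coefficients (gamma1, gamma2) of a binary mixture from the NRTL model."""
    x2 = 1.0 - x1
    G12 = np.exp(-alpha * tau12)
    G21 = np.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return np.exp(ln_g1), np.exp(ln_g2)

# Placeholder parameters for a partially miscible pair such as water (1) + 1-butanol (2)
print(nrtl_binary(x1=0.6, tau12=1.5, tau21=2.0))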
12591 Statistical Modeling of Constituents in Ash Evolved From Pulverized Coal Combustion
Authors: Esam Jassim
Abstract:
Industries using conventional fossil fuels have an interest in better understanding the mechanism of particulate formation during combustion, since it is responsible for the emission of undesired inorganic elements that directly impact the atmospheric pollution level. Fine and ultrafine particulates have a tendency to escape the flue gas cleaning devices to the atmosphere. They also preferentially collect on surfaces in power systems, increasing the tendency for corrosion, decreasing the heat transfer of the thermal unit, and severely impacting human health. These adverse effects are particularly evident in regions of the world where coal is the dominant source of energy. This study highlights the behavior of calcium transformation as mineral grains versus organically associated inorganic components during pulverized coal combustion. The influence of the existing type of calcium on the coarse, fine, and ultrafine mode formation mechanisms is also presented. The impact of two sub-bituminous coals on particle size and calcium composition evolution during combustion is assessed. Three blends, named Blends 1, 2, and 3, are selected according to the ratio of coal A to coal B by weight; the calcium percentage in the original coal increases from Blend 1 to Blend 3. A mathematical model and a new approach to describing constituent distribution are proposed. The experimental calcium distribution in ash is modeled using the Poisson distribution, and a novel parameter, called the elemental index λ, is introduced as a measure of element distribution. Results show that calcium originally present in the coal as mineral grains has an index of 17, whereas organically associated calcium transformed to fly ash is best described by an elemental index of 7. As an alkaline-earth element, calcium is considered the fundamental element responsible for boiler deficiency, since it is the major player in the mechanism of the ash slagging process. The mechanisms of particle size distribution and the mineral species of the ash particles are presented using CCSEM and size-segregated ash characteristics. Conclusions are drawn from the analysis of pulverized coal ash generated from a utility-scale boiler.
Keywords: Calcium transformation, Coal Combustion, Inorganic Element, Poisson distribution.
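To illustrate the fitting step, the following is a minimal sketch of estimating an elemental index as the Poisson parameter from counts of calcium occurrences in ash particles and checking the fit; the count data here are invented for illustration and are not the paper's measurements.

import numpy as np
from scipy import stats

# Hypothetical counts of Ca-bearing occurrences per sampled ash particle
counts = np.array([14, 19, 16, 17, 18, 15, 20, 16, 17, 18])

lam = counts.mean()                      # MLE of the Poisson parameter = elemental index
print(f"elemental index (lambda) = {lam:.1f}")

# Compare observed and Poisson-predicted frequencies for a quick goodness-of-fit check
values, observed = np.unique(counts, return_counts=True)
expected = stats.poisson.pmf(values, lam) * counts.size
print(np.column_stack((values, observed, expected.round(2))))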
12590 The Effect of the Hourly Compensation on the Unemployment Rate: Comparative Analysis of United States, Canada and the United Kingdom Using Panel Data Regression Analysis
Authors: Ashiquer Rahman, Hares Mohammad, Ummey Salma
Abstract:
A country's hourly compensation and unemployment rate are two of its most crucial indicators. They are not merely statistics: they have profound effects on individuals, families, the country, and the economy, and they are inversely related to one another. Increased hourly compensation in the manufacturing sector can have a favorable effect on job-changing behavior. Moreover, the relationship between hourly compensation and unemployment is complex and influenced by broader economic factors. In this paper, in order to determine the effect of hourly compensation on the unemployment rate, we use panel data regression models to evaluate the expected link between the two. We estimate the fixed effects model (FEM), evaluate the error components model (ECM), and determine which of the two is better by pooling all 60 observations. We then analyze and review the data by comparing the three countries (United States, Canada, and the United Kingdom) using panel data regression models. Finally, we provide the results, analysis, and a summary of this research on how hourly compensation affects the unemployment rate. Additionally, this paper offers a relevant and useful guideline for governments and the academic community on using an econometric and social approach to the relationship between hourly compensation and the unemployment rate in order to address the problem.
Keywords: Hourly compensation, unemployment rate, panel data regression models, dummy variables, random effects model, fixed effects model, the linear regression model.
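A minimal sketch of the fixed effects (within) estimator used in this kind of comparison, written in plain NumPy so the demeaning step is explicit; the data below are random placeholders, not the 60 observations analysed in the paper.

import numpy as np

rng = np.random.default_rng(0)
countries, years = 3, 20                       # e.g. US, Canada, UK over 20 years
n = countries * years
country = np.repeat(np.arange(countries), years)

hourly_comp = rng.normal(25, 5, n)                    # placeholder regressor
unemp = 8 - 0.15 * hourly_comp + rng.normal(0, 1, n)  # placeholder response

def within_transform(v, groups):
    """Subtract each group's mean (the fixed effects 'within' transformation)."""
    out = v.astype(float).copy()
    for g in np.unique(groups):
        out[groups == g] -= v[groups == g].mean()
    return out

y_w = within_transform(unemp, country)
x_w = within_transform(hourly_comp, country)
beta_fe = np.linalg.lstsq(x_w[:, None], y_w, rcond=None)[0][0]
print(f"fixed effects slope: {beta_fe:.3f}")   # should be near the true -0.15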
12589 Automated Process Quality Monitoring with Prediction of Fault Condition Using Measurement Data
Authors: Hyun-Woo Cho
Abstract:
Detection of incipient abnormal events is important to improve the safety and reliability of machine operations and to reduce losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. The construction of prediction models for predicting faulty conditions is essential in making decisions on when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. This approach utilizes genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods using real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme in calibration models, the prediction performance can be improved by excluding non-informative variables from the model building steps.
Keywords: Prediction, operation monitoring, on-line data, nonlinear statistical methods, empirical model.
12588 Case-Based Reasoning: A Hybrid Classification Model Improved with an Expert's Knowledge for High-Dimensional Problems
Authors: Bruno Trstenjak, Dzenana Donko
Abstract:
Data mining and classification of objects is a process of data analysis, using various machine learning techniques, that is used today in many fields of research. This paper presents a concept of a hybrid classification model improved with expert knowledge. The hybrid model integrates several machine learning techniques (information gain, k-means, and case-based reasoning) together with the expert's knowledge in a single algorithm. The knowledge of experts is used to determine the importance of features. The paper presents the model algorithm and the results of a case study in which the emphasis was put on achieving the maximum classification accuracy without reducing the number of features.
Keywords: Case based reasoning, classification, expert's knowledge, hybrid model.
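One way to read the retrieval step is as a distance-weighted nearest-neighbour search in which expert-assigned feature importances scale each dimension. A minimal sketch under that interpretation follows; the feature weights and cases are invented, and this is not the authors' exact algorithm.

import numpy as np

def retrieve(case_base, labels, query, weights, k=3):
    """Return the majority label of the k cases closest to the query
    under a feature-weighted Euclidean distance."""
    diffs = case_base - query
    dists = np.sqrt(((diffs ** 2) * weights).sum(axis=1))
    nearest = np.argsort(dists)[:k]
    values, counts = np.unique(labels[nearest], return_counts=True)
    return values[np.argmax(counts)]

# Toy case base: 4 features; the "expert" says features 0 and 2 matter most
cases = np.array([[1.0, 5.0, 0.2, 7.0],
                  [0.9, 2.0, 0.3, 3.0],
                  [3.0, 5.1, 2.2, 7.2],
                  [3.1, 1.9, 2.1, 2.8]])
labels = np.array(["A", "A", "B", "B"])
expert_weights = np.array([0.4, 0.1, 0.4, 0.1])
print(retrieve(cases, labels, np.array([1.0, 4.0, 0.25, 6.0]), expert_weights, k=3))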
12587 The Importance of 3D Mesh Generation for Large Eddy Simulation of Gas-Solid Turbulent Flows in a Fluidized Bed
Authors: G. González-Silva, E. M. Matos, W. P. Martignoni, M. Mori
Abstract:
The objective of this work is to show a procedure for mesh generation in a fluidized bed using large eddy simulation (LES) of a filtered two-fluid model. The experimental data were obtained by [1] in a laboratory fluidized bed. Results show that it is possible to use a mesh with fewer cells than required by a RANS turbulence model with the kinetic theory of granular flow (KTGF). Also, the numerical results are validated by the experimental data near the wall of the bed, which cannot be predicted by the RANS model.
Keywords: LES, Mesh, Gas-Solid, Fluidized bed
12586 Microbial Leaching Process to Recover Valuable Metals from Spent Petroleum Catalyst Using Iron Oxidizing Bacteria
Authors: Debabrata Pradhan, Dong J. Kim, Jong G. Ahn, Seoung W. Lee
Abstract:
Spent petroleum catalyst from the Korean petrochemical industry contains trace amounts of metals such as Ni, V, and Mo. Therefore, an attempt was made to recover these trace metals using a bioleaching process. Different leaching parameters, such as the Fe(II) concentration, pulp density, pH, temperature, and particle size of the spent catalyst, were studied to evaluate their effects on the leaching efficiency. All three metal ions (Ni, V, and Mo) followed dual kinetics, i.e., an initial faster rate followed by a slower one. The leaching efficiencies of Ni and V were higher than that of Mo. The leaching process followed a diffusion-controlled model, and the product layer was observed to be impervious due to the formation of ammonium jarosite, (NH4)Fe3(SO4)2(OH)6. In addition, the lower leaching efficiency of Mo was attributed to a hydrophobic coating of elemental sulfur over the Mo matrix in the spent catalyst.
Keywords: Bioleaching, diffusion control, shrinking core, spent petroleum catalyst.
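The diffusion-controlled interpretation can be checked against conversion data using the product-layer-diffusion form of the shrinking core model, 1 - 3(1 - X)^(2/3) + 2(1 - X) = kd*t. A minimal sketch follows; the conversion data are illustrative, not the measured values from this work.

import numpy as np

t = np.array([1.0, 2.0, 4.0, 8.0, 12.0])        # leaching time, h (illustrative)
X = np.array([0.18, 0.30, 0.46, 0.66, 0.78])    # fractional metal conversion (illustrative)

g = 1 - 3 * (1 - X) ** (2 / 3) + 2 * (1 - X)    # product-layer diffusion function
kd = np.linalg.lstsq(t[:, None], g, rcond=None)[0][0]   # slope of g(X) vs t through the origin

r2 = 1 - np.sum((g - kd * t) ** 2) / np.sum((g - g.mean()) ** 2)
print(f"apparent rate constant kd = {kd:.4f} 1/h, R^2 = {r2:.3f}")
# A near-linear g(X) vs t plot (high R^2) supports diffusion through the product layer
# as the rate-controlling step.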
12585 Behavioral Modeling Accuracy for RF Power Amplifier with Memory Effects
Authors: Chokri Jebali, Noureddine Boulejfen, Ali Gharsallah, Fadhel M. Ghannouchi
Abstract:
In this paper, a system-level behavioural model for RF power amplifiers that exhibit memory effects, based on a multi-branch system, is proposed. When higher order terms are included, the memory polynomial model (MPM) exhibits numerical instabilities. An orthogonal memory polynomial model (OMPM) is introduced to alleviate the numerical instability problem associated with the MPM. A data scaling and centring algorithm was applied to improve the power amplifier modeling accuracy. Simulation results show that the numerical instability can be greatly reduced and the model precision improved with the nonlinear model.
Keywords: Power amplifier, orthogonal model, polynomial model, memory effects.
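The memory polynomial model referred to above expresses the PA output as y(n) = sum_k sum_q a_kq * x(n-q) * |x(n-q)|^(k-1), fitted by least squares; when higher-order basis columns become nearly collinear the problem is ill-conditioned, which is the instability the OMPM targets. The following is a minimal sketch of the plain MPM identification step on synthetic data, not the measurements used in the paper.

import numpy as np

def mpm_regressors(x, K, Q):
    """Memory polynomial basis: columns x(n-q)*|x(n-q)|**(k-1), k=1..K, q=0..Q."""
    n = len(x)
    cols = []
    for q in range(Q + 1):
        xq = np.concatenate((np.zeros(q, dtype=complex), x[:n - q]))
        for k in range(1, K + 1):
            cols.append(xq * np.abs(xq) ** (k - 1))
    return np.column_stack(cols)

rng = np.random.default_rng(1)
x = rng.normal(size=2000) + 1j * rng.normal(size=2000)        # complex baseband input
true_coeffs = np.array([1.0, 0.08 - 0.05j, 0.2, 0.01j, -0.03, 0.0])
y = mpm_regressors(x, K=3, Q=1) @ true_coeffs                  # synthetic PA output

Phi = mpm_regressors(x, K=3, Q=1)
a_hat, *_ = np.linalg.lstsq(Phi, y, rcond=None)                # least-squares coefficients
print(np.round(a_hat, 3))
print("condition number of basis:", round(np.linalg.cond(Phi), 1))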
12584 A Model Predictive Control and Time Series Forecasting Framework for Supply Chain Management
Authors: Philip Doganis, Eleni Aggelogiannaki, Haralambos Sarimveis
Abstract:
Model predictive control has previously been applied to supply chain problems with promising results; however, hitherto proposed systems possessed no information on future demand. A forecasting methodology will surely promote the efficiency of control actions by providing insight into the future. A complete supply chain management framework that is based on model predictive control (MPC) and time series forecasting is presented in this paper. The proposed framework is tested on industrial data in order to assess the efficiency of the method and the impact of forecast accuracy on the overall control performance of the supply chain. To this end, forecasting methodologies with different characteristics are implemented on test data to generate forecasts that serve as input to the model predictive control module.
Keywords: Forecasting, model predictive control, production planning.
12583 MMU Simulation in Hardware Simulator Based on State Transition Models
Authors: Zhang Xiuping, Yang Guowu, Zheng Desheng
Abstract:
An embedded hardware simulator is a valuable computer-aided tool for embedded application development. This paper focuses on the ARM926EJ-S MMU, builds state transition models, and formally verifies critical properties of the models. The state transition models include an instruction loading model, a data reading model, and a data writing model. The properties of the models are described in the CTL specification language and are verified in VIS. The results obtained in VIS demonstrate that the critical properties of the MMU are satisfied in the state transition models. The correct models can be used to implement the MMU component in our simulator. At the end of this paper, experimental results show that the MMU can successfully accomplish memory access requests from the CPU.
Keywords: MMU, state transition, model, simulation.
12582 Removal of Elemental Mercury from Dry Methane Gas with Manganese Oxides
Authors: Junya Takenami, Md. Azhar Uddin, Eiji Sasaoka, Yasushi Shioya, Tsuneyoshi Takase
Abstract:
In this study, we investigate the efficiency of manganese oxides for the removal of mercury from natural gas. Fundamental studies on mercury removal with manganese oxide sorbents were carried out in a laboratory-scale fixed bed reactor at 30 °C with a mixture of methane (20%) and nitrogen laden with 4.8 ppb of elemental mercury. Manganese oxides with varying surface area and crystalline phase were prepared by a conventional precipitation method. The effects of surface area, crystallinity, and other metal oxides on the mercury removal efficiency were investigated. The effect of Ag impregnation on the mercury removal efficiency was also investigated, and Ag supported on metal oxides such as titania and zirconia was used as a reference material for comparison. The characteristics of the mercury removal reaction with manganese oxide were investigated using a temperature programmed desorption (TPD) technique. Manganese oxides showed very high Hg removal activity (about 73-93% Hg removal) on first use. The surface area of the manganese oxide samples decreased after heat treatment, which resulted in a complete loss of Hg removal ability on repeated use after Hg desorption in the case of amorphous MnO2, and a 75% loss of the initial Hg removal activity for the crystalline MnO2. The mercury desorption efficiency of crystalline MnO2 was very low (37%) on first use and high (98%) after second use. Residual potassium content in the MnO2 may have some effect on the thermal stability of the adsorbed Hg species. Desorption of Hg from manganese oxides occurs at much higher temperatures (with a peak at 400 °C) than from Ag/TiO2 or Ag/ZrO2. Mercury may be captured on manganese oxides in the form of mercury manganese oxide.
Keywords: Mercury removal, metal and metal oxide sorbents, methane, natural gas.
12581 A Hybrid DEA Model for the Measurement of the Environmental Performance
Authors: A. Hadi-Vencheh, N. Shayesteh Moghadam
Abstract:
Data envelopment analysis (DEA) has gained great popularity in environmental performance measurement because it can provide a synthetic standardized environmental performance index when pollutants are suitably incorporated into the traditional DEA framework. Since some of the environmental performance indicators cannot be controlled by company managers, it is necessary to develop the model in a way that it can be applied when discretionary and/or non-discretionary factors are involved. In this paper, we present a semi-radial DEA approach to measuring environmental performance which accounts for non-discretionary factors. The model is then applied to a real case.
Keywords: Environmental performance, efficiency, non-discretionary variables, data envelopment analysis.
12580 Application of Scanning Electron Microscopy and X-Ray Evaluation of the Main Digestion Methods for Determination of Macroelements in Plant Tissue
Authors: Krasimir I. Ivanov, Penka S. Zapryanova, Stefan V. Krustev, Violina R. Angelova
Abstract:
Three commonly used digestion methods (dry ashing, acid digestion, and microwave digestion) in different variants were compared for the digestion of tobacco leaves. Three main macroelements (K, Ca, and Mg) were analysed using an AAS spectrometer Spectra AA 220 (Varian, Australia). The accuracy and precision of the measurements were evaluated using the Polish reference material CTR-VTL-2 (Virginia tobacco leaves). To elucidate the problems with elemental recovery, X-ray and SEM-EDS analyses of all residues after digestion were performed. The X-ray investigation showed the formation of KClO4 when HClO4 was used as part of the acid mixture. The use of HF in the Ca and Mg determination led to the formation of CaF2 and MgF2. The results were confirmed by energy dispersive X-ray microanalysis. The SPSS program for Windows was used for statistical data processing.
Keywords: Digestion methods, determination of macroelements, plant tissue.
12579 Forecasting Materials Demand from Multi-Source Ordering
Authors: Hui Hsin Huang
Abstract:
Downstream manufacturers order their materials from different upstream suppliers to maintain a certain level of demand. This paper proposes a bivariate model to portray this phenomenon of material demand. We use empirical data to estimate the parameters of the model and evaluate the RMSD of the model calibration. The results show that the model has a better fit.
Keywords: Farlie-Gumbel-Morgenstern family of bivariate distributions, multi-source ordering, materials demand quantity, recency, ordering time.
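The Farlie-Gumbel-Morgenstern (FGM) family mentioned in the keywords couples two marginals through C(u, v) = uv[1 + theta(1 - u)(1 - v)] with |theta| <= 1. The following is a minimal sketch showing how such a bivariate demand model can be simulated; the exponential marginals and the value of theta are chosen arbitrarily for illustration and are not the estimates from the paper.

import numpy as np

def fgm_sample(n, theta, rng):
    """Sample (u, v) from the FGM copula via the conditional-inverse method."""
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    a = 1 + theta * (1 - 2 * u)                      # coefficient of the conditional CDF
    b = np.sqrt(a ** 2 - 4 * (a - 1) * w)
    v = 2 * w / (a + b)                              # inverse of C(v|u), numerically stable form
    return u, v

rng = np.random.default_rng(2)
u, v = fgm_sample(50_000, theta=0.6, rng=rng)
# Map to marginals, e.g. ordering quantity and inter-order time (recency)
quantity = -np.log(1 - u) * 120.0                    # exponential, mean 120 units
recency = -np.log(1 - v) * 14.0                      # exponential, mean 14 days
print("sample correlation:", round(np.corrcoef(quantity, recency)[0, 1], 3))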
12578 Transformation of the Business Model in an Occupational Health Care Company Embedded in an Emerging Personal Data Ecosystem: A Case Study in Finland
Authors: Tero Huhtala, Minna Pikkarainen, Saila Saraniemi
Abstract:
Information technology has long been used as an enabler of exchange for goods and services. Services are evolving from generic to personalized, and the reverse use of customer data has been discussed in both academia and industry for the past few years. This article presents the results of an empirical case study in the area of preventive health care services. The primary data were gathered in workshops, in which future personal data-based services were conceptualized by analyzing future scenarios from a business perspective. The aim of this study is to understand business model transformation in emerging personal data ecosystems. The work was done as a case study in the context of occupational healthcare. The results have implications for theory and practice, indicating that adopting personal data management principles requires transformation of the business model, which, if successfully managed, may provide access to more resources, the potential to offer better value, and additional customer channels. These advantages correlate with the broadening of the business ecosystem. Expanding the scope of this study to include more actors would improve the validity of the research. The results draw from existing literature and are based on findings from a case study and the economic properties of the healthcare industry in Finland.
Keywords: Ecosystem, business model, personal data, preventive healthcare.
12577 A Study of Mode Choice Model Improvement Considering Age Grouping
Authors: Young-Hyun Seo, Hyunwoo Park, Dong-Kyu Kim, Seung-Young Kho
Abstract:
The purpose of this study is to provide an improved mode choice model with parameters that account for age groups, including the prime-aged and the elderly. In this study, data from the 2010 Household Travel Survey were used, and improper samples were removed through the analysis. The chosen alternative, date of birth, mode, origin code, destination code, departure time, and arrival time are taken from the Household Travel Survey. By preprocessing the data, travel time, travel cost, mode, and the ratios of people aged 45 to 55, 55 to 65, and over 65 were calculated. After this manipulation, the mode choice model was constructed in LIMDEP by maximum likelihood estimation. A significance test was conducted for nine parameters (three age groups for each of three modes). The test was then conducted again for the mode choice model with the significant parameters, the travel cost variable, and the travel time variable. As a result of the model estimation, as age increases, the preference for the car decreases and the preference for the bus increases. This study is meaningful in that individual and household characteristics are applied to an aggregate model.
Keywords: Age grouping, aging, mode choice model, multinomial logit model.
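For reference, the core of a multinomial logit mode choice model of the kind estimated in LIMDEP is the softmax of linear utilities in travel time, travel cost, and the age-group shares. The sketch below only evaluates choice probabilities for illustrative coefficients; it does not re-estimate the paper's model, and all numbers are placeholders.

import numpy as np

modes = ["car", "bus", "metro"]
# Illustrative coefficients: [ASC, travel_time (min), travel_cost (1000 KRW), share_65plus]
beta = np.array([[0.0, -0.045, -0.30, -0.8],    # car
                 [-0.4, -0.035, -0.10, 0.6],    # bus
                 [-0.2, -0.030, -0.12, 0.1]])   # metro

def choice_probabilities(attrs):
    """attrs: one row [1, time, cost, share_65plus] per mode; returns MNL probabilities."""
    utilities = (beta * attrs).sum(axis=1)
    expu = np.exp(utilities - utilities.max())   # subtract the max for numerical stability
    return expu / expu.sum()

trip = np.array([[1.0, 25.0, 3.5, 0.2],    # car: 25 min, 3500 KRW
                 [1.0, 40.0, 1.2, 0.2],    # bus
                 [1.0, 35.0, 1.3, 0.2]])   # metro
print(dict(zip(modes, np.round(choice_probabilities(trip), 3))))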
12576 Robust Regression and its Application in Financial Data Analysis
Authors: Mansoor Momeni, Mahmoud Dehghan Nayeri, Ali Faal Ghayoumi, Hoda Ghorbani
Abstract:
This research aims to describe the application of robust regression and its advantages over the least squares regression method in analyzing financial data. To do this, the relationship between earnings per share, book value of equity per share, and share price (the price model) and between earnings per share, annual change in earnings per share, and stock return (the return model) is discussed using both robust and least squares regressions, and the outcomes are compared. Comparing the results from the robust regression and the least squares regression shows that the former can provide a better and more realistic analysis by eliminating or reducing the contribution of outliers and influential data. Therefore, robust regression is recommended for obtaining more precise results in financial data analysis.
Keywords: Financial data analysis, Influential data, Outliers, Robust regression.
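A minimal sketch of the comparison the paper describes, using statsmodels' iteratively reweighted least squares (RLM with a Huber norm) against OLS on data with a planted outlier; the numbers are synthetic, not the financial data analysed here.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
x = rng.uniform(1, 10, 40)                  # e.g. earnings per share
y = 2.0 + 1.5 * x + rng.normal(0, 0.5, 40)  # e.g. share price, true slope 1.5
y[0] += 30                                  # one influential outlier

X = sm.add_constant(x)
ols_fit = sm.OLS(y, X).fit()
rlm_fit = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()

print("OLS slope:   ", round(ols_fit.params[1], 3))   # pulled away from 1.5 by the outlier
print("Robust slope:", round(rlm_fit.params[1], 3))   # stays close to 1.5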
12575 The Establishment of RELAP5/SNAP Model for Kuosheng Nuclear Power Plant
Authors: C. Shih, J. R. Wang, H. C. Chang, S. W. Chen, S. C. Chiang, T. Y. Yu
Abstract:
After the measurement uncertainty recapture (MUR) power uprate, the power of the Kuosheng nuclear power plant (NPP) was uprated from 2894 MWt to 2943 MWt. For the power uprate, several codes (e.g., TRACE and RELAP5) were applied to assess the safety of Kuosheng NPP. Hence, the main work of this research is to establish a RELAP5/MOD3.3 model of Kuosheng NPP with the SNAP interface. The RELAP5/SNAP model was established with reference to the FSAR, training documents, and a TRACE model which had been developed and verified previously. After completing the model establishment, the startup test scenarios were applied to the RELAP5/SNAP model. By comparing with the startup test data and the TRACE analysis results, the applicability of the RELAP5/SNAP model is assessed.
Keywords: RELAP5, TRACE, SNAP, BWR.
12574 A Comparison of the Sum of Squares in Linear and Partial Linear Regression Models
Authors: Dursun Aydın
Abstract:
In this paper, the linear regression model is estimated by the ordinary least squares method and the partially linear regression model is estimated by the penalized least squares method using a smoothing spline. The differences and similarities in the sums of squares of the linear regression and partial linear regression (semi-parametric regression) models are then investigated. It is shown that the sums of squares in linear regression reduce to the corresponding sums of squares in partial linear regression models. Furthermore, we indicate that various sums of squares in linear regression correspond to different deviance statements in partial linear regression. In addition, the coefficient of determination derived in the linear regression model is easily generalized to the coefficient of determination of the partial linear regression model. To this end, two different applications are made: a simulated and a real data set are considered to support the claims made here, so the study is supported with both a simulation and a real data example.
Keywords: Partial linear regression model, linear regression model, residuals, deviance, smoothing spline.
12573 Geopotential Models Evaluation in Algeria Using Stochastic Method, GPS/Leveling and Topographic Data
Authors: M. A. Meslem
Abstract:
For precise geoid determination, we use a reference field to subtract the long and medium wavelengths of the gravity field from the observation data when using the remove-compute-restore technique. Therefore, a comparison study between candidate models should be made in order to select the optimal reference gravity field. In this context, two recent global geopotential models have been selected to perform this comparison study over Northern Algeria: the Earth Gravitational Model (EGM2008) and the Global Gravity Model (GECO), the latter conceived as a combination of the former with the anomalous potential derived from a GOCE satellite-only global model. Free air gravity anomalies in the area under study have been used to compute residual data using both gravity field models, and a Digital Terrain Model (DTM) has been used to subtract the residual terrain effect from the gravity observations. The residual data were used to generate local empirical covariance functions, which were fitted to a closed form in order to compare their statistical behaviour in the two cases. Finally, height anomalies were computed from both geopotential models and compared to a set of GPS-levelled points on benchmarks using least squares adjustment. The results described in detail in this paper point to a slight overall advantage of the GECO global model, based on the comparison of error degree variances and the ground-truth evaluation.
Keywords: Quasigeoid, gravity anomalies, covariance, GGM.
12572 A Martingale Residual Diagnostic for Logistic Regression Model
Authors: Entisar A. Elgmati
Abstract:
Martingale residual diagnostics for assessing the fit of a logistic regression model to recurrent event data are studied. One way of assessing the fit is by plotting the empirical standard deviation of the standardized martingale residual processes. Here we use another diagnostic plot based on the martingale residual covariance. We investigated the performance of the plot under several types of model misspecification, and the method clearly picked up the wrong model. We also present a test statistic that supplements the inspection of the two diagnostics. The power of the test statistic agrees with what is seen in the plots of the estimated martingale covariance.
Keywords: Covariance, logistic model, misspecification, recurrent events.
12571 Intrusion Detection based on Distance Combination
Authors: Joffroy Beauquier, Yongjie Hu
Abstract:
The intrusion detection problem has been studied frequently, but intrusion detection methods are often based on a single point of view, which limits the results. In this paper, we introduce a new intrusion detection model based on the combination of different current methods. First, we use a notion of distance to unify the different methods. Second, we combine these methods using the Pearson correlation coefficients, which measure the relationship between two methods, and we obtain a combined distance. If the combined distance is greater than a predetermined threshold, an intrusion is detected. We have implemented and tested the combination model with two different public data sets: the masquerade detection data set collected by Schonlau et al., and the data set of program behaviors from the University of New Mexico. The results of the experiments show that the combination model has better performance.
Keywords: Intrusion detection, combination, distance, Pearson correlation coefficients.
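A minimal sketch of the combination idea: several detection methods each produce a distance score, and the Pearson correlations between pairs of methods are used to weight them into a single combined distance. The scores and the weighting scheme are illustrative; the paper's exact combination rule may differ.

import numpy as np

# Rows: distance scores produced by three detection methods on the same sessions
scores = np.array([[0.10, 0.20, 0.15, 0.90, 0.30],
                   [0.20, 0.25, 0.10, 0.80, 0.35],
                   [0.05, 0.60, 0.20, 0.70, 0.10]])

# Normalize each method's scores to [0, 1] so they are comparable
mins, maxs = scores.min(axis=1, keepdims=True), scores.max(axis=1, keepdims=True)
norm = (scores - mins) / (maxs - mins)

# Weight each method by its mean Pearson correlation with the other methods
corr = np.corrcoef(norm)
weights = (corr.sum(axis=1) - 1) / (corr.shape[0] - 1)   # exclude the self-correlation
weights = weights / weights.sum()

combined = weights @ norm                                # combined distance per session
threshold = 0.6
print("combined distances:", np.round(combined, 2))
print("intrusions flagged:", np.where(combined > threshold)[0])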
12570 Extreme Temperature Forecast in Mbonge, Cameroon through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution
Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph
Abstract:
In this paper, temperature extremes are forecast by employing the block maxima method of the Generalized Extreme Value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (C.D.C.). Considering two sets of data (raw and simulated) and two models of the GEV distribution (stationary and non-stationary), a return level analysis is carried out. In the stationary model, the return levels are constant over time for the raw data, while for the simulated data they show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw data and the simulated data show an increasing trend, again with an upper bound. This clearly shows that even though temperatures in the tropics show signs of increasing in the future, there is a maximum temperature at which there is no exceedance. The results of this paper are vital for agricultural and environmental research.
Keywords: Return level, Generalized Extreme Value (GEV), meteorology, forecasting.
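The return level computation follows directly from the fitted GEV: for a return period of T years, the level is the (1 - 1/T) quantile of the fitted distribution of annual maxima. The following is a minimal sketch with scipy; the temperature maxima are simulated placeholders, not the C.D.C. records.

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)
# Placeholder annual maximum temperatures (deg C); block maxima of the daily series
annual_max = 33 + genextreme.rvs(c=0.2, scale=1.2, size=40, random_state=rng)

c, loc, scale = genextreme.fit(annual_max)     # stationary GEV fit (scipy uses c = -shape)
for T in (5, 10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} deg C")

# A positive c (negative classical shape parameter) gives a bounded upper tail,
# i.e. a maximum temperature that is never exceeded, as discussed above.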
12569 The Establishment and Application of TRACE/FRAPTRAN Model for Kuosheng Nuclear Power Plant
Authors: S. W. Chen, W. K. Lin, J. R. Wang, C. Shih, H. T. Lin, H. C. Chang, W. Y. Li
Abstract:
The Kuosheng nuclear power plant (NPP) is a BWR/6 type NPP located on the northern coast of Taiwan. First, a Kuosheng NPP TRACE model was developed in this research. In order to assess the system response of the Kuosheng NPP TRACE model, startup test data were used to evaluate it. Second, an overpressurization transient analysis of the Kuosheng NPP TRACE model was performed. In addition, in order to confirm the mechanical properties and integrity of the fuel rods, a FRAPTRAN analysis was also performed in this study.
Keywords: TRACE, Safety analysis, BWR/6, FRAPTRAN.
12568 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption
Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Moses Noel Dogonyaro
Abstract:
This paper describes the problem of building secure computational services for encrypted information in cloud computing without decrypting the encrypted data; it thereby addresses the need for a computational encryption model that can enhance the security of big data with respect to the privacy, confidentiality, and availability of the users. The cryptographic model applied for the computational processing of the encrypted data is the fully homomorphic encryption scheme. We contribute a theoretical presentation of high-level computational processes, based on number theory and algebra, that can easily be integrated and leveraged in cloud computing, together with detailed theoretic mathematical concepts for the fully homomorphic encryption models. This contribution supports the full implementation of a big data analytics based cryptographic security algorithm.
Keywords: Data Analytics, Security, Privacy, Bootstrapping, and Fully Homomorphic Encryption Scheme.