Search results for: component model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8129

8039 Numerical Simulation of the Bond Behavior between Concrete and Steel Reinforcing Bars in Specialty Concrete

Authors: Camille A. Issa, Omar Masri

Abstract:

In this study, the commercial finite element software ABAQUS was used to develop a three-dimensional nonlinear finite element model capable of simulating the pull-out test of reinforcing bars from underwater concrete. The results of thirty-two pull-out tests with different parameters were implemented in the software to study the effect of the concrete cover, the bar size, the use of stirrups, and the compressive strength of the concrete. The interaction properties used in the model provided accurate results in comparison with the experimental bond-slip results; thus, the model successfully simulated the pull-out test. The results of the finite element model are used to better understand and visualize the distribution of stresses in each component of the model, and to study the effect of the various parameters considered in this study, including the role of the stirrups in preventing the stress from reaching the sides of the specimens.

Keywords: Bond strength, nonlinear finite element analysis, pull-out test, underwater concrete.

PDF Downloads: 4613
8038 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data

Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L Duan

Abstract:

The conditional density characterizes the distribution of a response variable y given a predictor x, and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on Conditional Density Estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation is a motivating starting point. In this work, we extend NF neural networks to the case where an external predictor x is present. Specifically, we use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zP, zN]. The zP component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zN component is a high-dimensional independent Gaussian vector, which explains the variations in y that are unrelated or only weakly related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), only requires a simple modification of the common normalizing flow framework, while significantly improving the interpretability of the latent component, since zP represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations, due to factors such as lighting condition and subject identity, from the other random variations. Further, the experiments show that an unconditional NF neural network, based on an unsupervised model of z such as a Gaussian mixture, fails to generate interpretable results.

Keywords: Conditional density estimation, image generation, normalizing flow, supervised dimension reduction.

PDF Downloads: 165
8037 A Comprehensive Approach in Calculating the Impact of the Ground on Radiated Electromagnetic Fields Due to Lightning

Authors: Lahcene Boukelkoul

Abstract:

The influence of finite ground conductivity is of great importance in calculating the voltages induced by the electromagnetic fields radiated by lightning. In this paper, we present a comprehensive approach for calculating the impact of the ground on the electromagnetic fields radiated by lightning. The vertical component of the lightning electric field can be calculated with reasonable accuracy by assuming a perfectly conducting ground, provided the observation point is no more than a few kilometers from the lightning channel. For distant observation points, however, the radiated vertical component of the lightning electric field is attenuated by the finitely conducting ground. The attenuation is calculated using expressions developed for both low and high frequencies. The horizontal component of the electric field is more strongly affected by the finite conductivity of the ground. Moreover, the contribution of the horizontal component of the electric field to the voltages induced on an overhead transmission line is greater than that of the vertical component. Therefore, the calculation of the horizontal electric field is of great concern for the simulation of lightning-induced voltages. For field-to-transmission-line coupling, the ground impedance is calculated for the early-time behavior and for the low-frequency range.

Keywords: Ground impedance, horizontal electric field, lightning, transient propagation, vertical electric field.

PDF Downloads: 1874
8036 Tongue Diagnosis System Based on PCA and SVM

Authors: Jin-Woong Park, Sun-Kyung Kang, Sung-Tae Jung

Abstract:

In this study, we propose a tongue diagnosis method which detects the tongue in a face image, divides the tongue area into six regions, and finally computes the tongue coating ratio of each region. To detect the tongue area in the face image, we use the Active Shape Model (ASM). The detected tongue area is divided into the six regions widely used in Korean traditional medicine, and the distribution of tongue coating over the six regions is examined with a Support Vector Machine (SVM). For the SVM, we use a 3-dimensional vector computed by Principal Component Analysis (PCA) from a 12-dimensional vector consisting of RGB, HSI, Lab, and Luv color components. As a result, we detected the tongue area stably using ASM and found that PCA and SVM helped raise the tongue coating detection rate.
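To make the PCA-plus-SVM stage concrete, the following Python sketch reduces hypothetical 12-dimensional color vectors (RGB, HSI, Lab, Luv) to three principal components and trains an SVM coating classifier with scikit-learn; the data, labels, and parameters are illustrative placeholders rather than the authors' implementation, and the ASM-based tongue detection step is omitted.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: one 12-dimensional color vector (RGB, HSI, Lab, Luv)
# per pixel/patch, with a binary "coated / not coated" label.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 12))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)

# Reduce the 12-dimensional color vector to 3 principal components,
# then classify tongue coating with an SVM, as the abstract describes.
clf = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf"))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```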

Keywords: Active Shape Model, Principal Component Analysis, Support Vector Machine, Tongue diagnosis

PDF Downloads: 1867
8035 Component Lifecycle and Concurrency Model in Usage Control (UCON) System

Authors: P. Ghann, J. Shiguang, C. Zhou

Abstract:

Access control is one of the most challenging issues facing information security. Access control is defined as the ability to permit or deny access to a particular computational resource or piece of digital information by a given user or subject. The concept of usage control (UCON) has been introduced as a unified approach to capture a number of extensions of access control models and systems. In UCON, an access decision is determined by three factors: authorizations, obligations, and conditions. Attribute mutability and decision continuity are two distinct characteristics introduced by UCON for the first time. An observation of UCON components indicates that the components are predefined and static. In this paper, we propose a new and flexible usage control model that allows the creation and elimination of some of these components, for example new objects, subjects, and attributes, and integrates them with the original UCON model. We also propose a model for concurrent usage scenarios in UCON.

Keywords: Access Control, Concurrency, Digital container, Usage control.

PDF Downloads: 1849
8034 Spike Sorting Method Using Exponential Autoregressive Modeling of Action Potentials

Authors: Sajjad Farashi

Abstract:

Neurons in the nervous system communicate with each other by producing electrical signals called spikes. To investigate the physiological function of the nervous system, it is essential to study the activity of neurons by detecting and sorting spikes in the recorded signal. In this paper, a method is proposed for the spike sorting problem that is based on nonlinear modeling of spikes using an exponential autoregressive model. A genetic algorithm is utilized for model parameter estimation, and selected model coefficients are used as features for sorting purposes. For optimal selection of model coefficients, a self-organizing feature map is used. The results show that modeling spikes with a nonlinear autoregressive model outperforms its linear counterpart. The features extracted from the coefficients of the exponential autoregressive model are also better than wavelet-based features and yield more compact and well-separated clusters. In the case of spikes that differ only in small-scale structures, where principal component analysis fails to produce separated clouds in the feature space, the proposed method can obtain well-separated clusters, which removes the need for complex classifiers.
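As an illustration of the modeling step, the sketch below fits Ozaki-style exponential autoregressive (ExpAR) coefficients to a synthetic spike waveform and returns them as sorting features; ordinary nonlinear least squares stands in for the genetic algorithm used in the paper, and the waveform, model order, and starting values are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def expar_residuals(params, x, p):
    """Residuals of an exponential AR(p) model (Ozaki form):
    x[t] = sum_i (phi_i + pi_i * exp(-gamma * x[t-1]**2)) * x[t-i] + e[t]."""
    phi, pi_, gamma = params[:p], params[p:2 * p], params[-1]
    res = []
    for t in range(p, len(x)):
        gate = np.exp(-gamma * x[t - 1] ** 2)
        pred = sum((phi[i] + pi_[i] * gate) * x[t - 1 - i] for i in range(p))
        res.append(x[t] - pred)
    return np.asarray(res)

def expar_features(spike, p=3):
    """Fit an ExpAR(p) model to one spike waveform and return its coefficients,
    to be used as sorting features (plain least squares is a stand-in for the
    genetic algorithm of the paper)."""
    x = (spike - spike.mean()) / (spike.std() + 1e-12)
    fit = least_squares(expar_residuals, x0=np.ones(2 * p + 1) * 0.1, args=(x, p))
    return fit.x

# Hypothetical spike waveform sampled on 64 points.
t = np.linspace(0, 1, 64)
spike = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.4 * np.exp(-((t - 0.5) / 0.1) ** 2)
print(expar_features(spike))
```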

Keywords: Exponential autoregressive model, Neural data, spike sorting, time series modeling.

PDF Downloads: 1770
8033 A Bathtub Curve from Nonparametric Model

Authors: Eduardo C. Guardia, Jose W. M. Lima, Afonso H. M. Santos

Abstract:

This paper presents a nonparametric method to obtain the hazard rate "bathtub curve" for power system components. The model is a mixture of the three known phases of a component's life, the decreasing failure rate (DFR), the constant failure rate (CFR), and the increasing failure rate (IFR), each represented by a parametric Weibull model. The parameters are obtained by simultaneously fitting the model to the kernel nonparametric hazard rate curve. From the Weibull parameters and failure rate curves, the useful lifetime and the characteristic lifetime are defined. To demonstrate the model, historic time-to-failure data of distribution transformers are used as an example. The resulting bathtub curve shows the failure rate over the equipment lifetime, which can be applied in economic and replacement decision models.
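A minimal sketch of the idea, assuming a synthetic kernel hazard estimate: a bathtub hazard is written as the sum of a decreasing Weibull hazard, a constant hazard, and an increasing Weibull hazard, and its parameters are fitted to the nonparametric curve with SciPy; the transformer data and starting values are placeholders, not the paper's.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1.0)

def bathtub_hazard(t, b1, e1, e2, b3, e3):
    """Mixture of three Weibull hazards: decreasing (b1 < 1),
    constant (beta = 1, i.e. rate 1/e2) and increasing (b3 > 1)."""
    return weibull_hazard(t, b1, e1) + 1.0 / e2 + weibull_hazard(t, b3, e3)

# Hypothetical kernel-smoothed hazard estimate for distribution transformers
# (years on the x-axis, failures per unit-year on the y-axis).
t = np.linspace(0.5, 40, 80)
noise = 1 + 0.05 * np.random.default_rng(1).normal(size=t.size)
h_kernel = bathtub_hazard(t, 0.5, 8.0, 50.0, 4.0, 45.0) * noise

popt, _ = curve_fit(bathtub_hazard, t, h_kernel,
                    p0=[0.7, 5.0, 40.0, 3.0, 40.0], maxfev=20000)
print("fitted Weibull parameters:", popt)
```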

Keywords: Bathtub curve, failure analysis, lifetime estimation, parameter estimation, Weibull distribution.

PDF Downloads: 2234
8032 LQG Flight Control of VTAV for Enhanced Situational Awareness

Authors: Igor Astrov, Mikhail Pikkov, Rein Paluoja

Abstract:

This paper focuses on a critical component of situational awareness (SA): the control of autonomous vertical flight for a vectored thrust aerial vehicle (VTAV). Following this SA strategy, we propose a linear-quadratic-Gaussian (LQG) flight control procedure for an unmanned helicopter model with a vectored thrust configuration. The LQG control for the chosen VTAV model has been verified by simulating take-off and landing maneuvers in Simulink and demonstrated good performance in rapidly stabilizing the model; consequently, fast SA with energy economy can be achieved during search-and-rescue operations.
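For readers unfamiliar with LQG, the sketch below computes the two halves of such a controller, an LQR state-feedback gain and a Kalman estimator gain, from the continuous-time Riccati equations for a hypothetical two-state vertical-flight model; the matrices are illustrative and unrelated to the authors' VTAV model.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical linearized vertical dynamics of a VTAV:
# states [altitude, vertical speed], input = vectored thrust command.
A = np.array([[0.0, 1.0],
              [0.0, -0.3]])
B = np.array([[0.0],
              [1.5]])
C = np.array([[1.0, 0.0]])        # only altitude is measured

# LQR part: state-feedback gain K from the control Riccati equation.
Q, R = np.diag([10.0, 1.0]), np.array([[0.1]])
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Kalman filter part: estimator gain L from the dual Riccati equation.
W, V = np.diag([0.01, 0.1]), np.array([[0.05]])   # process / measurement noise
S = solve_continuous_are(A.T, C.T, W, V)
L = S @ C.T @ np.linalg.inv(V)

print("LQR gain K:", K)
print("Kalman gain L:", L.ravel())
```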

Keywords: Linear-Quadratic-Gaussian (LQG) controller, situational awareness, vectored thrust aerial vehicle.

PDF Downloads: 1833
8031 Automatic Image Alignment and Stitching of Medical Images with Seam Blending

Authors: Abhinav Kumar, Raja Sekhar Bandaru, B Madhusudan Rao, Saket Kulkarni, Nilesh Ghatpande

Abstract:

This paper proposes an algorithm which automatically aligns and stitches component medical (fluoroscopic) images with varying degrees of overlap into a single composite image. The alignment method is based on a similarity measure between the component images. As applied here, the technique is intensity based rather than feature based; it works well in domains where feature-based methods have difficulty, yet it is more robust than traditional correlation. Component images are stitched together using a new triangular-averaging-based blending algorithm. The quality of the resultant image is tested for photometric inconsistencies and geometric misalignments. The method cannot correct rotational, scale, and perspective artifacts.
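The blending step can be pictured with the short sketch below, which stitches two already-aligned image strips and averages the overlap with linearly ramped weights, one plausible reading of triangular averaging; the images, overlap width, and horizontal-only layout are assumptions, and the intensity-based alignment itself is not shown.

```python
import numpy as np

def blend_horizontal(left, right, overlap):
    """Stitch two aligned grayscale images side by side, blending the
    overlapping columns with linearly ramped (triangular) weights."""
    h, w_l = left.shape
    w_r = right.shape[1]
    out = np.zeros((h, w_l + w_r - overlap), dtype=float)
    out[:, :w_l - overlap] = left[:, :w_l - overlap]
    out[:, w_l:] = right[:, overlap:]
    # Weights fall from 1 to 0 for the left image and rise from 0 to 1
    # for the right image across the overlap region.
    alpha = np.linspace(1.0, 0.0, overlap)
    out[:, w_l - overlap:w_l] = (alpha * left[:, w_l - overlap:] +
                                 (1.0 - alpha) * right[:, :overlap])
    return out

# Hypothetical fluoroscopic strips with a 32-column overlap.
rng = np.random.default_rng(0)
a, b = rng.random((128, 96)), rng.random((128, 96))
b[:, :32] = a[:, -32:]                     # simulate the shared region
print(blend_horizontal(a, b, 32).shape)    # (128, 160)
```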

Keywords: Histogram Matching, Image Alignment, Image Stitching, Medical Imaging.

PDF Downloads: 3760
8030 Modeling of Cross Flow Classifier with Water Injection

Authors: E. Pikushchak, J. Dueck, L. Minkov

Abstract:

In hydrocyclones, particle separation efficiency is limited by the suspended fine particles, which are discharged with the coarse product in the underflow. It is well known that injecting water into the conical part of the cyclone reduces the fine particle fraction in the underflow. This paper presents a mathematical model that simulates water injection in the conical component. The model accounts for the fluid flow and the particle motion. Particle interaction, due to hindered settling caused by the increased density and viscosity of the suspension, and the entrainment of fine particles by settling coarse particles are included in the model. The model demonstrates the impact of the injection rate, injection velocity, and injection location on the shape of the partition curve. The simulations are compared with experimental data for a 50-mm cyclone.

Keywords: Classification, fine particle processing, hydrocyclone, water injection.

PDF Downloads: 1954
8029 An Optimization Analysis on an Automotive Component with Fatigue Constraint Using HyperWorks Software for Environmental Sustainability

Authors: W. M. Wan Muhamad, E. Sujatmika, M.R. Idris, S.A. Syed Ahmad

Abstract:

The finite element analysis (FEA) software HyperWorks is utilized to re-design an automotive component in order to reduce its mass. Reducing component mass contributes to environmental sustainability by saving the world's valuable metal resources and by reducing carbon emissions through improved overall vehicle fuel efficiency. A shape optimization analysis was performed on a rear spindle component. Pre-processing and solving were performed using HyperMesh and RADIOSS, respectively. Shape variables were defined using HyperMorph, and the optimization solver OptiStruct was then utilized with fatigue life set as a design constraint. Since stress-number of cycles (S-N) theory deals with uniaxial stress, the signed von Mises stress on the component was used for looking up damage on the S-N curve, with the Gerber criterion applied for mean stress correction. The optimization analysis resulted in a mass reduction of 24% of the original mass. The study proved that the adopted approach has high potential for use in support of environmental sustainability.
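The fatigue bookkeeping behind the constraint can be sketched as follows: the Gerber relation converts a stress cycle with non-zero mean into an equivalent fully reversed amplitude, which is then looked up on a Basquin-type S-N curve; the material constants and stress values are hypothetical, and the actual OptiStruct fatigue setup is not reproduced here.

```python
import numpy as np

def gerber_equivalent_amplitude(sigma_a, sigma_m, sigma_u):
    """Equivalent fully reversed stress amplitude from the Gerber
    mean stress correction: sigma_a/sigma_ar + (sigma_m/sigma_u)**2 = 1."""
    return sigma_a / (1.0 - (sigma_m / sigma_u) ** 2)

def cycles_to_failure(sigma_ar, sigma_f=900.0, b=-0.09):
    """Look up life on a Basquin-type S-N curve sigma_ar = sigma_f * (2N)**b;
    sigma_f (MPa) and b are hypothetical material constants."""
    return 0.5 * (sigma_ar / sigma_f) ** (1.0 / b)

# Hypothetical signed von Mises stress cycle on the rear spindle (MPa).
sigma_a, sigma_m, sigma_u = 180.0, 60.0, 600.0
sigma_ar = gerber_equivalent_amplitude(sigma_a, sigma_m, sigma_u)
print(f"equivalent amplitude: {sigma_ar:.1f} MPa, "
      f"life: {cycles_to_failure(sigma_ar):.2e} cycles")
```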

Keywords: Environmental Sustainability, Shape Optimization, Fatigue, Rear Spindle.

PDF Downloads: 4291
8028 Principal Component Analysis using Singular Value Decomposition of Microarray Data

Authors: Dong Hoon Lim

Abstract:

A series of microarray experiments produces observations of differential expression for thousands of genes across multiple conditions. Principal component analysis (PCA) has been widely used in multivariate data analysis to reduce the dimensionality of the data in order to simplify subsequent analysis and allow for summarization of the data in a parsimonious manner. PCA, which can be implemented via a singular value decomposition (SVD), is useful for the analysis of microarray data. As an application of PCA using SVD, we use the DNA microarray data for the small round blue cell tumors (SRBCT) of childhood from Khan et al. (2001). To decide the number of components that account for a sufficient amount of information, we draw a scree plot. The biplot, a graphical display associated with PCA, reveals important features, exhibiting the relationships between variables as well as the relationships of variables with observations.
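A minimal NumPy sketch of PCA via SVD on a random matrix with SRBCT-like dimensions: the data are column-centred and decomposed, the singular values give the variance-explained sequence used for a scree plot, and the scores and loadings are the ingredients of a biplot; the random data stand in for the actual Khan et al. (2001) expression values.

```python
import numpy as np

# Hypothetical expression matrix: rows = samples (arrays), columns = genes.
rng = np.random.default_rng(0)
X = rng.normal(size=(63, 2308))           # SRBCT-like dimensions

# PCA via SVD of the column-centred data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                            # principal component scores (for a biplot)
loadings = Vt                             # gene loadings
explained = s ** 2 / np.sum(s ** 2)       # variance explained (scree plot values)

print("variance explained by the first 5 PCs:", np.round(explained[:5], 3))
```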

Keywords: Principal component analysis, singular value decomposition, microarray data, SRBCT

PDF Downloads: 3250
8027 Early Supplier Involvement in New Product Development: A Casting-Network Collaboration Model

Authors: Taneli Eisto, Venlakaisa Hölttä, Katrine Mahlamäki, Janne Kollanus, Marko Nieminen

Abstract:

Early supplier involvement (ESI) benefits new product development projects in several ways. Nevertheless, many cast-user companies do not know the advantages of ESI and therefore do not utilize it. This paper presents reasons to utilize ESI in the casting industry and how that can be done. Further, this paper presents the advantages of and challenges related to ESI in the casting industry, and introduces a Casting-Network Collaboration Model. The model presents practices that help companies build advantageous collaborative relationships. In more detail, the model describes three levels of company-network relationships in the casting industry with different degrees of collaboration, and the requirements for operating at each level. In our research, ESI was found to influence, for example, project time, component cost, and quality. In addition, challenges related to ESI, such as a lack of mutual trust and unawareness of the advantages, were found. Our research approach was a case study including four cases.

Keywords: Casting Industry, Collaboration Model, Early Supplier Involvement, New Product Development.

PDF Downloads: 8475
8026 Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis

Authors: Amir Hajian, Sepehr Damavandinejadmonfared

Abstract:

In this paper, the issue of dimensionality reduction in finger vein recognition systems is investigated using kernel Principal Component Analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions which can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated: the dimension of the feature vector. This aspect is of importance especially in real-world applications of such algorithms, where a fixed feature vector dimension has to be set to reduce the dimension of the input and output data and extract the features from them, before a classifier is applied to classify the data and make the final decision. We analyze KPCA (polynomial, Gaussian, and Laplacian kernels) in detail and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
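The dimension scan described above can be prototyped as follows with scikit-learn, projecting hypothetical vein feature vectors through KPCA (polynomial, Gaussian/RBF, or a precomputed Laplacian kernel, since the latter is not a built-in KernelPCA kernel name) and scoring each dimension with a simple 1-NN classifier; the data, labels, and kernel settings are illustrative only.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.metrics.pairwise import laplacian_kernel
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))                 # hypothetical vein feature vectors
y = rng.integers(0, 10, size=200)              # hypothetical subject labels

def accuracy_for_dimension(n_components, kernel="rbf"):
    """KPCA to n_components followed by a simple 1-NN classifier;
    scanning n_components locates the optimal feature dimension."""
    if kernel == "laplacian":                  # handled via a precomputed kernel matrix
        Z = KernelPCA(n_components=n_components,
                      kernel="precomputed").fit_transform(laplacian_kernel(X))
    else:                                      # "poly" or "rbf" (Gaussian)
        Z = KernelPCA(n_components=n_components, kernel=kernel).fit_transform(X)
    return cross_val_score(KNeighborsClassifier(1), Z, y, cv=5).mean()

for d in (5, 10, 20, 40):
    print(d, round(accuracy_for_dimension(d, kernel="rbf"), 3))
```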

Keywords: Biometrics, finger vein recognition, Principal Component Analysis (PCA), Kernel Principal Component Analysis (KPCA).

PDF Downloads: 1962
8025 Independent Component Analysis to Mass Spectra of Aluminium Sulphate

Authors: M. Heikkinen, A. Sarpola, H. Hellman, J. Rämö, Y. Hiltunen

Abstract:

Independent component analysis (ICA) is a computational method for finding underlying signals or components in multivariate statistical data. The ICA method has been successfully applied in many fields, e.g. vision research, brain imaging, geological signals, and telecommunications. In this paper, we apply the ICA method to the analysis of mass spectra of oligomeric species emerging from aluminium sulphate. Mass spectra are typically complex, because they are linear combinations of spectra from different types of oligomeric species. The results show that ICA can decompose the spectra into components that yield useful information. This information is essential in developing the coagulation phases of water treatment processes.
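A small sketch of the decomposition idea, using scikit-learn's FastICA on synthetic spectra: two made-up "pure" oligomer spectra are mixed in different proportions to form the observed spectra, and ICA recovers the independent components; the peak positions and mixing matrix are assumptions, not measured aluminium sulphate data.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
mz = np.linspace(100, 1000, 900)

# Hypothetical "pure" spectra of two oligomeric species (Gaussian peak
# clusters), mixed in different proportions per sample.
s1 = np.exp(-0.5 * ((mz[:, None] - [250, 430, 610]) / 6) ** 2).sum(axis=1)
s2 = np.exp(-0.5 * ((mz[:, None] - [320, 540, 760]) / 6) ** 2).sum(axis=1)
S = np.c_[s1, s2]
A = np.array([[0.7, 0.3], [0.4, 0.6], [0.2, 0.8], [0.9, 0.1]])   # mixing matrix
X = S @ A.T + 0.01 * rng.normal(size=(900, 4))                   # observed spectra

# ICA recovers the underlying independent spectral components.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)           # estimated source spectra
print(S_est.shape, ica.mixing_.shape)  # (900, 2) (4, 2)
```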

Keywords: Independent component analysis, mass spectroscopy, water treatment, aluminium sulphate.

PDF Downloads: 2370
8024 Dynamic Response of Wind Turbines to Theoretical 3D Seismic Motions Taking into Account the Rotational Component

Authors: L. Hermanns, M.A. Santoyo, L. E. Quirós, J. Vega, J. M. Gaspar-Escribano, B. Benito

Abstract:

We study the dynamic response of a wind turbine structure subjected to theoretical seismic motions, taking into account the rotational component of ground shaking. Models are generated for a shallow, moderate crustal earthquake in the Madrid Region (Spain). Synthetic translational and rotational time histories are computed using the Discrete Wavenumber Method, assuming a point source and a horizontally layered earth structure. These are used to analyze the dynamic response of a wind turbine, represented by a simple finite element model. Von Mises stress values at different heights of the tower are used to study the dynamic structural response to a set of synthetic ground motion time histories.

Keywords: Synthetic seismograms, rotations, wind turbine, dynamic structural response

PDF Downloads: 1323
8023 Vector Control Using Series Iron Loss Model of Induction Motors and Power Loss Minimization

Authors: Kheldoun Aissa, Khodja Djalal Eddine

Abstract:

The iron loss is a source of detuning in vector-controlled induction motor drives if the classical rotor vector controller is used for decoupling. In fact, the field orientation will not be satisfied and the output torque will not track the reference torque, which is mostly used by Loss Model Controllers (LMCs). In addition, this component of loss, among others, may be excessive if the vector-controlled induction motor is driving light loads. In this paper, the series iron loss model is used to develop a vector controller immune to the iron loss effect, and then an LMC to minimize the total power loss using the torque generated by the speed controller.

Keywords: Field Oriented Controller, Induction Motor, Loss Model Controller, Series Iron Loss.

PDF Downloads: 2702
8022 Concept, Design and Implementation of Power System Component Simulator Based on Thyristor Controlled Transformer and Power Converter

Authors: B. Kędra, R. Małkowski

Abstract:

This paper presents the Power System Component Simulator, a device designed for the LINTE^2 laboratory owned by Gdansk University of Technology in Poland. We first provide introductory information on the Power System Component Simulator and its capabilities. Then, the concept of the unit is presented: the requirements for the unit are described, and the proposed and introduced functions are listed. Implementation details are given. The hardware structure is presented and described, along with the communication interface, the data maintenance and storage solution, and the Simulink real-time features that are used. A list and description of all measurements is provided, and the potential for laboratory setup modifications is evaluated. Lastly, the results of experiments performed using the Power System Component Simulator are presented, including simulation of under-frequency load shedding, frequency- and voltage-dependent characteristics of groups of load units, and time characteristics of groups of different load units in a chosen area.

Keywords: Power converter, Simulink real-time, MATLAB, load, tap controller.

PDF Downloads: 795
8021 Face Recognition Using Principal Component Analysis, K-Means Clustering, and Convolutional Neural Network

Authors: Zukisa Nante, Wang Zenghui

Abstract:

Face recognition is the problem of identifying or recognizing individuals in an image. This paper investigates a possible method to solve this problem. The method proposes an amalgamation of Principal Component Analysis (PCA), K-Means clustering, and a Convolutional Neural Network (CNN) for a face recognition system. It is trained and evaluated using the ORL dataset, which consists of 400 face images in 40 classes with 10 images per class. Firstly, PCA enables the use of a smaller network, which reduces the training time of the CNN; we thus get rid of redundancy while preserving the variance with a smaller number of coefficients. Secondly, the K-Means clustering model is trained using the PCA-compressed data, which selects K-Means cluster centers with better characteristics. Lastly, the K-Means features serve as initial values for the CNN and act as its input data. The accuracy and performance of the proposed method were tested in comparison with other face recognition (FR) techniques, namely PCA, Support Vector Machine (SVM), and k-Nearest Neighbour (kNN). During experimentation, our suggested method achieved the highest performance after 90 epochs: 99% accuracy, 99% F1-score, 99% precision, and 99% recall in 463.934 seconds. It outperformed PCA, which obtained 97%, and kNN, which obtained 84%, in the conducted experiments. Therefore, this method proved to be efficient in identifying faces in images.
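The first two stages can be sketched with scikit-learn as below, loading the Olivetti/ORL faces (downloaded on first use), compressing them with PCA, and clustering the compressed data with K-Means; the component count and cluster settings are illustrative, and the CNN stage that the paper initializes from these features is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA

# The ORL/Olivetti faces: 400 images, 40 subjects, 10 images each.
faces = fetch_olivetti_faces()
X, y = faces.data, faces.target           # X: (400, 4096) flattened 64x64 images

# Stage 1: PCA compresses each face, removing redundancy while
# preserving most of the variance with far fewer coefficients.
X_pca = PCA(n_components=50, whiten=True, random_state=0).fit_transform(X)

# Stage 2: K-Means on the compressed data; in the paper the resulting
# clusters seed the CNN stage (not reproduced here).
kmeans = KMeans(n_clusters=40, n_init=10, random_state=0).fit(X_pca)
print("cluster centres shape:", kmeans.cluster_centers_.shape)   # (40, 50)
```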

Keywords: Face recognition, Principal Component Analysis, PCA, Convolutional Neural Network, CNN, Rectified Linear Unit, ReLU, feature extraction.

PDF Downloads: 505
8020 CFD Simulation of SO2 Removal from Gas Mixtures using Ceramic Membranes

Authors: Azam Marjani, Saeed Shirazian

Abstract:

This work deals with the modeling and simulation of SO2 removal in a ceramic membrane by means of the finite element method (FEM). A mass transfer model was developed to predict the performance of SO2 absorption in a chemical solvent. The model is based on solving the conservation equations for the gas component in the membrane. Computational fluid dynamics (CFD) of mass and momentum transfer was used to solve the model equations. The simulations aimed to obtain the distribution of gas concentration in the absorption process. The effect of the operating parameters on the efficiency of the ceramic membrane was evaluated. The modeling results showed that the gas phase velocity has a significant effect on gas removal, whereas the liquid phase does not affect SO2 removal significantly. It is also indicated that the main mass transfer resistance lies in the membrane and the gas phase, because of the high tortuosity of the ceramic membrane.

Keywords: Gas separation, finite element, ceramic, sulphur dioxide, simulation.

PDF Downloads: 2280
8019 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady state and transient conditions. The verification and validation process helps qualify the process simulator for its intended purpose, whether that is providing comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, in which the main participants are engineers/experts from the modeling, process design, and instrumentation and control design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on experts' comments, final qualification of the simulator for the intended purpose, and the difficulties faced while coordinating the various activities.

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), Steady State, Transient State.

PDF Downloads: 2518
8018 A New Analytical Approach to Reconstruct Residual Stresses Due to Turning Process

Authors: G.H. Farrahi, S.A. Faghidian, D.J. Smith

Abstract:

Due to turning operations, a thin layer with high tensile residual stresses can be found on the component surface, which can dangerously affect the fatigue performance of the component. In this paper, an analytical approach is presented to reconstruct the residual stress field from a limited, incomplete set of measurements. The Airy stress function is used as the primary unknown to directly solve the equilibrium equations and satisfy the boundary conditions. This new method offers the flexibility to impose the physical conditions that govern the behavior of residual stress, in order to achieve a meaningful, complete stress field. The analysis is also coupled with a least squares approximation and a regularization method to provide stability of the inverse problem. The power of this new method is then demonstrated by analyzing some experimental measurements and achieving good agreement between the model prediction and the results obtained from residual stress measurement.
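The stabilised fitting step can be illustrated with a Tikhonov-regularized least squares solve: measured stresses at a few depths are fitted with a small basis (a plain polynomial standing in for the Airy-stress-function terms of the paper), and the regularization keeps the inverse problem well behaved; the depths, stresses, basis, and regularization weight are all hypothetical.

```python
import numpy as np

def regularized_fit(A, b, lam):
    """Tikhonov-regularized least squares: minimise
    ||A c - b||^2 + lam * ||c||^2, stabilising the inverse problem of
    reconstructing a stress field from a few noisy measurements."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Hypothetical setup: residual stress measured at a handful of depths;
# the profile is expanded in a small polynomial basis.
depth = np.array([0.02, 0.05, 0.10, 0.20, 0.40])          # mm below the surface
sigma = np.array([450.0, 320.0, 120.0, -40.0, -10.0])     # measured stress, MPa
A = np.vander(depth, N=4, increasing=True)                 # basis: 1, z, z^2, z^3

c = regularized_fit(A, sigma, lam=1e-3)
z = np.linspace(0, 0.5, 6)
print("reconstructed profile:", np.vander(z, N=4, increasing=True) @ c)
```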

Keywords: Residual stress, Limited measurements, Inverse problems, Turning process.

PDF Downloads: 1430
8017 An Evaluation of Solubility of Wax and Asphaltene in Crude Oil for Improved Flow Properties Using a Copolymer Solubilized in Organic Solvent with an Aromatic Hydrocarbon

Authors: S. M. Anisuzzaman, Sariah Abang, Awang Bono, D. Krishnaiah, N. M. Ismail, G. B. Sandrison

Abstract:

Wax and asphaltene are high-molecular-weight compounds that contribute to the stability of crude oil in a dispersed state. Transportation of crude oil along pipelines from the oil rig to the refineries causes fluctuations in temperature, which lead to the coagulation of wax and the flocculation of asphaltenes. This paper focuses on preventing the deposition of wax and asphaltene precipitates on the inner surface of the pipelines by using a wax inhibitor and an asphaltene dispersant. The novelty of this prevention method is the combination of three substances: a wax inhibitor dissolved in a wax inhibitor solvent and an asphaltene solvent, namely, ethylene-vinyl acetate (EVA) copolymer dissolved in methylcyclohexane (MCH) and toluene (TOL), to inhibit the precipitation and deposition of wax and asphaltene. The objective of this paper was to optimize the percentage composition of each component in this inhibitor so as to maximize the viscosity reduction of the crude oil. The optimization was divided into two stages: a laboratory experimental stage, in which the viscosity of crude oil samples containing inhibitors of different component compositions was tested at decreasing temperatures, and a data optimization stage using response surface methodology (RSM) to design an optimization model. The experimental results showed that the combination of 50% EVA + 25% MCH + 25% TOL gave a maximum viscosity reduction of 67%, while the RSM model indicated that the combination of 57% EVA + 20.5% MCH + 22.5% TOL gave a maximum viscosity reduction of up to 61%.
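The RSM stage can be pictured with the sketch below, which fits a Scheffé-type quadratic mixture model to hypothetical viscosity-reduction measurements and grid-searches the feasible EVA/MCH/TOL compositions for the predicted optimum; the data points, bounds, and model form are assumptions, not the paper's design matrix.

```python
import numpy as np
from itertools import combinations

# Hypothetical mixture experiments: fractions of EVA, MCH, TOL (summing to 1)
# and the measured viscosity reduction (%) for each blend.
X = np.array([[0.50, 0.25, 0.25],
              [0.40, 0.30, 0.30],
              [0.60, 0.20, 0.20],
              [0.57, 0.205, 0.225],
              [0.45, 0.35, 0.20],
              [0.55, 0.15, 0.30]])
y = np.array([67.0, 58.0, 60.0, 61.0, 55.0, 59.0])

def scheffe_quadratic(X):
    """Scheffé quadratic mixture model terms: x_i and x_i * x_j."""
    cross = np.column_stack([X[:, i] * X[:, j] for i, j in combinations(range(3), 2)])
    return np.hstack([X, cross])

beta, *_ = np.linalg.lstsq(scheffe_quadratic(X), y, rcond=None)

# Grid search over feasible compositions for the predicted optimum.
grid = np.array([[a, b, 1 - a - b] for a in np.arange(0.3, 0.71, 0.01)
                 for b in np.arange(0.1, 0.41, 0.01) if 0.1 <= 1 - a - b <= 0.4])
pred = scheffe_quadratic(grid) @ beta
best = grid[np.argmax(pred)]
print("predicted best blend (EVA, MCH, TOL):", np.round(best, 3))
```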

Keywords: Asphaltene, ethylene-vinyl acetate, methylcyclohexane, toluene, wax.

PDF Downloads: 1446
8016 Comparison of Power Generation Status of Photovoltaic Systems under Different Weather Conditions

Authors: Zhaojun Wang, Zongdi Sun, Qinqin Cui, Xingwan Ren

Abstract:

Based on multivariate statistical analysis theory, this paper uses principal component analysis, Mahalanobis distance analysis, and curve fitting to establish a photovoltaic health model that evaluates the health of photovoltaic panels. First, according to weather conditions, the photovoltaic panel variable data are classified into five categories: sunny, cloudy, rainy, foggy, and overcast, and the health of photovoltaic panels under these five types of weather is studied. Secondly, scatterplots of the relationship between the amount of electricity produced under each kind of weather and the other variables were plotted. It was found that the amount of electricity generated by photovoltaic panels has a significant nonlinear relationship with time; the fitting method was therefore used to fit the relationship between the amount of electricity generated and time, and a nonlinear equation was obtained. Then, principal component analysis was used to analyze the independent variables under the five weather conditions. According to the Kaiser-Meyer-Olkin test, three types of weather, overcast, foggy, and sunny, meet the conditions for factor analysis, while cloudy and rainy weather do not. Through the principal component analysis, the main components of overcast weather are temperature, AQI, and PM2.5; the main component of foggy weather is temperature; and the main components of sunny weather are temperature, AQI, and PM2.5. Cloudy and rainy weather require analysis of all of their variables, namely temperature, AQI, PM2.5, solar radiation intensity, and time. Finally, taking the variable values in sunny weather as observed values and the main components of cloudy, foggy, overcast, and rainy weather as sample data, the Mahalanobis distances between the observed values and these samples were obtained. A comparative analysis of the degree of deviation of the Mahalanobis distance was carried out to determine the health of the photovoltaic panels under different weather conditions. Ordered from the smallest to the largest Mahalanobis distance fluctuation, the weather conditions were: foggy, cloudy, overcast, and rainy.
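The final comparison step can be sketched as follows: PCA is fitted on hypothetical sunny-weather data, other-weather samples are projected into the same component space, and their Mahalanobis distances from the sunny reference are computed; the variables, distributions, and component count are illustrative, and the per-weather variable selection from the paper is not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical per-hour panel variables: temperature, AQI, PM2.5, radiation, time.
sunny = rng.normal(loc=[28, 60, 35, 700, 12], scale=[3, 10, 8, 120, 4], size=(200, 5))
foggy = rng.normal(loc=[18, 90, 60, 250, 12], scale=[3, 15, 12, 80, 4], size=(200, 5))

# Principal components of the sunny (reference) data.
pca = PCA(n_components=3).fit(sunny)
ref = pca.transform(sunny)
mu = ref.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

def mahalanobis(sample):
    """Mahalanobis distance of one PCA-projected sample from the sunny reference."""
    d = pca.transform(sample.reshape(1, -1)).ravel() - mu
    return float(np.sqrt(d @ cov_inv @ d))

dists = [mahalanobis(s) for s in foggy]
print("mean Mahalanobis distance of foggy hours from the sunny reference:",
      round(float(np.mean(dists)), 2))
```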

Keywords: Fitting, principal component analysis, Mahalanobis distance, SPSS, MATLAB.

PDF Downloads: 674
8015 Effects of Varying Air Temperature in the Polishing Component of Single-Pass Mill on the Quality of Rice

Authors: M. A. U. Baradi, F. B. Bulao, N. D. Ganotisi, M. Jose C. Regalado, F. P. Bongat, S. B. Manglinong, M. L. O. Quigao, N. G. T. Martinez, R. G. Ancheta, M. P. Ortal

Abstract:

The effects of varying air temperature (full, ¾ full, and ½ full air-conditioning settings, and no aircon) in the polishing component of a single-pass mill on the quality of a Philippine inbred rice variety were investigated. The parameters measured were milling recovery (MR), headrice recovery (HR), and the percentage of kernels with bran streaks. The cooling method (with aircon) increased the MR, HR, and percentage of milled rice with bran streaks. The highest MR and HR (67.62% and 47.33%) were obtained with the ¾ full setting, whereas no aircon gave the lowest values (66.27% and 39.76%). The temperature in the polishing component at the ¾ full setting was 33 °C, whereas without aircon it was 45 °C. This corresponds to increases of 1.35% in MR and 7.57% in HR. The additional milling cost per kg due to aircon cooling was ₱0.04 at a volume of 300 tons/yr, with a 0.15-yr payback period. Net income was estimated at ₱98,100.00. The percentage of kernels with bran streaks increased from 5% to 14%, indicating that the milled rice retained more nutrients.

Keywords: Aircon, air temperature, polishing component, quality, Single-Pass Mill.

PDF Downloads: 1865
8014 A Complexity Measure for Java Bean based Software Components

Authors: Sandeep Khimta, Parvinder S. Sandhu, Amanpreet Singh Brar

Abstract:

Traditional software product and process metrics are neither suitable nor sufficient for measuring the complexity of software components, which is ultimately necessary for quality and productivity improvement within organizations adopting CBSE. Researchers have proposed a wide range of complexity metrics for software systems. However, these metrics are not sufficient for components and component-based systems, being restricted to module-oriented and object-oriented systems. In this study, it is proposed to measure the complexity of JavaBean software components as a reflection of their quality, so that a component can be adapted accordingly to make it more reusable. The proposed metric involves only the design issues of the component and does not consider packaging and deployment complexity. In this way, the complexity of software components can be kept within a certain limit, which in turn helps enhance quality and productivity.

Keywords: JavaBean Components, Complexity, Metrics, Validation.

PDF Downloads: 1527
8013 Dynamic Load Modeling for KHUZESTAN Power System Voltage Stability Studies

Authors: M. Sedighizadeh, A. Rezazadeh

Abstract:

Based on the component approach, three kinds of dynamic load models, a single-motor model, a two-motor model, and a composite load model, have been developed for stability studies of the Khuzestan power system. The study results are presented in this paper. Voltage instability is a dynamic phenomenon and therefore requires dynamic representation of the power system components. Industrial loads contain a large fraction of induction machines, and several models of different complexity are available for their description. This study evaluates the dynamic performance of several dynamic load models in combination with the dynamics of a load-changing transformer. The case study is a steel industry substation in the Khuzestan power system.

Keywords: Dynamic load, modeling, Voltage Stability.

PDF Downloads: 1859
8012 M-band Wavelet and Cosine Transform Based Watermark Algorithm Using Randomization and Principal Component Analysis

Authors: Tong Liu, Xuan Xu, Xiaodi Wang

Abstract:

Computational techniques derived from digital image processing play a significant role in the security and digital copyright protection of multimedia and visual arts. This research presents a discrete M-band wavelet transform (MWT) and discrete cosine transform (DCT) based watermarking algorithm that incorporates principal component analysis (PCA). The proposed algorithm is expected to achieve higher perceptual transparency. Specifically, the developed watermarking scheme can successfully resist common signal processing attacks, such as geometric distortions and Gaussian noise. In addition, the proposed algorithm can be parameterized, thus resulting in more security. To meet these requirements, the image is transformed by a combination of MWT and DCT. In order to further improve security, we randomize the watermark image to create three code books. During watermark embedding, PCA is applied to the coefficients in the approximation sub-band. Finally, the first few component bands represent an excellent domain for inserting the watermark.

Keywords: discrete M-band wavelet transform, discrete cosine transform, randomized watermark, principal component analysis

PDF Downloads: 2009
8011 Minimizing Fish-feed Loss due to Sea Currents: An Economic Methodology

Authors: V. Vassiliou, M. Charalambides, M. Menicou

Abstract:

Fish-feed is a major cost component of operating expenses for any aquaculture farm. Due to soaring prices of fish-feed ingredients, the need for better feeding schedule management has become imperative. One factor that influences the utilization rate of fish-feed is sea currents. Up to now, practical monitoring of fish-feed loss due to sea currents has not been exercised. This paper describes an economic methodology that aims at quantifying the amount of fish-feed lost due to sea currents and draws on data from a Mediterranean aquaculture farm to formulate the associated model.

Keywords: Aquaculture, economic model, fish-feed loss, sea currents.

PDF Downloads: 1825
8010 Context for Simplicity: A Basis for Context-aware Systems Based on the 3GPP Generic User Profile

Authors: Enrico Rukzio, George N. Prezerakos, Giovanni Cortese, Eleftherios Koutsoloukas, Sofia Kapellaki

Abstract:

The paper focuses on the area of context modeling with respect to the specification of context-aware systems supporting ubiquitous applications. The proposed approach, followed within the SIMPLICITY IST project, uses a high-level system ontology to derive context models for system components, which are consequently mapped to the system's physical entities. For the definition of user- and device-related context models in particular, the paper suggests a standards-based process consisting of an analysis phase using the Common Information Model (CIM) methodology, followed by an implementation phase that defines 3GPP-based components. The benefits of this approach are further illustrated by preliminary examples of XML grammars defining profiles, components, and component instances, coupled with descriptions of the respective ubiquitous applications.

Keywords: 3GPP, context, context-awareness, context model, information model, user model, XML

PDF Downloads: 8774