Search results for: Independent Component Analysis

9560 Evaluation Process for the Hardware Safety Integrity Level

Authors: Sung Kyu Kim, Yong Soo Kim

Abstract:

Safety instrumented systems (SISs) are becoming increasingly complex, and the proportion of programmable electronic parts in them is growing. The IEC 61508 global standard was established to ensure the functional safety of SISs, but it is expressed in highly macroscopic terms. This study introduces an evaluation process for hardware safety integrity levels based on failure modes, effects, and diagnostic analysis (FMEDA). FMEDA is widely used to evaluate safety levels, and it provides the failure rates and failure mode distributions needed to calculate a diagnostic coverage factor for a given component. In our evaluation process, the components of the SIS subsystem are first defined in terms of failure modes and effects. Then, a failure rate and failure mechanism distribution are assigned to each component. The safety mode and detectability of each failure mode are determined for each component. Finally, the hardware safety integrity level is evaluated from the calculated results.
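
As a rough illustration of the FMEDA bookkeeping behind such an evaluation, the following sketch computes the diagnostic coverage (DC) and safe failure fraction (SFF) from a small failure-mode table; the rates, safe/dangerous split and detectability flags are invented for illustration, not taken from the study.

```python
# Minimal FMEDA-style sketch. Rates are hypothetical, in FIT (failures per 10^9 h).
failure_modes = [
    # (rate_FIT, is_safe, is_detected)
    (120.0, True,  True),    # safe detected (SD)
    (60.0,  True,  False),   # safe undetected (SU)
    (40.0,  False, True),    # dangerous detected (DD)
    (10.0,  False, False),   # dangerous undetected (DU)
]

l_sd = sum(r for r, safe, det in failure_modes if safe and det)
l_su = sum(r for r, safe, det in failure_modes if safe and not det)
l_dd = sum(r for r, safe, det in failure_modes if not safe and det)
l_du = sum(r for r, safe, det in failure_modes if not safe and not det)

dc = l_dd / (l_dd + l_du)                                   # diagnostic coverage of dangerous failures
sff = (l_sd + l_su + l_dd) / (l_sd + l_su + l_dd + l_du)    # safe failure fraction
print(f"DC = {dc:.1%}, SFF = {sff:.1%}")
```

The resulting SFF, together with the hardware fault tolerance of the subsystem, is what IEC 61508 maps onto an achievable safety integrity level.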

Keywords: Safety instrumented system; Safety integrity level; Failure modes, effects, and diagnostic analysis; IEC 61508.

9559 Building the Reliability Prediction Model of Component-Based Software Architectures

Authors: Pham Thanh Trung, Huynh Quyet Thang

Abstract:

Reliability is one of the most important quality attributes of software. Building on the approaches of Reussner and of Cheung, we propose a reliability prediction model for component-based software architectures. The value of the model is demonstrated through an experimental evaluation on a web server system.
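
A minimal sketch of the Cheung-style, Markov-chain part of such a prediction is shown below: per-component reliabilities and a usage transition matrix (both hypothetical) are combined into a system reliability estimate. It illustrates the general approach rather than the authors' exact model.

```python
import numpy as np

# Hypothetical 3-component architecture: usage transition probabilities P and
# per-component reliabilities R; component 3 is the exit component.
R = np.array([0.99, 0.98, 0.995])
P = np.array([[0.0, 0.7, 0.3],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])

# Cheung-style computation: Q[i, j] = R[i] * P[i, j]; (I - Q)^-1 accumulates all
# execution paths, and the system succeeds if the final component also completes.
Q = R[:, None] * P
M = np.linalg.inv(np.eye(len(R)) - Q)
system_reliability = M[0, -1] * R[-1]
print(f"Predicted system reliability: {system_reliability:.4f}")
```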

Keywords: component-based architecture, reliability prediction model, software reliability engineering.

9558 Reliability Analysis of k-out-of-n : G System Using Triangular Intuitionistic Fuzzy Numbers

Authors: Tanuj Kumar, Rakesh Kumar Bajaj

Abstract:

In the present paper, we analyze the vague reliability of a k-out-of-n : G system (in particular, series and parallel systems) with independent and non-identically distributed components whose reliabilities are unknown. The reliability of each component is estimated using a statistical confidence interval approach, and these confidence intervals are then converted into triangular intuitionistic fuzzy numbers. Based on these triangular intuitionistic fuzzy numbers, the reliability of the k-out-of-n : G system is calculated. Finally, a numerical example is provided to implement the proposed methodology and analyze the results for the k-out-of-n : G system.
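
The sketch below illustrates the core computation under simplifying assumptions: each component reliability is represented only by the (lower, mode, upper) triangle of its membership function and, because k-out-of-n:G reliability is monotone in every component reliability, the system triangle is obtained by evaluating the exact non-identical-component formula at the three points. The non-membership part of the intuitionistic number would be handled analogously, and all numbers are hypothetical.

```python
import numpy as np

def k_out_of_n_reliability(p, k):
    """P(at least k of n independent, non-identical components work)."""
    dp = np.zeros(len(p) + 1)
    dp[0] = 1.0
    for pi in p:
        # update the distribution of the number of working components
        dp[1:] = dp[1:] * (1 - pi) + dp[:-1] * pi
        dp[0] = dp[0] * (1 - pi)
    return dp[k:].sum()

# Hypothetical triangular reliability estimates (lower, mode, upper) per component,
# e.g. built from the statistical confidence intervals mentioned in the abstract.
comps = [(0.90, 0.95, 0.98), (0.85, 0.92, 0.96), (0.88, 0.93, 0.97), (0.80, 0.90, 0.95)]
k = 2  # a 2-out-of-4:G system

lo, mode, hi = (k_out_of_n_reliability([c[i] for c in comps], k) for i in range(3))
print(f"Fuzzy system reliability ~ ({lo:.4f}, {mode:.4f}, {hi:.4f})")
```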

Keywords: Vague set, vague reliability, triangular intuitionistic fuzzy number, k-out-of-n : G system, series and parallel system.

9557 Dimension Reduction of Microarray Data Based on Local Principal Component

Authors: Ali Anaissi, Paul J. Kennedy, Madhu Goyal

Abstract:

Analysis and visualization of microarray data are very helpful for biologists and clinicians in the diagnosis and treatment of patients. They allow clinicians to better understand the structure of microarray data and facilitate the understanding of gene expression in cells. However, a microarray dataset is complex: it has thousands of features and a very small number of observations. This very high dimensional data set often contains noise, non-useful information, and only a small number of features relevant to the disease or genotype. This paper proposes a non-linear dimensionality reduction algorithm, Local Principal Component (LPC), which aims to map high dimensional data to a lower dimensional space. The reduced data represent the most important variables underlying the original data. Experimental results and comparisons are presented to show the quality of the proposed algorithm. Moreover, experiments also show how this algorithm reduces high dimensional data whilst preserving the neighbourhoods of the points in the low dimensional space as in the high dimensional space.
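
The authors' LPC algorithm is not spelled out in the abstract, so the sketch below uses locally linear embedding from scikit-learn purely as a stand-in nonlinear reducer, together with a simple check of how well each point's high-dimensional neighbourhood is preserved in the low-dimensional space; the random matrix stands in for a microarray dataset.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2000))          # 60 samples, 2000 "genes" (synthetic stand-in)

# Nonlinear reduction to 2-D; LLE is only a stand-in for the authors' LPC algorithm.
emb = LocallyLinearEmbedding(n_components=2, n_neighbors=10).fit_transform(X)

def knn_sets(Z, k=10):
    idx = NearestNeighbors(n_neighbors=k + 1).fit(Z).kneighbors(Z, return_distance=False)
    return [set(row[1:]) for row in idx]   # drop the point itself

# Fraction of each point's high-dimensional neighbours that survive the reduction.
high, low = knn_sets(X), knn_sets(emb)
preservation = np.mean([len(h & l) / len(h) for h, l in zip(high, low)])
print(f"Mean neighbourhood preservation: {preservation:.2f}")
```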

Keywords: Linear Dimension Reduction, Non-Linear Dimension Reduction, Principal Component Analysis, Biologists.

9556 Quantitative Analysis of Weld Defect Images in Industrial Radiography Based on Invariant Attributes

Authors: N. Nacereddine, M. Tridi, S. S. Belaïfa, M. Zelmat

Abstract:

For the characterization of the weld defect region in a radiographic image, features that are invariant to geometric transformations (rotation, translation and scaling) are necessary, because the same defect can be seen from several angles depending on the orientation and the distance of the welded framework from the radiation source. Thus, a panoply of geometric attributes satisfying the above conditions is proposed, resulting on the one hand from the calculation of geometric parameters (surface, perimeter, etc.) and on the other hand from the calculation of moments of different orders. Because of the large range of the raw feature values, and taking into account other considerations imposed by some classifiers, scaling these values to lie between 0 and 1 is indispensable. The principal component analysis technique is then used to reduce the number of attribute variables, with the aim of giving better performance to the subsequent defect classification.
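
A minimal sketch of such a feature pipeline is shown below: geometric parameters and the seven Hu moments are extracted from hypothetical, already-segmented defect masks with scikit-image, scaled to [0, 1], and compressed with principal component analysis. It illustrates the general idea rather than the paper's exact attribute set.

```python
import numpy as np
from skimage.measure import label, regionprops
from sklearn.preprocessing import MinMaxScaler
from sklearn.decomposition import PCA

def invariant_features(mask):
    """Geometric parameters plus the seven Hu moments of the largest defect region."""
    region = max(regionprops(label(mask)), key=lambda r: r.area)
    return np.concatenate(([region.area, region.perimeter, region.eccentricity],
                           region.moments_hu))

# Hypothetical stack of already-segmented binary defect masks, one per radiograph.
rng = np.random.default_rng(1)
masks = []
for _ in range(20):
    m = np.zeros((64, 64), dtype=int)
    r, c, h, w = rng.integers(5, 20, size=4)
    m[r:r + h, c:c + w] = 1
    masks.append(m)

X = np.array([invariant_features(m) for m in masks])
X01 = MinMaxScaler().fit_transform(X)            # scale every attribute to [0, 1]
X_red = PCA(n_components=3).fit_transform(X01)   # decorrelate and compress the attributes
print(X_red.shape)                               # (20, 3)
```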

Keywords: Geometric parameters, invariant attributes, principal component analysis, weld defect image.

9555 The Robust Clustering with Dimension Reduction

Authors: Dyah E. Herwindiati

Abstract:

Clustering is the process of identifying homogeneous groups of objects, called clusters, and is an interesting topic in data mining; the objects within a group or class behave similarly. This paper discusses a robust clustering process for image data with two dimension reduction approaches: two-dimensional principal component analysis (2DPCA) and principal component analysis (PCA). A standard approach to the high-dimensionality problem is dimension reduction, which transforms high-dimensional data into a lower-dimensional space with limited loss of information, and one of the most common forms of dimensionality reduction is PCA. 2DPCA is often called a variant of PCA in which the image matrices are treated directly as 2D matrices; they do not need to be transformed into vectors, so the image covariance matrix can be constructed directly from the original image matrices. The decomposed classical covariance matrix, however, is very sensitive to outlying observations. The objective of this paper is to compare the performance of robust minimizing vector variance (MVV) in the two-dimensional projection (2DPCA) and in PCA for clustering arbitrary image data when outliers are hidden in the data set. The simulation aspects of robustness and an illustration of image clustering are discussed at the end of the paper.
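
For reference, the sketch below implements plain (non-robust) 2DPCA in NumPy: the image covariance matrix is built directly from the 2D image matrices and each image is projected onto its leading eigenvectors. The robust MVV estimation discussed in the paper is not included, and the image set is synthetic.

```python
import numpy as np

def two_dpca(images, n_components):
    """Plain 2DPCA: images stay 2-D, so the image covariance matrix G is built
    directly from the matrices without vectorisation."""
    A = np.asarray(images, dtype=float)                          # shape (M, m, n)
    centered = A - A.mean(axis=0)
    G = np.einsum('ijk,ijl->kl', centered, centered) / len(A)    # n x n image covariance
    _, eigvecs = np.linalg.eigh(G)
    X = eigvecs[:, ::-1][:, :n_components]                       # leading eigenvectors of G
    return np.matmul(A, X), X                                    # feature matrices Y_i = A_i X

rng = np.random.default_rng(2)
images = rng.random((50, 32, 32))                                # synthetic image set
Y, X = two_dpca(images, n_components=5)
print(Y.shape)                                                   # (50, 32, 5)
```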

Keywords: Breakdown point, Consistency, 2DPCA, PCA, Outlier, Vector Variance

9554 Complexity of Component-based Development of Embedded Systems

Authors: M. Zheng, V. S. Alagar

Abstract:

The paper discusses complexity of component-based development (CBD) of embedded systems. Although CBD has its merits, it must be augmented with methods to control the complexities that arise due to resource constraints, timeliness, and run-time deployment of components in embedded system development. Software component specification, system-level testing, and run-time reliability measurement are some ways to control the complexity.

Keywords: Components, embedded systems, complexity, software development, traffic controller system.

9553 Application of Artificial Neural Network for Predicting Maintainability Using Object-Oriented Metrics

Authors: K. K. Aggarwal, Yogesh Singh, Arvinder Kaur, Ruchika Malhotra

Abstract:

The importance of software quality is increasing, leading to the development of new sophisticated techniques that can be used to construct models for predicting quality attributes. One such technique is the Artificial Neural Network (ANN). This paper examines the application of ANNs to software quality prediction using Object-Oriented (OO) metrics. Quality estimation here includes estimating the maintainability of software: the dependent variable in our study was maintenance effort, and the independent variables were principal components of eight OO metrics. The results showed that the Mean Absolute Relative Error (MARE) of the ANN model was 0.265. We therefore found the ANN method useful for constructing software quality models.
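
A minimal sketch of this modelling chain, with synthetic stand-ins for the eight OO metrics and the maintenance effort, is shown below: the metrics are reduced to principal components, fed to a small neural network, and the MARE is computed on held-out data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.random((150, 8))                                   # eight OO metrics per class (synthetic)
y = X @ rng.random(8) * 40 + rng.normal(0, 2, 150)         # synthetic "maintenance effort"
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = make_pipeline(StandardScaler(),
                      PCA(n_components=0.95),              # principal components of the metrics
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)

mare = np.mean(np.abs(model.predict(X_te) - y_te) / np.abs(y_te))
print(f"MARE = {mare:.3f}")
```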

Keywords: Software quality, Measurement, Metrics, Artificial neural network, Coupling, Cohesion, Inheritance, Principal component analysis.

9552 Electronic Nose Based On Metal Oxide Semiconductor Sensors as an Alternative Technique for the Spoilage Classification of Oat Milk

Authors: A. Deswal, N. S. Deora, H. N. Mishra

Abstract:

The aim of the present study was to develop a rapid electronic-nose method for online quality control of oat milk. Electronic nose analysis and bacteriological measurements were performed to analyze the spoilage kinetics of oat milk samples stored at room temperature and under refrigerated conditions for up to 15 days. Principal component analysis (PCA), Discriminant Factorial Analysis (DFA) and Soft Independent Modelling by Class Analogy (SIMCA) classification techniques were used to differentiate the oat milk samples at different days. The total plate count (bacteriological method) was selected as the reference method to consistently train the electronic nose system. The e-nose was able to differentiate between oat milk samples of varying microbial load. The total viable bacterial counts showed that the shelf-life of oat milk stored at room temperature and under refrigerated conditions was 20 hours and 13 days, respectively. The models built classified oat milk samples into “unspoiled” and “spoiled” based on the total microbial population.

Keywords: Electronic-nose, bacteriological, shelf-life, classification.

9551 A Dynamic Programming Model for Maintenance of Electric Distribution System

Authors: Juha Korpijärvi, Jari Kortelainen

Abstract:

The paper presents a dynamic programming based model as a planning tool for the maintenance of electric power systems. Every distribution component has an exponential, age-dependent reliability function that models its fault risk. At the moment when the fault costs exceed the investment costs of a new component, the component should be replaced. However, in some cases overhauling the old component may be more economical than reinvestment. The comparison between overhauling and reinvestment is made by an optimisation process whose goal is to find the cost-minimising maintenance programme for the electric power distribution system.
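
The sketch below is a deliberately simplified backward dynamic programme of this kind: a component with an exponentially age-dependent fault rate is kept, overhauled or replaced each year so as to minimise expected cost. All cost and rate parameters, and the rejuvenation effect of an overhaul, are illustrative assumptions rather than the paper's model.

```python
import numpy as np

L0, B = 0.02, 0.25                         # lambda(age) = L0 * exp(B * age), faults per year
C_FAULT, C_OVERHAUL, C_REPLACE = 5000.0, 800.0, 3000.0
REJUVENATION = 3                           # overhaul makes the component 3 years "younger"
HORIZON, MAX_AGE = 20, 40

def fault_cost(age):
    """Expected yearly fault cost at a given age."""
    return C_FAULT * (1 - np.exp(-L0 * np.exp(B * age)))

# V[t][a]: minimal expected remaining cost at year t when the component has age a.
V = np.zeros((HORIZON + 1, MAX_AGE + 1))
policy = np.empty((HORIZON, MAX_AGE + 1), dtype=object)
for t in range(HORIZON - 1, -1, -1):
    for a in range(MAX_AGE + 1):
        a_over = max(a - REJUVENATION, 0)
        keep     = fault_cost(a) + V[t + 1][min(a + 1, MAX_AGE)]
        overhaul = C_OVERHAUL + fault_cost(a_over) + V[t + 1][min(a_over + 1, MAX_AGE)]
        replace  = C_REPLACE + fault_cost(0) + V[t + 1][1]
        V[t][a], policy[t][a] = min((keep, "keep"), (overhaul, "overhaul"), (replace, "replace"))

print("Best first-year action for a 10-year-old component:", policy[0][10])
```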

Keywords: Dynamic programming, Electric distribution system, Maintenance

9550 Blind Source Separation Using Modified Gaussian FastICA

Authors: V. K. Ananthashayana, Jyothirmayi M.

Abstract:

This paper addresses the problem of source separation in images. We propose a FastICA algorithm employing a modified Gaussian contrast function for blind source separation. Experimental results show that the proposed modified Gaussian FastICA can be used effectively for blind source separation to obtain better quality images. A comparative study with other popular existing algorithms is also made. The peak signal-to-noise ratio (PSNR) and improved signal-to-noise ratio (ISNR) are used as metrics for evaluating the quality of the images, and the ICA metric Amari error is used to measure the quality of the separation.
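
A minimal FastICA sketch in this spirit, using scikit-learn, is shown below: the built-in 'exp' option selects the standard Gaussian contrast g(u) = u * exp(-u^2 / 2) (a modified contrast such as the paper's would be supplied as a custom callable through the same `fun` argument), and one common normalisation of the Amari error is computed against the known mixing matrix of a synthetic two-source example.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
t = np.linspace(0, 8, 4000)
S = np.c_[np.sin(3 * t), np.sign(np.cos(5 * t))]      # two independent sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])                # mixing matrix
X = S @ A.T                                           # observed mixtures

# 'exp' is the Gaussian contrast; a modified Gaussian contrast would be passed as
# a custom callable returning the nonlinearity and its averaged derivative.
ica = FastICA(n_components=2, fun='exp', random_state=0)
S_hat = ica.fit_transform(X)

def amari_error(W, A):
    """One common normalisation of the Amari separation index (0 = perfect)."""
    P = np.abs(W @ A)
    n = P.shape[0]
    return ((P.sum(1) / P.max(1) - 1).sum() + (P.sum(0) / P.max(0) - 1).sum()) / (2 * n * (n - 1))

print(f"Amari error: {amari_error(ica.components_, A):.4f}")
```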

Keywords: Amari error, Blind Source Separation, Contrast function, Gaussian function, Independent Component Analysis.

9549 Chilean Wines Classification based only on Aroma Information

Authors: Nicolás H. Beltrán, Manuel A. Duarte-Mermoud, Víctor A. Soto, Sebastián A. Salah, and Matías A. Bustos

Abstract:

Results of Chilean wine classification based on the information provided by an electronic nose are reported in this paper. The classification scheme consists of two parts: in the first stage, principal component analysis is used as the feature extraction method to reduce the dimensionality of the original information; then, radial basis function neural networks are used as the pattern recognition technique to perform the classification. The objective of this study is to classify different Cabernet Sauvignon, Merlot and Carménère wine samples from different years, valleys and vineyards of Chile.

Keywords: Feature extraction techniques, pattern recognition techniques, principal component analysis, radial basis function neural networks, wine classification.

9548 Fault Detection and Isolation using RBF Networks for Polymer Electrolyte Membrane Fuel Cell

Authors: Mahanijah Md Kamal, Dingli Yu

Abstract:

This paper presents a new method of fault detection and isolation (FDI) for polymer electrolyte membrane (PEM) fuel cell (FC) dynamic systems under an open-loop scheme. The method uses a radial basis function (RBF) neural network to perform fault identification, classification and isolation. The novelty is that an RBF model in independent mode is used to predict the future outputs of the FC stack. One actuator fault, one component fault and three sensor faults, with fault sizes between -7% and +10%, were introduced into the PEMFC system in real-time operation. To validate the results, a benchmark model developed by Michigan University is used in simulation to investigate the effect of these five faults. The developed independent RBF model is tested in the MATLAB R2009a/Simulink environment. The simulation results confirm the effectiveness of the proposed method for FDI under open-loop conditions: the RBF networks are able to detect and isolate all five faults accurately.

Keywords: Polymer electrolyte membrane fuel cell, radial basis function neural networks, fault detection, fault isolation.

9547 A New Approach for Classifying Large Number of Mixed Variables

Authors: Hashibah Hamid

Abstract:

The issue of classifying objects into one of several predefined groups when the measured variables are a mixture of different types has been of interest among statisticians for many years. Several methods for dealing with such situations have been introduced, including parametric, semi-parametric and nonparametric approaches. This paper discusses the problem of classifying data when the number of measured mixed variables is larger than the sample size. An idea that integrates a dimensionality reduction technique via principal component analysis with a discriminant function based on the location model is proposed. The study aims to offer practitioners another potential tool for classification problems in which the observed variables are mixed and too numerous.

Keywords: classification, location model, mixed variables, principal component analysis.

9546 An Investigation into Kanji Character Discrimination Process from EEG Signals

Authors: Hiroshi Abe, Minoru Nakayama

Abstract:

The frontal area of the brain is known to be involved in behavioural judgement. Because a Kanji character can be discriminated from other characters both visually and linguistically, we hypothesized that in Kanji character discrimination, frontal event-related potential (ERP) waveforms reflect two discrimination processes in separate time periods: one based on visual analysis and the other based on lexical access. To examine this hypothesis, we recorded ERPs while subjects performed a Kanji lexical decision task. In this task, either a known Kanji character, an unknown Kanji character or a symbol was presented, and the subject had to report whether the presented character was a Kanji character known to them or not; the same response was required for unknown Kanji trials and symbol trials. As a preprocessing step, we examined the performance of a method using independent component analysis for artifact rejection, found it to be effective, and therefore used it. In the ERP results, there were two time periods in which the frontal ERP waveforms differed significantly between the unknown Kanji trials and the symbol trials: around 170 ms and around 300 ms after stimulus onset. This result supports our hypothesis. In addition, it suggests that lexical access for Kanji characters may be fully completed by around 260 ms after stimulus onset.
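
The sketch below illustrates ICA-based artifact rejection of the kind mentioned above on a synthetic multichannel recording: components are estimated with FastICA, a spiky (high-kurtosis) component standing in for an ocular artifact is zeroed, and the channels are reconstructed. The selection rule and the data are illustrative assumptions, not the study's procedure.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
n_samples, n_channels = 2000, 8
eeg = rng.normal(size=(n_samples, n_channels))       # stand-in EEG (samples x channels)
blink = np.zeros(n_samples)
blink[500:520] = 8.0                                 # artificial ocular artifact
eeg += np.outer(blink, rng.random(n_channels))       # artifact projected into all channels

ica = FastICA(n_components=n_channels, random_state=0)
sources = ica.fit_transform(eeg)

# Flag artifact components by their excess kurtosis (blink components are spiky).
kurt = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2 - 3
sources[:, kurt > 10] = 0.0                          # zero the artifact components
eeg_clean = ica.inverse_transform(sources)           # back to channel space
print("Cleaned data shape:", eeg_clean.shape)
```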

Keywords: Character discrimination, Event-related Potential, Independent Component Analysis, Kanji, Lexical access.

9545 A Comparative Analysis of Fuzzy, Neuro-Fuzzy and Fuzzy-GA Based Approaches for Software Reusability Evaluation

Authors: Parvinder Singh Sandhu, Dalwinder Singh Salaria, Hardeep Singh

Abstract:

Software reusability is a primary attribute of software quality. There are metrics for identifying the quality of reusable components, but the function that makes use of these metrics to determine the reusability of software components is still not clear. If identified in the design phase or even in the coding phase, these metrics can help us reduce rework by improving the quality of reuse of the component and hence improve productivity due to the probabilistic increase in the reuse level. In this paper, we devise a framework of metrics that uses McCabe's cyclomatic complexity measure for complexity measurement, a regularity metric, the Halstead software science indicator for volume indication, a reuse frequency metric and a coupling metric as input attributes of the software component, and calculates its reusability. A comparative analysis of fuzzy, neuro-fuzzy and fuzzy-GA approaches to evaluating the reusability of software components is performed, and the fuzzy-GA results outperform the other approaches. The developed reusability model produced high precision results, as expected by the human experts.

Keywords: Software Reusability, Software Metrics, Neural Networks, Genetic Algorithm, Fuzzy Logic.

9544 Automatic Classification of Initial Categories of Alzheimer's Disease from Structural MRI Phase Images: A Comparison of PSVM, KNN and ANN Methods

Authors: Ahsan Bin Tufail, Ali Abidi, Adil Masood Siddiqui, Muhammad Shahzad Younis

Abstract:

An early and accurate detection of Alzheimer's disease (AD) is an important stage in the treatment of individuals suffering from AD. We present an approach based on the use of structural magnetic resonance imaging (sMRI) phase images to distinguish between normal controls (NC), mild cognitive impairment (MCI) and AD patients with a clinical dementia rating (CDR) of 1. The independent component analysis (ICA) technique is used to extract useful features, which form the inputs to support vector machine (SVM), k-nearest neighbour (kNN) and multilayer artificial neural network (ANN) classifiers that discriminate between the three classes. The obtained results are encouraging in terms of classification accuracy and effectively ascertain the usefulness of phase images for classifying the different stages of Alzheimer's disease.
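
A minimal sketch of this ICA-features-plus-classifier comparison is given below with scikit-learn; the random matrix is only a stand-in for vectorised sMRI phase images, so the reported accuracies are near chance.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
X = rng.normal(size=(120, 500))          # stand-in for vectorised sMRI phase images
y = rng.integers(0, 3, size=120)         # 0 = NC, 1 = MCI, 2 = AD (CDR 1)

classifiers = {
    "SVM": SVC(kernel="rbf", C=1.0),
    "kNN": KNeighborsClassifier(n_neighbors=5),
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
}
for name, clf in classifiers.items():
    pipe = make_pipeline(StandardScaler(),
                         FastICA(n_components=20, max_iter=500, random_state=0),  # ICA features
                         clf)
    print(f"{name}: mean CV accuracy {cross_val_score(pipe, X, y, cv=5).mean():.2f}")
```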

Keywords: Biomedical image processing, classification algorithms, feature extraction, statistical learning.

9543 Modeling and Simulation of In-vessel Core Handling in PFBR Operator Training Simulator

Authors: Bindu Sankar, Jaideep Chakraborty, Rashmi Nawlakha, A. Venkatesan, S. Raghupathy, T. Jayanthi, S.A.V. Satya Murty

Abstract:

The component handling system is one of the important subsystems of the Prototype Fast Breeder Reactor (PFBR) used for fuel handling, and the core handling system is in turn one of its subsystems. The core handling system consists of in-vessel and ex-vessel subassembly handling; in-vessel core handling involves transfer arm, large rotatable plug and small rotatable plug operations. Modeling and simulation of in-vessel core handling is part of the development of the PFBR operator training simulator. This paper deals with the simulation and modeling of the transfer arm, large rotatable plug and small rotatable plug operations needed for in-vessel core handling. The process models were developed in house using platform independent Cµ code with OpenGL (Open Graphics Library). The control logic models and the virtual panel were modeled using a simulation tool.

Keywords: Animation, Core Handling System, Prototype Fast Breeder Reactor, Simulator

9542 Factors Influencing Students' Self-Concept among Malaysian Students

Authors: Z. Ishak, S. Jamaluddin, F.P Chew

Abstract:

This paper examines self-concept among 16- and 17-year-old adolescents in Malaysian secondary schools. Previous studies have shown that a positive self-concept plays an important role in student adjustment and academic performance during schooling. This study investigates the factors influencing students’ perceptions of their own self-concept. A total of 1168 students participated in the survey, and the CoPs (UM) instrument was used to measure self-concept. Principal Component Analysis (PCA) revealed three factors: academic self-concept, physical self-concept and social self-concept. The study confirmed that students perceive certain internal context factors, and revealed that external context factors also have an impact on their self-concept.

Keywords: Academic self-concept, physical self-concept, Principal Component Analysis (PCA), social self-concept.

9541 Face Recognition Using Principal Component Analysis, K-Means Clustering, and Convolutional Neural Network

Authors: Zukisa Nante, Wang Zenghui

Abstract:

Face recognition is the problem of identifying or recognizing individuals in an image, and this paper investigates a possible method to solve it. The method proposes an amalgamation of Principal Component Analysis (PCA), K-Means clustering, and a Convolutional Neural Network (CNN) for a face recognition system. It is trained and evaluated on the ORL dataset, which consists of 400 faces in 40 classes with 10 face images per class. Firstly, PCA enables the use of a smaller network, which reduces the training time of the CNN: redundancy is removed and the variance is preserved with a smaller number of coefficients. Secondly, the K-Means clustering model is trained on the PCA-compressed data, which selects K-Means cluster centers with better characteristics. Lastly, the K-Means features serve as initial values for the CNN and act as its input data. The accuracy and performance of the proposed method were tested against other face recognition (FR) techniques, namely PCA, Support Vector Machine (SVM), and k-Nearest Neighbour (kNN). During experimentation, our suggested method achieved the highest performance after 90 epochs: 99% accuracy and F1-score, 99% precision, and 99% recall in 463.934 seconds. It outperformed PCA, which obtained 97%, and kNN, which obtained 84%, in the conducted experiments. The method therefore proved to be efficient in identifying faces in images.

Keywords: Face recognition, Principal Component Analysis, PCA, Convolutional Neural Network, CNN, Rectified Linear Unit, ReLU, feature extraction.

9540 A Multivariate Statistical Approach for Water Quality Assessment of River Hindon, India

Authors: Nida Rizvi, Deeksha Katyal, Varun Joshi

Abstract:

River Hindon is an important river catering to the demands of the highly populated rural and industrial clusters of western Uttar Pradesh, India. The water quality of river Hindon is deteriorating at an alarming rate due to various industrial, municipal and agricultural activities. The present study aimed at identifying the pollution sources and quantifying the degree to which these sources are responsible for the deteriorating water quality of the river. Various water quality parameters, such as pH, temperature, electrical conductivity, total dissolved solids, total hardness, calcium, chloride, nitrate, sulphate, biological oxygen demand, chemical oxygen demand, and total alkalinity, were assessed. Water quality data obtained from eight study sites over one year were subjected to two multivariate techniques, namely principal component analysis and cluster analysis. Principal component analysis was applied with the aim of finding the spatial variability and identifying the sources responsible for the water quality of the river; three varifactors were obtained after varimax rotation of the initial principal components. Cluster analysis was carried out to classify sampling stations of similar character, and it grouped the eight sites into two clusters. The study reveals that anthropogenic influence (municipal, industrial, wastewater and agricultural runoff) was the major source of river water pollution. Thus, this study illustrates the utility of multivariate statistical techniques for the analysis and elucidation of multifaceted data sets, the recognition of pollution sources and factors, and the understanding of temporal and spatial variations in water quality for effective river water quality management.
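
The sketch below reproduces the general workflow on hypothetical data: standardised parameters are decomposed with PCA (the varimax rotation step is omitted), and the sampling sites are grouped by hierarchical (Ward) cluster analysis. The parameter names follow the list above; all values are synthetic.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical observations: 8 sites x 12 months x 12 water-quality parameters.
rng = np.random.default_rng(7)
params = ["pH", "temp", "EC", "TDS", "hardness", "Ca", "Cl", "NO3", "SO4", "BOD", "COD", "alk"]
data = pd.DataFrame(rng.normal(size=(96, 12)), columns=params)
data["site"] = np.repeat([f"S{i}" for i in range(1, 9)], 12)

Z = StandardScaler().fit_transform(data[params])
pca = PCA().fit(Z)
print("Variance explained by the first 3 PCs:", pca.explained_variance_ratio_[:3].round(2))
# Loadings of PC1 point to the parameters driving most of the variation.
print(pd.Series(pca.components_[0], index=params).sort_values(key=abs, ascending=False).head())

# Hierarchical (Ward) clustering of the site-wise mean profiles.
site_means = data.groupby("site")[params].mean()
clusters = fcluster(linkage(site_means.values, method="ward"), t=2, criterion="maxclust")
print(dict(zip(site_means.index, clusters)))
```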

Keywords: Cluster analysis, multivariate statistical technique, river Hindon, water Quality.

9539 Forensic Speaker Verification in Noisy Environments by Enhancing the Speech Signal Using an ICA Approach

Authors: Ahmed Kamil Hasan Al-Ali, Bouchra Senadji, Ganesh Naik

Abstract:

We propose a system to handle real environmental noise and channel mismatch for forensic speaker verification. The method is based on suppressing various types of real environmental noise using an independent component analysis (ICA) algorithm. The enhanced speech signal is then processed with mel frequency cepstral coefficients (MFCC) or MFCC feature warping to extract the essential characteristics of the speech signal. Channel effects are reduced using an intermediate vector (i-vector) and probabilistic linear discriminant analysis (PLDA) approach for classification. The proposed algorithm is evaluated using an Australian forensic voice comparison database combined with car, street and home noises from QUT-NOISE at signal-to-noise ratios (SNR) ranging from -10 dB to 10 dB. Experimental results indicate that MFCC feature warping with ICA achieves reductions in equal error rate of about 48.22%, 44.66%, and 50.07% over MFCC feature warping alone when the test speech signals are corrupted with random sessions of street, car, and home noise at -10 dB SNR.

Keywords: Noisy forensic speaker verification, ICA algorithm, MFCC, MFCC feature warping.

9538 Air Quality Forecast Based on Principal Component Analysis-Genetic Algorithm and Back Propagation Model

Authors: Bin Mu, Site Li, Shijin Yuan

Abstract:

Given the deterioration of the environment, people are increasingly concerned about environmental quality, especially air quality, so accurate and timely forecasts of the AQI (air quality index) are of great value. In order to simplify the influencing factors of air quality in a city and to forecast the city’s AQI for the next day, this study used MATLAB and constructed a mathematical PCA-GABP model. Specifically, the study first performs principal component analysis (PCA) of the factors influencing tomorrow's AQI, covering today's weather, industrial waste gas and IAQI data. Then, a back propagation neural network model (BP), optimized by a genetic algorithm (GA), is used to forecast tomorrow's AQI. To verify the validity and accuracy of the PCA-GABP model's forecast capability, two statistical indices are used to evaluate the AQI forecast results (normalized mean square error and fractional bias). Finally, the mean square error is reduced by optimizing the individual gene structure in the genetic algorithm and adjusting the parameters of the back propagation model. In conclusion, the model's AQI forecasting performance is comparatively convincing, and the model is expected to have a positive effect on AQI forecasting in the future.
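
The sketch below is a much-simplified version of such a PCA-GA-BP chain: PCA compresses synthetic input factors, and a tiny genetic algorithm tunes two hyperparameters of a back-propagation network (scikit-learn's MLPRegressor) by validation error. The paper's GA operates on the network itself; this GA-over-hyperparameters variant only illustrates the coupling.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(8)
X = rng.normal(size=(400, 15))                            # weather / waste-gas / IAQI inputs (synthetic)
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=400)     # tomorrow's AQI (synthetic)
X = PCA(n_components=0.9).fit_transform(StandardScaler().fit_transform(X))
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.25, random_state=0)

def fitness(genome):
    """Genome = (hidden units, log10 learning rate); fitness = -MSE on validation data."""
    net = MLPRegressor(hidden_layer_sizes=(int(genome[0]),), learning_rate_init=10.0 ** genome[1],
                       max_iter=2000, random_state=0).fit(X_tr, y_tr)
    return -mean_squared_error(y_va, net.predict(X_va))

# A deliberately tiny GA: truncation selection, blend crossover, Gaussian mutation.
pop = np.column_stack([rng.integers(4, 40, 12), rng.uniform(-4, -1, 12)]).astype(float)
for _ in range(5):
    parents = pop[np.argsort([-fitness(g) for g in pop])][:4]
    children = (parents[rng.integers(0, 4, 8)] + parents[rng.integers(0, 4, 8)]) / 2
    children += rng.normal(0, [2.0, 0.2], size=children.shape)      # mutation
    children[:, 0] = np.clip(children[:, 0], 2, 64)
    pop = np.vstack([parents, children])

best = max(pop, key=fitness)
print(f"Best genome: {int(best[0])} hidden units, learning rate {10 ** best[1]:.4f}")
```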

Keywords: AQI forecast, principal component analysis, genetic algorithm, back propagation neural network model.

9537 Degradation of Heating, Ventilation, and Air Conditioning Components across Locations

Authors: Timothy E. Frank, Josh R. Aldred, Sophie B. Boulware, Michelle K. Cabonce, Justin H. White

Abstract:

Materials degrade at different rates in different environments depending on factors such as temperature, aridity, salinity, and solar radiation. Therefore, predicting asset longevity depends, in part, on the environmental conditions to which the asset is exposed. Heating, ventilation, and air conditioning (HVAC) systems are critical to building operations yet are responsible for a significant proportion of their energy consumption. HVAC energy use increases substantially with slight operational inefficiencies. Understanding the environmental influences on HVAC degradation in detail will inform maintenance schedules and capital investment, reduce energy use, and increase lifecycle management efficiency. HVAC inspection records spanning 14 years from 21 locations across the United States were compiled and associated with the climate conditions to which they were exposed. Three environmental features were explored in this study: average high temperature, average low temperature, and annual precipitation, as well as four non-environmental features. Initial insights showed no correlations between individual features and the rate of HVAC component degradation. Using neighborhood component analysis, however, the most critical features related to degradation were identified. Two models were considered, and results varied between them. However, longitude and latitude emerged as potentially the best predictors of average HVAC component degradation. Further research is needed to evaluate additional environmental features, increase the resolution of the environmental data, and develop more robust models to achieve more conclusive results.
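
A brief sketch of how neighbourhood component analysis can serve this role is shown below, with hypothetical feature names and synthetic degradation classes: NCA learns a linear transform that improves nearest-neighbour classification, and the column norms of that transform give a rough ranking of the input features.

```python
import numpy as np
from sklearn.neighbors import NeighborhoodComponentsAnalysis, KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)
features = ["avg_high_temp", "avg_low_temp", "annual_precip",
            "latitude", "longitude", "install_year", "runtime_hours"]   # hypothetical names
X = rng.normal(size=(300, len(features)))
y = rng.integers(0, 3, size=300)              # degradation class: low / medium / high (synthetic)

nca = NeighborhoodComponentsAnalysis(n_components=2, random_state=0)
pipe = make_pipeline(StandardScaler(), nca, KNeighborsClassifier(n_neighbors=5))
print("CV accuracy:", cross_val_score(pipe, X, y, cv=5).mean().round(2))

pipe.fit(X, y)                                # fits the NCA step in place
importance = np.linalg.norm(nca.components_, axis=0)   # per-feature weight in the learned transform
for name, w in sorted(zip(features, importance), key=lambda t: -t[1]):
    print(f"{name:>15s}: {w:.2f}")
```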

Keywords: Climate, infrastructure degradation, HVAC, neighborhood component analysis.

9536 Sensitivity Analysis in Power Systems Reliability Evaluation

Authors: A.R Alesaadi, M. Nafar, A.H. Gheisari

Abstract:

In this paper, sensitivity analysis is performed for the reliability evaluation of power systems. When examining the reliability of a system, it is useful to know how the results change as component parameters are varied. This knowledge helps engineers understand the impact of poor data and gives insight into how reliability can be improved. For these reasons, a sensitivity analysis can be performed. Finally, a real network was used to test the presented method.
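
A minimal numerical sketch of such a sensitivity analysis is shown below: the reliability of a small illustrative series-parallel network is differentiated with respect to each component reliability by finite differences (a Birnbaum-style importance). The network and its component reliabilities are assumptions, not the tested real system.

```python
import numpy as np

def system_reliability(r):
    """Component 0 in series with the parallel pair (1, 2), in series with component 3."""
    return r[0] * (1 - (1 - r[1]) * (1 - r[2])) * r[3]

r = np.array([0.98, 0.90, 0.85, 0.99])
base = system_reliability(r)
print(f"Base system reliability: {base:.4f}")

eps = 1e-4
for i in range(len(r)):
    rp = r.copy()
    rp[i] += eps
    sens = (system_reliability(rp) - base) / eps     # dR_sys / dR_i
    print(f"Sensitivity to component {i}: {sens:.3f}")
```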

Keywords: sensitivity analysis, reliability evaluation, power systems.

9535 Tongue Diagnosis System Based on PCA and SVM

Authors: Jin-Woong Park, Sun-Kyung Kang, Sung-Tae Jung

Abstract:

In this study, we propose a tongue diagnosis method that detects the tongue in a face image, divides the tongue area into six areas, and finally generates the tongue coating ratio of each area. To detect the tongue area in the face image, we use the ASM, one of the active shape models. The detected tongue area is divided into the six areas widely used in Korean traditional medicine, and the distribution of tongue coating over the six areas is examined by a support vector machine (SVM). For the SVM we use a 3-dimensional vector calculated by principal component analysis (PCA) from a 12-dimensional vector consisting of RGB, HIS, Lab, and Luv values. As a result, we detected the tongue area stably using the ASM and found that PCA and SVM helped raise the ratio of tongue coating detection.

Keywords: Active Shape Model, Principal Component Analysis, Support Vector Machine, Tongue diagnosis

9534 Normalizing Flow to Augmented Posterior: Conditional Density Estimation with Interpretable Dimension Reduction for High Dimensional Data

Authors: Cheng Zeng, George Michailidis, Hitoshi Iyatomi, Leo L Duan

Abstract:

The conditional density characterizes the distribution of a response variable y given a predictor x, and plays a key role in many statistical tasks, including classification and outlier detection. Although there has been abundant work on the problem of conditional density estimation (CDE) for a low-dimensional response in the presence of a high-dimensional predictor, little work has been done for a high-dimensional response such as images. The promising performance of normalizing flow (NF) neural networks in unconditional density estimation acts as a motivating starting point. In this work, we extend NF neural networks to the case where an external x is present. Specifically, we use the NF to parameterize a one-to-one transform between a high-dimensional y and a latent z that comprises two components [zP, zN]. The zP component is a low-dimensional subvector obtained from the posterior distribution of an elementary predictive model for x, such as logistic/linear regression. The zN component is a high-dimensional independent Gaussian vector, which explains the variations in y not or less related to x. Unlike existing CDE methods, the proposed approach, coined Augmented Posterior CDE (AP-CDE), only requires a simple modification of the common normalizing flow framework, while significantly improving the interpretation of the latent component, since zP represents a supervised dimension reduction. In image analytics applications, AP-CDE shows good separation of x-related variations, due to factors such as lighting condition and subject id, from the other random variations. Further, the experiments show that an unconditional NF neural network based on an unsupervised model of z, such as a Gaussian mixture, fails to generate interpretable results.

Keywords: Conditional density estimation, image generation, normalizing flow, supervised dimension reduction.

9533 Fault Detection of Drinking Water Treatment Process Using PCA and Hotelling's T2 Chart

Authors: Joval P George, Dr. Zheng Chen, Philip Shaw

Abstract:

This paper deals with the application of principal component analysis (PCA) and the Hotelling's T2 chart to data collected from a drinking water treatment process. PCA is applied primarily for the dimensional reduction of the collected data, and the Hotelling's T2 control chart is used for fault detection in the process. The data were taken from a United Utilities multistage water treatment works, downloaded from an Integrated Program Management (IPM) dashboard system. The analysis of the results shows that multivariate statistical process control (MSPC) techniques such as PCA, and control charts such as Hotelling's T2, can be effectively applied for the early fault detection of continuous multivariable processes such as drinking water treatment. The software package SIMCA-P was used to develop the MSPC models and the Hotelling's T2 chart from the collected data.
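
The sketch below reproduces the core of this scheme with NumPy, scikit-learn and SciPy instead of SIMCA-P: a PCA model is fitted to (synthetic) in-control data, the T2 statistic of new samples is computed from the retained scores, and samples exceeding the usual F-distribution control limit are flagged.

```python
import numpy as np
from scipy.stats import f
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(10)
normal_ops = rng.normal(size=(200, 10))                  # in-control training data (stand-in)
new_batch = np.vstack([rng.normal(size=(45, 10)),
                       rng.normal(3, 1, size=(5, 10))])  # last 5 samples drift (simulated fault)

scaler = StandardScaler().fit(normal_ops)
pca = PCA(n_components=3).fit(scaler.transform(normal_ops))

def hotelling_t2(samples):
    """T2 = sum over retained PCs of score_i^2 / lambda_i."""
    scores = pca.transform(scaler.transform(samples))
    return (scores ** 2 / pca.explained_variance_).sum(axis=1)

n, k, alpha = len(normal_ops), pca.n_components_, 0.01
ucl = k * (n - 1) * (n + 1) / (n * (n - k)) * f.ppf(1 - alpha, k, n - k)   # control limit
print("Samples flagged as faulty:", np.where(hotelling_t2(new_batch) > ucl)[0])
```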

Keywords: Principal component analysis, hotelling's t2 chart, multivariate statistical process control, drinking water treatment.

9532 Effects of Different Meteorological Variables on Reference Evapotranspiration Modeling: Application of Principal Component Analysis

Authors: Akinola Ikudayisi, Josiah Adeyemo

Abstract:

The correct estimation of reference evapotranspiration (ETₒ) is required for effective irrigation water resources planning and management. However, several variables must be considered when estimating and modeling ETₒ. This study therefore performs a multivariate analysis of the correlated variables involved in the estimation and modeling of ETₒ at the Vaalharts irrigation scheme (VIS) in South Africa using the principal component analysis (PCA) technique. Weather and meteorological data between 1994 and 2014 were obtained from both the South African Weather Service (SAWS) and the Agricultural Research Council (ARC) in South Africa for this study. Average monthly data of minimum and maximum temperature (°C), rainfall (mm), relative humidity (%), and wind speed (m/s) were the inputs to the PCA-based model, while ETₒ was the output. The PCA technique was adopted to extract the most important information from the dataset and to analyze the relationship between the five variables and ETₒ, in order to determine the most significant variables affecting ETₒ estimation at VIS. From the model performance, two principal components with a variance of 82.7% were retained after the eigenvector extraction. The results of the two principal components were compared, and the model output shows that minimum temperature, maximum temperature and wind speed are the most important variables in ETₒ estimation and modeling at VIS. In other words, ETₒ increases with temperature and wind speed. Other variables, such as rainfall and relative humidity, are less important and cannot provide enough information about ETₒ estimation at VIS. The outcome of this study has helped to reduce the input variable dimensionality from five to the three most significant variables in ETₒ modelling at VIS, South Africa.

Keywords: Irrigation, principal component analysis, reference evapotranspiration, Vaalharts.

9531 The Nature of the Complicated Fabric Textures: How to Represent in Primary Visual Cortex

Authors: J. L. Liu, L. Wang, B. Zhu, J. Zhou, W. D. Gao

Abstract:

Fabric textures are very common in our daily life; however, the representation of fabric textures has never been explored from a neuroscience point of view. Theoretical studies suggest that the primary visual cortex (V1) uses a sparse code to efficiently represent natural images, but how the simple cells in V1 encode artificial textures is still a mystery. Here we therefore take fabric texture as the stimulus and study the responses of an independent component analysis model established to describe the receptive fields of simple cells in V1. We chose 140 types of fabric to obtain classical fabric textures as materials. Experimental results indicate that the receptive fields of the simple cells show clear selectivity in orientation, frequency and phase when drifting gratings are used to determine their tuning properties. Additionally, the distribution of optimal orientation and frequency shows that the patch size selected from each original fabric image has a significant effect on the frequency selectivity.
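
As a rough illustration of how such receptive-field models are obtained, the sketch below learns ICA basis functions from patches of a synthetic, grating-like "fabric" image with scikit-learn; the columns of the mixing matrix play the role of simple-cell receptive fields whose orientation, frequency and phase tuning could then be probed with drifting gratings. The real study uses 140 actual fabric images.

```python
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import FastICA

# Synthetic stand-in for a woven fabric: two crossed sinusoidal gratings plus noise.
x = np.linspace(0, 8 * np.pi, 256)
fabric = np.sin(x)[None, :] + np.sin(3 * x)[:, None] \
         + 0.1 * np.random.default_rng(11).normal(size=(256, 256))

patches = extract_patches_2d(fabric, (16, 16), max_patches=2000, random_state=0)
X = patches.reshape(len(patches), -1)
X -= X.mean(axis=1, keepdims=True)               # remove each patch's DC component

ica = FastICA(n_components=49, max_iter=1000, random_state=0)
ica.fit(X)
# Columns of the mixing matrix, reshaped to 16x16, act as learned receptive fields.
receptive_fields = ica.mixing_.T.reshape(-1, 16, 16)
print(receptive_fields.shape)                    # (49, 16, 16)
```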

Keywords: Fabric Texture, Receptive Field, Simple Cell, Sparse Coding.
