Search results for: Kernel Principal Component Analysis (KPCA).
9370 Normalization Discriminant Independent Component Analysis
Authors: Liew Yee Ping, Pang Ying Han, Lau Siong Hoe, Ooi Shih Yin, Housam Khalifa Bashier Babiker
Abstract:
In face recognition, feature extraction techniques attempt to find an appropriate representation of the data. However, when the feature dimension is larger than the sample size, performance degrades. Hence, we propose a method called Normalization Discriminant Independent Component Analysis (NDICA). The input data are regularized to obtain the most reliable features and then processed using Independent Component Analysis (ICA). The proposed method is evaluated on three face databases: Olivetti Research Ltd (ORL), Face Recognition Technology (FERET) and Face Recognition Grand Challenge (FRGC). NDICA shows its effectiveness compared with other unsupervised and supervised techniques.
Keywords: Face recognition, small sample size, regularization, independent component analysis.
9369 Dynamical Analysis of Circadian Gene Expression
Authors: Carla Layana, Luis Diambra
Abstract:
The microarray technique allows the simultaneous measurement of the expression levels of thousands of mRNAs. By mining these data one can identify the dynamics of gene expression time series. By recourse to principal component analysis, we uncover the circadian rhythmic patterns underlying the gene expression profiles of the cyanobacterium Synechocystis. We applied PCA to reduce the dimensionality of the data set. Examination of the components also provides insight into the underlying factors measured in the experiments. Our results suggest that all rhythmic content of the data can be reduced to three main components.
Keywords: circadian rhythms, clustering, gene expression, PCA.
9368 Development of Palm Kernel Shell Lightweight Masonry Mortar
Authors: Kazeem K. Adewole
Abstract:
The need to construct building walls with lightweight masonry bricks/blocks and mortar, to reduce the weight of buildings and the cost of cooling/heating them in hot/cold climates, is growing, partly due to legislation on energy use and global warming. In this paper, the development of palm kernel shell masonry mortar (PKSMM), prepared with Portland cement and crushed PKS fine aggregate (an agricultural waste), is demonstrated. We show that PKSMM can be used as a lightweight mortar for the construction of lightweight masonry walls with better thermal insulation efficiency than mortar made with the natural river sand commonly used for masonry mortar production.
Keywords: Building walls, fine aggregate, lightweight masonry mortar, palm kernel shell, wall thermal insulation efficacy.
9367 Comparison of Polynomial and Radial Basis Kernel Functions based SVR and MLR in Modeling Mass Transfer by Vertical and Inclined Multiple Plunging Jets
Abstract:
Presently, various computational techniques are used in modeling and analyzing environmental engineering data. In the present study, an intra-comparison of polynomial and radial basis kernel functions based Support Vector Regression and, in turn, an inter-comparison with Multi Linear Regression has been attempted in modeling the mass transfer capacity of vertical (θ = 90°) and inclined multiple plunging jets (varying in number from 1 to 16). The data set used in this study consists of four input parameters with a total of eighty-eight cases, forty-four each for vertical and inclined multiple plunging jets. For testing, tenfold cross validation was used. Correlation coefficient values of 0.971 and 0.981, along with corresponding root mean square error values of 0.0025 and 0.0020, were achieved by using polynomial and radial basis kernel functions based Support Vector Regression, respectively. The intra-comparison suggests improved performance of the radial basis function over the polynomial kernel based Support Vector Regression. Further, an inter-comparison with Multi Linear Regression (correlation coefficient = 0.973 and root mean square error = 0.0024) reveals that radial basis kernel function based Support Vector Regression performs better in modeling and estimating mass transfer by multiple plunging jets.
Keywords: Mass transfer, multiple plunging jets, polynomial and radial basis kernel functions, Support Vector Regression.
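As a rough illustration of the comparison described above (not the authors' code), the sketch below pits polynomial- and RBF-kernel SVR against multiple linear regression under ten-fold cross-validation; the synthetic data and parameter values are placeholders, since the 88 plunging-jet cases and their four measured inputs are not reproduced here.

```python
# Hypothetical sketch: compare polynomial- and RBF-kernel SVR with linear
# regression under ten-fold cross-validation, as in the abstract above.
# The data below are synthetic placeholders, not the plunging-jet cases.
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(88, 4))                                   # four input parameters
y = 0.002 + 0.001 * X[:, 0] * X[:, 1] + 0.0005 * rng.standard_normal(88)

models = {
    "SVR (polynomial kernel)": make_pipeline(StandardScaler(), SVR(kernel="poly", degree=2, C=10.0)),
    "SVR (RBF kernel)": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, gamma="scale")),
    "Multiple linear regression": LinearRegression(),
}
for name, model in models.items():
    r2 = cross_val_score(model, X, y, cv=10, scoring="r2")      # tenfold cross validation
    print(f"{name}: mean CV R^2 = {r2.mean():.3f}")
```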
9366 On the Efficient Implementation of a Serial and Parallel Decomposition Algorithm for Fast Support Vector Machine Training Including a Multi-Parameter Kernel
Authors: Tatjana Eitrich, Bruno Lang
Abstract:
This work deals with aspects of support vector machine learning for large-scale data mining tasks. Based on a decomposition algorithm for support vector machine training that can be run in serial as well as shared-memory parallel mode, we introduce a transformation of the training data that allows for the usage of an expensive generalized kernel without additional costs. We present experiments for the Gaussian kernel, but other kernel functions can be used as well. In order to further speed up the decomposition algorithm, we analyze the critical problem of working set selection for large training data sets. In addition, we analyze the influence of the working set sizes on the scalability of the parallel decomposition scheme. Our tests and conclusions led to several modifications of the algorithm and to improved overall support vector machine learning performance. Our method allows for using extensive parameter search methods to optimize classification accuracy.
Keywords: Support Vector Machine Training, Multi-Parameter Kernels, Shared Memory Parallel Computing, Large Data.
9365 Quadratic Pulse Inversion Ultrasonic Imaging (QPI): A Two-Step Procedure for Optimization of Contrast Sensitivity and Specificity
Authors: Mamoun F. Al-Mistarihi
Abstract:
We have previously introduced an ultrasonic imaging approach that combines harmonic-sensitive pulse sequences with a post-beamforming quadratic kernel derived from a second-order Volterra filter (SOVF). This approach is designed to produce images with high sensitivity to nonlinear oscillations from microbubble ultrasound contrast agents (UCA) while maintaining high levels of noise rejection. In this paper, a two-step algorithm for computing the coefficients of the quadratic kernel is presented, which reduces the tissue component introduced by motion, maximizes noise rejection and increases specificity while optimizing sensitivity to the UCA. In the first step, quadratic kernels from individual singular modes of the PI data matrix are compared in terms of their ability to maximize the contrast-to-tissue ratio (CTR). In the second step, the quadratic kernels resulting in the highest CTR values are convolved. The imaging results indicate that a signal processing approach to this clinical challenge is feasible.
Keywords: Volterra Filter, Pulse Inversion, Ultrasonic Imaging, Contrast Agent.
9364 Steam Gasification of Palm Kernel Shell (PKS): Effect of Fe/BEA and Ni/BEA Catalysts and Steam to Biomass Ratio on Composition of Gaseous Products
Authors: M.F. Mohamad, Anita Ramli, S.E.E Misi, S. Yusup
Abstract:
This work presents hydrogen production from steam gasification of palm kernel shell (PKS) at 700 °C in the presence of 5% Ni/BEA and 5% Fe/BEA as catalysts. The steam gasification was performed in two-staged reactors to evaluate the effect of calcination temperature and of the steam to biomass ratio on the product gas composition. The catalytic activity of the Ni/BEA catalyst decreases with increasing calcination temperature from 500 to 700 °C. The highest H2 concentration, more than 71 vol%, is produced by Fe/BEA (600). The catalytic activity of the catalysts tested is found to correspond to their physicochemical properties. The optimum range for the steam to biomass ratio is found to be between 2 and 4. Excess steam content results in a temperature drop in the gasifier, which is undesirable for the gasification reactions.
Keywords: Hydrogen, Palm Kernel Shell, Steam gasification, Ni/BEA, Fe/BEA.
9363 Issues in Spectral Source Separation Techniques for Plant-wide Oscillation Detection and Diagnosis
Authors: A.K. Tangirala, S. Babji
Abstract:
In the last few years, three multivariate spectral analysis techniques, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-negative Matrix Factorization (NMF), have emerged as effective tools for oscillation detection and isolation. While the first method is used in determining the number of oscillatory sources, the latter two methods are used to identify source signatures by formulating the detection problem as a source identification problem in the spectral domain. In this paper, we present a critical drawback of the underlying linear (mixing) model which strongly limits the ability of the associated source separation methods to determine the number of sources and/or identify the physical source signatures. It is shown that the assumed mixing model is only valid if each unit of the process gives equal weighting (all-pass filter) to all oscillatory components in its inputs. This is in contrast to the fact that each unit, in general, acts as a filter with non-uniform frequency response. Thus, the model can only facilitate correct identification of a source with a single frequency component, which is again unrealistic. To overcome this deficiency, an iterative post-processing algorithm that correctly identifies the physical source(s) is developed. An additional issue with the existing methods is that they lack a procedure to pre-screen non-oscillatory/noisy measurements which obscure the identification of oscillatory sources. In this regard, a pre-screening procedure is prescribed based on the notion of a sparseness index to eliminate the noisy and non-oscillatory measurements from the data set used for analysis.
Keywords: non-negative matrix factorization, PCA, source separation, plant-wide diagnosis.
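For readers unfamiliar with the spectral source-identification formulation discussed above, the following minimal sketch (not the authors' algorithm, and without the iterative post-processing or sparseness-index pre-screening) stacks the magnitude spectra of a few synthetic plant measurements into a non-negative matrix and factorizes it with NMF; the signals and the choice of two sources are illustrative assumptions.

```python
# Hypothetical sketch of spectral source identification with NMF: each basis
# spectrum found by the factorization indicates one oscillatory source, and
# the mixing weights show how strongly each measurement carries it.
import numpy as np
from sklearn.decomposition import NMF

n = 1024
t = np.arange(n)
src1 = np.sin(2 * np.pi * 0.02 * t)                         # slow oscillation
src2 = np.sin(2 * np.pi * 0.10 * t)                         # faster oscillation
measurements = np.vstack([
    1.0 * src1 + 0.1 * src2,
    0.3 * src1 + 0.9 * src2,
    0.7 * src1 + 0.7 * src2,
]) + 0.05 * np.random.default_rng(1).standard_normal((3, n))

spectra = np.abs(np.fft.rfft(measurements, axis=1))         # non-negative by construction
W = NMF(n_components=2, init="nndsvda", max_iter=500).fit_transform(spectra)
print("mixing weights (rows: measurements, cols: spectral sources):\n", W)
```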
9362 Estimation of Component Reusability through Reusability Metrics
Authors: Aditya Pratap Singh, Pradeep Tomar
Abstract:
Software reusability is an essential characteristic of Component-Based Software (CBS). Component reusability is an important measure of the effective reuse of components in CBS. The attributes of reusability proposed by various researchers are studied, and four of them are identified as potential factors affecting reusability. This paper proposes a metric for reusability estimation of black-box software components, along with metrics for Interface Complexity, Understandability, Customizability and Reliability. An experiment is performed for estimation of reusability through a case study on a sample web application using a real-world component.
Keywords: Component-based software, component reusability, customizability, interface complexity, reliability, understandability.
9361 Evolutionary Eigenspace Learning using CCIPCA and IPCA for Face Recognition
Authors: Ghazy M.R. Assassa, Mona F. M. Mursi, Hatim A. Aboalsamh
Abstract:
Traditional principal component analysis (PCA) techniques for face recognition are based on batch-mode training using a pre-available image set. Real-world applications require that the training set be dynamic and of an evolving nature, where, within the framework of continuous learning, new training images are continuously added to the original set; this would trigger a costly continuous re-computation of the eigenspace representation by repeating an entire batch-based training that includes the old and new images. Incremental PCA methods allow adding new images and updating the PCA representation. In this paper, two incremental PCA approaches, CCIPCA and IPCA, are examined and compared. In addition, different learning and testing strategies are proposed and applied to the two algorithms. The results suggest that batch PCA is inferior to both incremental approaches, and that all CCIPCA variants are practically equivalent.
Keywords: Candid covariance-free incremental principal component analysis (CCIPCA), face recognition, incremental principal component analysis (IPCA).
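The incremental-update idea described above can be illustrated with scikit-learn's IncrementalPCA, which stands in here for the paper's CCIPCA and IPCA algorithms (it does not implement them exactly); the image sizes, component count and random data are placeholder assumptions.

```python
# Hypothetical sketch of incremental eigenspace updating: new face images are
# folded into an existing PCA model with partial_fit instead of re-running a
# full batch PCA over the old and new images together.
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(0)
initial_faces = rng.random((200, 32 * 32))    # flattened training images
new_batch = rng.random((20, 32 * 32))         # images arriving later

ipca = IncrementalPCA(n_components=10)
ipca.partial_fit(initial_faces)               # initial eigenspace
ipca.partial_fit(new_batch)                   # update without full retraining

projected = ipca.transform(new_batch)         # features for a nearest-neighbour matcher
print(projected.shape)                        # (20, 10)
```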
9360 Effect of Fault Depth on Near-Fault Peak Ground Velocity
Authors: Yanyan Yu, Haiping Ding, Pengjun Chen, Yiou Sun
Abstract:
Fault depth is an important parameter to be determined in ground motion simulation, and peak ground velocity (PGV) shows good application prospects. Using a numerical simulation method, the variations of the distribution and peak value of near-fault PGV with fault depth were studied in detail, and the reasons for some phenomena are discussed. The simulation results show that the distribution characteristics of PGV for the fault-parallel (FP) and fault-normal (FN) components are distinctly different; the PGV of the FN component is much larger than that of the FP component. With increasing fault depth, the distribution region of strong FN-component PGV moves forward along the rupture direction, while the strong PGV zone of the FP component gradually moves away from the fault trace along the direction perpendicular to the strike. However, for both the FN and FP components, the strong PGV distribution area and its value are quickly reduced with increased fault depth. These results suggest that fault depth has a significant effect on both the FN and FP components of near-fault PGV.
Keywords: Fault depth, near-fault, PGV, numerical simulation.
9359 Nonlinear Equations with N-dimensional Telegraph Operator Iterated K-times
Authors: Jessada Tariboon
Abstract:
In this article, using the distribution kernel, we study nonlinear equations with the n-dimensional telegraph operator iterated k times.
Keywords: Telegraph operator, Elementary solution, Distribution kernel.
9358 Phase Behaviors and Fuel Properties of Bio-Oil-Diesel-Alcohol Blends
Authors: P. Weerachanchai, C. Tangsathitkulchai, M. Tangsathitkulchai
Abstract:
An attempt was made to improve certain characteristics of bio-oil derived from palm kernel pyrolysis by blending it with diesel fuel and alcohols. Two types of alcohol, ethanol and butanol, were used as co-solvents to stabilize the phase of the ternary systems. Phase behaviors and basic fuel properties of palm kernel bio-oil-diesel-alcohol systems were investigated in this study. The alcohol type showed a significant influence on the phase characteristics, with the palm kernel bio-oil-diesel-butanol system giving a larger soluble area than the palm kernel bio-oil-diesel-ethanol system. For fuel properties, the blended fuels showed superior properties, including lower values of density (~860 kg/m3 at 25°C), viscosity (~4.12 mm2/s at 40°C), carbon residue (1.02-2.53 wt%), ash (0.018-0.034 wt%) and pour point (<-25 to -7 °C), increased pH (~6.4), and reasonable heating values of 32.5-41.2 MJ/kg. To enable the prediction of some properties of the fuel mixtures, the measured fuel properties, including heating value, density, ash content and pH, were fitted by Kay's mixing rule, whereas the viscosities of the blended fuels at different temperatures were correlated by the modified Grunberg-Nissan equation and the Andrade equation.
Keywords: Bio-oil, fuel blend, fuel properties, phase behaviour.
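For reference, the correlations named in the abstract above take the following standard textbook forms; the interaction parameter and the Andrade constants are fitted to the measured data and are not reproduced here, and the authors' "modified" Grunberg-Nissan form may differ in detail.

```latex
% Standard forms of the mixing correlations cited above (fitted constants omitted).
\begin{align}
  P_{\mathrm{blend}} &= \sum_i x_i\,P_i
    && \text{(Kay's mixing rule, $x_i$ = fraction of component $i$)}\\
  \ln \eta_{\mathrm{blend}} &= \sum_i x_i \ln \eta_i + \sum_i \sum_{j>i} x_i x_j G_{ij}
    && \text{(Grunberg--Nissan form)}\\
  \ln \eta(T) &= A + \frac{B}{T}
    && \text{(Andrade equation)}
\end{align}
```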
9357 Fuzzy Modeling Tool for Creating a Component Model of Information System
Authors: Bogdan Walek, Jiri Bartos, Cyril Klimes, Jaroslav Prochazka, Pavel Smolka, Juraj Masar, Martin Pesl
Abstract:
This paper focuses on creating a component model of an information system under uncertainty. The paper identifies a problem in the current approach to component modeling and proposes a fuzzy tool, which works with vague customer requirements and proposes components of the resulting component model. The proposed tool is verified on a specific information system and the results are shown in the paper. After suitable sub-components of the resulting component model are found, the component model is visualised by the tool.
Keywords: Component, component model, fuzzy, fuzzy rules, fuzzy sets, information system, modelling, tool.
9356 Empirical Process Monitoring via Chemometric Analysis of Partially Unbalanced Data
Authors: Hyun-Woo Cho
Abstract:
Real-time or in-line process monitoring frameworks are designed to give early warnings of a fault along with meaningful identification of its assignable causes. In the pattern recognition fields of artificial intelligence and machine learning, various promising approaches have been proposed, such as kernel-based nonlinear machine learning techniques. This work presents a kernel-based empirical monitoring scheme for batch-type production processes with the small-sample-size problem of partially unbalanced data. Measurement data from normal operations are easy to collect, whilst data on special events or faults are difficult to collect. In such situations, noise filtering techniques can be helpful in enhancing process monitoring performance. Furthermore, preprocessing of the raw process data is used to remove unwanted variation in the data. The performance of the monitoring scheme was demonstrated using three-dimensional batch data. The results showed that the monitoring performance was improved significantly in terms of the detection success rate for process faults.
Keywords: Process Monitoring, kernel methods, multivariate filtering, data-driven techniques, quality improvement.
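A generic kernel-based monitoring chart of the kind alluded to above can be sketched as follows: fit kernel PCA on normal-operation data only and flag new samples whose reconstruction error exceeds a percentile-based control limit. The statistic, the control limit, the filtering/preprocessing steps and all parameter values are illustrative assumptions, not the scheme used in the paper.

```python
# Hypothetical sketch of kernel-PCA-based fault monitoring.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
normal = rng.normal(size=(300, 10))                 # unfolded normal batch data
faulty = rng.normal(loc=2.0, size=(20, 10))         # shifted "fault" samples

scaler = StandardScaler().fit(normal)
kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True).fit(scaler.transform(normal))

def spe(model, Xs):
    """Squared prediction error of each sample after kernel PCA reconstruction."""
    recon = model.inverse_transform(model.transform(Xs))
    return np.sum((Xs - recon) ** 2, axis=1)

limit = np.percentile(spe(kpca, scaler.transform(normal)), 99)   # control limit
alarms = spe(kpca, scaler.transform(faulty)) > limit
print(f"detection rate on fault samples: {alarms.mean():.2f}")
```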
9355 A Critical Survey of Reusability Aspects for Component-Based Systems
Authors: Arun Sharma, Rajesh Kumar, P. S. Grover
Abstract:
The last decade has shown that the object-oriented concept by itself is not powerful enough to cope with the rapidly changing requirements of ongoing applications. Component-based systems achieve flexibility by clearly separating the stable parts of systems (i.e. the components) from the specification of their composition. In order to realize the reuse of components effectively in CBSD, it is required to measure the reusability of components. However, due to the black-box nature of components, where the source code of these components is not available, it is difficult to use conventional metrics in component-based development, as these metrics require analysis of source code. In this paper, we survey a few existing component-based reusability metrics. These metrics give a broader view of a component's understandability, adaptability, and portability. The paper also describes the analysis, in terms of quality factors related to reusability, contained in an approach that aids significantly in assessing existing components for reusability.
Keywords: Components, Customizability, Reusability, Observability.
9354 Analysis of Palm Perspiration Effect with SVM for Diabetes in People
Authors: Hamdi Melih Saraoğlu, Muhlis Yıldırım, Abdurrahman Özbeyaz, Feyzullah Temurtas
Abstract:
In this research, an attempt was made to identify the diabetes condition of people (healthy, prediabetic and diabetic) from noninvasive palm perspiration measurements. Data clusters gathered from 200 subjects were used (1. Individual Attributes Cluster and 2. Palm Perspiration Attributes Cluster). To decrease the dimensions of these data clusters, the Principal Component Analysis method was used. The data clusters prepared in this way were classified with Support Vector Machines. The classifications with the highest success rates were 82% for the glucose parameters and 84% for the HbA1c parameters.
Keywords: Palm perspiration, Diabetes, Support Vector Machine, Classification.
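The PCA-then-SVM pipeline described above can be sketched as below; random data stand in for the 200-subject palm-perspiration measurements, and the attribute count, component count and SVM settings are assumptions for illustration only.

```python
# Hypothetical sketch: reduce the attribute clusters with PCA, then classify
# the three diabetes classes (healthy / prediabetic / diabetic) with an SVM.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))                      # individual + perspiration attributes
y = rng.integers(0, 3, size=200)                    # 0 healthy, 1 prediabetic, 2 diabetic

clf = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")     # near chance on random data
```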
9353 Diagnosis of Ovarian Cancer with Proteomic Patterns in Serum using Independent Component Analysis and Neural Networks
Authors: Simone C. F. Neves, Lúcio F. A. Campos, Ewaldo Santana, Ginalber L. O. Serra, Allan K. Barros
Abstract:
We propose a method for the discrimination and classification of ovarian tissue as benign, malignant or normal using independent component analysis and neural networks. The method was tested on a proteomic pattern set from a database using radial basis function neural networks. The best performance was obtained with probabilistic neural networks, resulting in a 99% success rate, with 98% specificity and 100% sensitivity.
Keywords: Ovarian cancer, proteomic patterns in serum, independent component analysis, neural networks.
9352 Exergy Analysis of a Cogeneration Plant
Authors: Derya Burcu Ozkan, Onur Kiziler, Duriye Bilge
Abstract:
Cogeneration may be defined as a system that produces electricity and recovers the thermal value of the exhaust gases simultaneously. The examination is based on the data of an active cogeneration plant. This study aims to determine which component of the system should be revised first to raise the efficiency and decrease the loss of exergy. For this purpose, second-law thermodynamic analysis is applied to each component in order to consider the effects of environmental conditions and to take the quality of energy into consideration as well as its quantity. The exergy balance equations are produced and the exergy loss is calculated for each component: 44.44% of the exergy loss occurs in the heat exchanger, 29.59% in the combustion chamber, 18.68% in the steam boiler, 5.25% in the gas turbine and 2.03% in the compressor.
Keywords: Cogeneration, Exergy loss, Second law analysis.
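The component-wise exergy losses reported above follow from the standard steady-state exergy balance, written here in its textbook form; the symbols and dead-state convention are the usual ones and are not taken from the paper.

```latex
% Steady-state exergy balance for a control volume (textbook form);
% T_0, h_0, s_0 denote the dead (environmental) state, psi the flow exergy.
\begin{align}
  \dot{E}x_{\mathrm{dest}} &= \sum_k \Bigl(1 - \frac{T_0}{T_k}\Bigr)\dot{Q}_k
    - \dot{W}_{\mathrm{cv}}
    + \sum_{\mathrm{in}} \dot{m}\,\psi - \sum_{\mathrm{out}} \dot{m}\,\psi \\
  \psi &= (h - h_0) - T_0\,(s - s_0)
\end{align}
```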
9351 An Approach for Blind Source Separation using the Sliding DFT and Time Domain Independent Component Analysis
Authors: Koji Yamanouchi, Masaru Fujieda, Takahiro Murakami, Yoshihisa Ishida
Abstract:
The "cocktail party problem" is well known as one of the human auditory abilities: we can recognize the specific sound that we want to listen to even if many undesirable sounds or noises are mixed in. Blind source separation (BSS) based on independent component analysis (ICA) is one of the methods by which we can separate a specific signal from mixed signals under simple hypotheses. In this paper, we propose an online approach to blind source separation using the sliding DFT and time-domain independent component analysis. The proposed method can reduce the calculation complexity in comparison with conventional methods, and can be applied to parallel processing by using digital signal processors (DSPs) and similar hardware. We evaluate this method and show its availability.
Keywords: Cocktail party problem, Blind Source Separation (BSS), independent component analysis, sliding DFT, online processing.
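The ICA stage of such a separation can be sketched as below with FastICA on a synthetic two-source mixture; the paper's sliding-DFT front end and online processing are not reproduced, and the signals and mixing matrix are placeholders.

```python
# Hypothetical sketch of the ICA stage only: unmix two synthetic sources from
# two observed mixtures with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4000)
sources = np.c_[np.sin(2 * np.pi * 5 * t),            # signal we want to keep
                np.sign(np.sin(2 * np.pi * 13 * t))]   # interfering signal
mixing = np.array([[1.0, 0.6],
                   [0.5, 1.0]])
observed = sources @ mixing.T                          # two-microphone mixture

estimated = FastICA(n_components=2, random_state=0).fit_transform(observed)
print(estimated.shape)                                 # (4000, 2) separated signals
```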
9350 3D Face Recognition Using Modified PCA Methods
Authors: Omid Gervei, Ahmad Ayatollahi, Navid Gervei
Abstract:
In this paper we present an approach for 3D face recognition based on extracting the principal components of range images by utilizing modified PCA methods, namely 2DPCA and bidirectional 2DPCA, also known as (2D)²PCA. A preprocessing stage was implemented to smooth the images using median and Gaussian filtering. In the normalization stage we locate the nose tip, place it at the center of the images, and then crop each image to a standard size of 100×100. In the face recognition stage we extract the principal components of each image using both 2DPCA and (2D)²PCA. Finally, we use the Euclidean distance to measure the minimum distance between a given test image and the training images in the database. We also compare the results of using both methods. The best result achieved by experiments on a public face database is a face recognition rate of 83.3 percent for a random facial expression.
Keywords: 3D face recognition, 2DPCA, (2D)²PCA, Range image.
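A minimal NumPy sketch of plain 2DPCA (the unidirectional variant; the bidirectional (2D)²PCA extension and the range-image preprocessing are omitted) is given below; random images stand in for the 100×100 cropped faces and the number of projection axes is an assumption.

```python
# Hypothetical sketch of 2DPCA: build the image covariance matrix from the
# centred image stack and project every image onto its leading eigenvectors.
import numpy as np

rng = np.random.default_rng(0)
images = rng.random((50, 100, 100))                 # 50 face range images
mean_img = images.mean(axis=0)
centered = images - mean_img

# image covariance matrix G = (1/M) * sum_i (A_i - mean)^T (A_i - mean)
G = np.einsum("nij,nik->jk", centered, centered) / len(images)

eigvals, eigvecs = np.linalg.eigh(G)                # ascending eigenvalues
X = eigvecs[:, ::-1][:, :10]                        # top 10 projection axes (100 x 10)

features = centered @ X                             # each image -> 100 x 10 feature matrix
print(features.shape)                               # (50, 100, 10)
```

Recognition then amounts to comparing these feature matrices with a Euclidean (e.g. Frobenius-norm) distance to the training images, as the abstract describes.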
9349 Theoretical Considerations for Software Component Metrics
Authors: V. Lakshmi Narasimhan, Bayu Hendradjaya
Abstract:
We have defined two suites of metrics, which cover static and dynamic aspects of component assembly. The static metrics measure the complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined to form a Triangular Metric, which can be used to classify the type and nature of applications. Dynamic metrics are collected during the runtime of a complete application; they are useful for identifying super-components and for evaluating the degree of utilisation of the various components. In this paper both static and dynamic metrics are evaluated using Weyuker's set of properties. The result shows that the metrics provide a valid means to measure issues in component assembly. We relate our metrics suite to McCall's Quality Model and illustrate their impact on product quality and on the management of component-based product development.
Keywords: Component Assembly, Component Based Software Engineering, CORBA Component Model, Software Component Metrics.
9348 Knowledge Management Factors Affecting the Level of Commitment
Authors: Abbas Keramati, Abtin Boostani, Mohammad Jamal Sadeghi
Abstract:
This paper examines the influence of knowledge management factors on organizational commitment for employees in the oil and gas drilling industry of Iran. We determine which knowledge factors have the greatest impact on personnel loyalty and commitment to the organization, using data collected from a survey of over 300 full-time personnel working in three large companies active in the oil and gas drilling industry of Iran. To specify the effect of knowledge factors on the organizational commitment of the personnel in the studied organizations, Principal Component Analysis (PCA) is used. The findings of our study show that the factor "learning and perception by personnel of the value of knowledge within the organization", comprising knowledge and expertise, in-service training, the value of knowledge and the application of individuals' knowledge in the organization, has the greatest impact on organizational commitment. After this factor, the factors "existence of knowledge and a knowledge-sharing environment in the organization", "existence of potential knowledge exchange in the organization", and "organizational knowledge level" have the greatest impact on the organizational commitment of personnel, respectively.
Keywords: Knowledge management, organizational commitment, loyalty, drilling industry, principal component analysis.
9347 Autonomously Determining the Parameters for SVDD with RBF Kernel from a One-Class Training Set
Authors: Andreas Theissler, Ian Dear
Abstract:
The one-class support vector machine "support vector data description" (SVDD) is an ideal approach for anomaly or outlier detection. However, for the applicability of SVDD in real-world applications, ease of use is crucial. The results of SVDD are largely determined by the choice of the regularisation parameter C and the kernel parameter of the widely used RBF kernel. While for two-class SVMs the parameters can be tuned using cross-validation based on the confusion matrix, for a one-class SVM this is not possible, because only true positives and false negatives can occur during training. This paper proposes an approach to find the optimal set of parameters for SVDD solely based on a training set from one class and without any user parameterisation. Results on artificial and real data sets are presented, underpinning the usefulness of the approach.
Keywords: Support vector data description, anomaly detection, one-class classification, parameter tuning.
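To illustrate the tuning setting (but not the paper's selection criterion), the sketch below scans the RBF kernel width of a one-class SVM, scikit-learn's ν-formulation being closely related to SVDD, and reports how much of its own one-class training set each model rejects; the data, grid and ν value are assumptions.

```python
# Hypothetical sketch: scan gamma for a one-class SVM trained on one-class
# data only and report the self-rejection rate for each candidate.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
train = rng.normal(size=(500, 2))                    # one-class ("normal") training data

for gamma in [0.01, 0.1, 1.0, 10.0]:
    model = OneClassSVM(kernel="rbf", nu=0.05, gamma=gamma).fit(train)
    reject = np.mean(model.predict(train) == -1)     # fraction flagged as outliers
    print(f"gamma={gamma:5.2f}  rejects {reject:.1%} of its own training set")
```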
9346 Multi-view Description of Real-Time Systems' Architecture
Authors: A. Bessam, M. T. Kimour
Abstract:
Real-time embedded systems should benefit from component-based software engineering to handle complexity and deal with dependability. In these systems, applications should not only be logically correct but also behave within time windows. However, among current component-based software engineering approaches, few component models handle time properties in a manner that allows efficient analysis and checking at the architectural level. In this paper, we present a meta-model for component-based software description that integrates timing issues. To achieve a complete functional model of software components, our meta-model focuses on four functional aspects: interface, static behavior, dynamic behavior, and interaction protocol. With each aspect we have explicitly associated a time model. Such a time model can be used to check a component's design against certain properties and to compute the timing properties of component assemblies.
Keywords: Real-time systems, software architecture, software component, dependability, time properties, ADL, metamodeling.
9345 Hydrodynamic Analysis of Reservoir Due to Vertical Component of Earthquake Using an Analytical Solution
Authors: M. Pasbani Khiavi, M. A. Ghorbani
Abstract:
This paper presents an analytical solution for obtaining a reliable estimate of the hydrodynamic pressure on gravity dams induced by the vertical component of an earthquake when solving the fluid-dam interaction problem. The analytical technique is presented for the calculation of earthquake-induced hydrodynamic pressure in the reservoir of gravity dams, allowing for water compressibility and wave absorption at the reservoir bottom. This new analytical solution can take into account the effect of the bottom material on the seismic response of gravity dams. It is concluded that, because the vertical component of ground motion causes significant hydrodynamic forces in the horizontal direction on a vertical upstream face, the response to the vertical component of ground motion is of special importance in the analysis of concrete gravity dams subjected to earthquakes.
Keywords: Dam, Reservoir, Analytical solution, Vertical component, Earthquake
9344 Certain Data Dimension Reduction Techniques for Application with ANN based MCS for Study of High Energy Shower
Authors: Gitanjali Devi, Kandarpa Kumar Sarma, Pranayee Datta, Anjana Kakoti Mahanta
Abstract:
Cosmic showers, originating in space, generate secondary particles after entering the earth's atmosphere, forming what is called an Extensive Air Shower (EAS). Detection and analysis of EAS and similar high energy particle showers involve a plethora of experimental setups with certain constraints, for which soft-computational tools like Artificial Neural Networks (ANNs) can be adopted. The optimality of ANN classifiers can be enhanced further by the use of a Multiple Classifier System (MCS) and certain data-dimension reduction techniques. This work describes the performance of certain data-dimension reduction techniques, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Self Organizing Map (SOM) approximators, for application with an MCS formed using a Multi Layer Perceptron (MLP), a Recurrent Neural Network (RNN) and a Probabilistic Neural Network (PNN). The data inputs are obtained from an array of detectors placed in a circular arrangement resembling a practical detector grid, which have a higher dimension and greater correlation among themselves. The PCA, ICA and SOM blocks reduce the correlation and generate a form suitable for real-time practical applications for prediction of the primary energy and the location of an EAS from density values captured using detectors in a circular grid.
Keywords: EAS, Shower, Core, ANN, Location.
9343 Free Fatty Acid Assessment of Crude Palm Oil Using a Non-Destructive Approach
Authors: Siti Nurhidayah Naqiah Abdull Rani, Herlina Abdul Rahim, Rashidah Ghazali, Noramli Abdul Razak
Abstract:
Near infrared (NIR) spectroscopy has always been of great interest in the food and agriculture industries. The development of prediction models has facilitated the estimation process in recent years. In this study, 110 crude palm oil (CPO) samples were used to build a free fatty acid (FFA) prediction model. 60% of the collected data were used for training purposes and the remaining 40% used for testing. The visible peaks on the NIR spectrum were at 1725 nm and 1760 nm, indicating the existence of the first overtone of C-H bands. Principal component regression (PCR) was applied to the data in order to build this mathematical prediction model. The optimal number of principal components was 10. The results showed R2=0.7147 for the training set and R2=0.6404 for the testing set.
Keywords: Palm oil, fatty acid, NIRS, regression.
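Principal component regression as described above, PCA followed by ordinary least squares with 10 components and a 60/40 split, can be sketched as follows; random spectra stand in for the 110 NIR scans and the wavelength count and surrogate FFA target are assumptions.

```python
# Hypothetical sketch of principal component regression (PCR) for an
# FFA-from-spectrum prediction model.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(110, 700))                          # absorbance at 700 wavelengths
y = 0.5 * X[:, 100] + rng.normal(scale=0.1, size=110)    # surrogate FFA content

X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.6, random_state=0)
pcr = make_pipeline(StandardScaler(), PCA(n_components=10), LinearRegression())
pcr.fit(X_tr, y_tr)
print(f"R^2 train = {r2_score(y_tr, pcr.predict(X_tr)):.3f}, "
      f"R^2 test = {r2_score(y_te, pcr.predict(X_te)):.3f}")
```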
9342 Using PFA in Feature Analysis and Selection for H.264 Adaptation
Authors: Nora A. Naguib, Ahmed E. Hussein, Hesham A. Keshk, Mohamed I. El-Adawy
Abstract:
Classification of video sequences based on their content is a vital process for adaptation techniques. It helps to decide which adaptation technique best fits the resource reduction requested by the client. In this paper, we used the principal feature analysis algorithm to select a reduced subset of video features. The main idea is to select only one feature from each class, based on the similarities between the features within that class. Our results showed that, using this feature reduction technique, the remaining source video features can be omitted from future classification of video sequences.
Keywords: Adaptation, feature selection, H.264, Principal Feature Analysis (PFA)
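A commonly cited formulation of principal feature analysis, which may differ in detail from the authors' implementation, is sketched below: represent each feature by its loadings on the leading principal components, cluster those loading vectors, and keep the feature nearest each cluster centroid. Random data stand in for the H.264 video features, and the component and cluster counts are assumptions.

```python
# Hypothetical sketch of principal feature analysis (PFA) for feature selection.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                      # 300 sequences, 20 video features

q, n_keep = 5, 6                                    # retained PCs, features to keep
loadings = PCA(n_components=q).fit(X).components_.T     # shape (20, q): one row per feature
km = KMeans(n_clusters=n_keep, n_init=10, random_state=0).fit(loadings)

selected = []
for c in range(n_keep):
    members = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(loadings[members] - km.cluster_centers_[c], axis=1)
    selected.append(members[np.argmin(dists)])      # feature closest to the centroid
print("selected feature indices:", sorted(selected))
```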
Procedia APA BibTeX Chicago EndNote Harvard JSON MLA RIS XML ISO 690 PDF Downloads 16079341 Performance Assessment and Optimization of the After-Sale Networks
Authors: H. Izadbakhsh, M.Hour Ali, A. Amirkhani, A. Montazeri, M. Saberi
Abstract:
The after-sales activities are nowadays acknowledged as a relevant source of revenue, profit and competitive advantage in most manufacturing industries. Top and middle management, therefore, should focus on the definition of a structured business performance measurement system for the after-sales business. The paper aims at filling this gap: it presents an integrated methodology for after-sales network performance measurement and provides an empirical application to automotive case companies and their official service networks. This is the first study that presents an integrated multivariate approach for total assessment and improvement of after-sale services.
Keywords: Data Envelopment Analysis (DEA), Principal Component Analysis (PCA), Automotive companies, After-sale services.