Search results for: Independent Component Analysis
9470 Study of the Effect of the Contra-Rotating Component on the Performance of the Centrifugal Compressor
Authors: Van Thang Nguyen, Amelie Danlos, Richard Paridaens, Farid Bakir
Abstract:
This article presents a study of the effect of a contra-rotating component on the efficiency of centrifugal compressors. A contra-rotating centrifugal compressor (CRCC) is constructed from two independent rotors rotating in opposite directions, replacing the single rotor of a conventional centrifugal compressor (REF). To respect the geometrical parameters of the REF, the two rotors of the CRCC are designed from a single rotor geometry using the hub and shroud length ratio of the meridional contour. First, the first rotor is designed by choosing a value of the length ratio; the second rotor is then calculated to match the flow leaving the first rotor according to aerodynamic principles. In this study, four length ratios, 0.3, 0.4, 0.5, and 0.6, are used to create four configurations, CF1, CF2, CF3, and CF4, respectively. For comparison purposes, the circumferential velocity at the outlet of the REF and the CRCC is preserved, which means that the single rotor of the REF and the second rotor of the CRCC rotate at the same speed of 16,000 rpm. The speed of the first rotor in this case is chosen to be equal to that of the second rotor. CFD simulations are conducted to compare the performance of the CRCC and the REF under the same boundary conditions. The results show that configurations with a higher length ratio give a higher pressure rise, but at lower efficiency. An investigation over the entire operating range shows that CF1 is the best configuration in this case. In addition, the CRCC can improve both the pressure rise and the efficiency when the speed of each rotor is varied independently. The results of changing the first rotor speed show that, with a 130% speed increase, the pressure ratio rises by 8.7% while the efficiency remains stable at the flow rate of the design operating point.
Keywords: Centrifugal compressor, contra-rotating, rotor interaction, vacuum.
9469 Monitoring Blood Pressure Using Regression Techniques
Authors: Qasem Qananwah, Ahmad Dagamseh, Hiam AlQuran, Khalid Shaker Ibrahim
Abstract:
Blood pressure gives physicians deep insight into the cardiovascular system, and determining an individual's blood pressure is a standard clinical procedure for assessing cardiovascular problems. Conventional techniques to measure blood pressure (e.g., the cuff method) allow only a limited number of readings over a given period (e.g., every 5-10 minutes). Additionally, these systems disturb blood flow, impeding continuous blood pressure monitoring, especially in emergency cases or for critically ill persons. In this paper, the most important statistical features of the photoplethysmogram (PPG) signal were extracted to estimate blood pressure noninvasively. PPG signals from more than 40 subjects were measured and analyzed, and 12 features were extracted. The features were fed to principal component analysis (PCA) to find the most important independent features with the highest correlation with blood pressure. The results show that the mean stiffness index and the standard deviation of the beat-to-beat heart rate were the most important features. A model relating both features to Systolic Blood Pressure (SBP) and Diastolic Blood Pressure (DBP) was obtained using a statistical regression technique. Surface fitting is used to best fit the series of data, and the results show that the error in estimating the SBP is 4.95% and in estimating the DBP is 3.99%.
Keywords: Blood pressure, noninvasive optical system, PCA, continuous monitoring.
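A minimal sketch of this kind of pipeline is shown below, assuming a pre-extracted feature matrix; the feature values and SBP targets are synthetic placeholders, not the authors' dataset, and the second-order surface fit is one plausible reading of the "surface fitting" step.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Placeholder data: rows = subjects, columns = 12 PPG-derived statistical features.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 12))            # e.g. stiffness index, beat-to-beat HR statistics, ...
sbp = 110 + 10 * rng.normal(size=40)     # systolic blood pressure targets (mmHg), synthetic

# PCA keeps the two components that explain most of the feature variance.
pca = make_pipeline(StandardScaler(), PCA(n_components=2))
scores = pca.fit_transform(X)

# Surface fit: a second-order polynomial regression in the two retained components.
surface = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surface.fit(scores, sbp)
print("Mean relative error: %.2f%%" %
      (100 * np.mean(np.abs(surface.predict(scores) - sbp) / sbp)))
```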
9468 Native Language Identification with Cross-Corpus Evaluation Using Social Media Data: 'Reddit'
Authors: Yasmeen Bassas, Sandra Kuebler, Allen Riddell
Abstract:
Native Language Identification (NLI) is one of the growing subfields of Natural Language Processing (NLP). The task is mainly concerned with predicting the native language of an author from their writing in a second language. In this paper, we investigate the performance of two types of features, content-based versus content-independent, when they are evaluated on a different corpus (social media data from Reddit). In this NLI task, the models are trained on one corpus (TOEFL) and then evaluated on data from an external corpus (Reddit). Three classifiers are used: a baseline, a linear SVM, and logistic regression. Results show that content-based features are more accurate and robust than content-independent ones when tested both within corpus and across corpora.
Keywords: NLI, NLP, content-based features, content-independent features, social media corpus, ML.
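A sketch of the cross-corpus setup is given below, assuming word n-grams as a content-based feature set and character n-grams as a content-independent stand-in; the actual feature definitions, corpora, and baseline in the paper may differ, and the texts here are tiny placeholders.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

# Placeholder corpora: train on TOEFL-like essays, evaluate on Reddit-like posts.
toefl_texts = ["I am agree with this statement because ...",
               "The school have many students in every class ...",
               "In my country we usually eat together ...",
               "This essay discuss the topic of travel ..."]
toefl_labels = ["L1_Spanish", "L1_German", "L1_Spanish", "L1_German"]
reddit_texts = ["I am agree that the game was good", "The forum have nice people here"]
reddit_labels = ["L1_Spanish", "L1_German"]

feature_sets = {
    # Content-based: word unigrams/bigrams capture topic and vocabulary.
    "content-based": TfidfVectorizer(analyzer="word", ngram_range=(1, 2)),
    # Content-independent stand-in: character n-grams are less topic-dependent.
    "content-independent": TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
}

for name, vec in feature_sets.items():
    for clf in (LinearSVC(), LogisticRegression(max_iter=1000)):
        model = make_pipeline(vec, clf)
        model.fit(toefl_texts, toefl_labels)      # train within the TOEFL-like corpus
        pred = model.predict(reddit_texts)        # cross-corpus evaluation on Reddit-like data
        print(name, type(clf).__name__, accuracy_score(reddit_labels, pred))
```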
9467 Comparison of Different Data Acquisition Techniques for Shape Optimization Problems
Authors: Attila Vámosi, Tamás Mankovits, Dávid Huri, Imre Kocsis, Tamás Szabó
Abstract:
Non-linear FEM calculations are indispensable when important technical information, such as the operating performance of a rubber component, is required. For example, rubber bumpers built into air-spring structures may undergo large deformations under load, which in itself shows non-linear behavior. The changing contact area between the parts and the incompressibility of the rubber increase this non-linear behavior further. The material characterization of an elastomeric component is also a demanding engineering task. The shape optimization problem of rubber parts led to the study of FEM-based calculation processes. This type of problem has been posed and investigated by several authors. In this paper, the time demand of certain calculation methods is studied and possibilities for time reduction are presented.
Keywords: Rubber bumper, data acquisition, finite element analysis, support vector regression.
9466 Knowledge Management Factors Affecting the Level of Commitment
Authors: Abbas Keramati, Abtin Boostani, Mohammad Jamal Sadeghi
Abstract:
This paper examines the influence of knowledge management factors on organizational commitment among employees in the oil and gas drilling industry of Iran. We determine which knowledge factors have the greatest impact on personnel loyalty and commitment to the organization, using data collected from a survey of over 300 full-time personnel working in three large companies active in the Iranian oil and gas drilling industry. To quantify the effect of knowledge factors on the organizational commitment of personnel in the studied organizations, Principal Component Analysis (PCA) is used. Our findings show that factors such as knowledge and expertise, in-service training, the value of knowledge, and the application of individuals' knowledge in the organization, grouped as the factor "learning and perception of personnel of the value of knowledge within the organization", have the greatest impact on organizational commitment. This factor is followed, in order of impact, by "existence of knowledge and a knowledge-sharing environment in the organization", "existence of potential knowledge exchange in the organization", and "organizational knowledge level".
Keywords: Knowledge management, organizational commitment, loyalty, drilling industry, principal component analysis.
9465 Measuring Process Component Design on Achieving Managerial Goals
Authors: Eakong Atiptamvaree, Twittie Senivongse
Abstract:
Process-oriented software development is a software development paradigm in which the software design is modeled by a business process, which is in turn translated into a process execution language for execution. The building blocks of this paradigm are software units that are composed so that they work according to the flow of the business process. This paradigm still exhibits the characteristics of applications built with traditional software component technology. This paper discusses an approach that applies a traditional technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, and they can be reused to design the business processes of other application domains. The decomposition considers five managerial goals, namely cost effectiveness, ease of assembly, customization, reusability, and maintainability. The paper presents how to design or decompose process components from a business process model and how to measure technical features of the design that affect the managerial goals. A comparison between the measurement values from different designs can tell which process component design is more appropriate for the managerial goals that have been set. The proposed approach can be applied in a Web Services environment, which accommodates process-oriented software development.
Keywords: Business Process Model, Managerial Goals, Process Component.
9464 HClO4-SiO2 Nanoparticles as an Efficient Catalyst for Three-Component Synthesis of Triazolo[1,2-a]Indazole-Triones
Authors: Hossein Anaraki-Ardakani, Tayebe Heidari-Rakati
Abstract:
An environmentally benign protocol for the one-pot, three-component synthesis of triazolo[1,2-a]indazole-1,3,8-trione derivatives by condensation of dimedone, urazole, and aromatic aldehydes, catalyzed by HClO4/SiO2 nanoparticles as an eco-friendly catalyst with high catalytic activity and reusability, at 100 °C under solvent-free conditions is reported. The reaction proceeds to completion within 20-30 min in 77-86% yield.
Keywords: One-pot reaction, Dimedone, Triazoloindazole, Urazole.
9463 Analysis of Bio-Oil Produced by Pyrolysis of Coconut Shell
Authors: D. S. Fardhyanti, A. Damayanti
Abstract:
The utilization of biomass as a source of new and renewable energy is being pursued. One of the technologies to convert biomass into an energy source is pyrolysis, which converts biomass into more valuable products such as bio-oil. Bio-oil is a liquid produced by steam condensation during the pyrolysis of coconut shells. The components of a coconut shell, e.g., hemicellulose, cellulose, and lignin, are oxidized to phenolic compounds, the main components of the bio-oil. The phenolic compounds in bio-oil are corrosive; they cause various difficulties in combustion systems because of high viscosity, low calorific value, corrosiveness, and instability. Phenolic compounds are nevertheless very valuable: phenol is used as the main component in the manufacture of antiseptics, disinfectants (known as Lysol), and deodorizers. The experiments were carried out at atmospheric pressure in a pyrolysis reactor at temperatures ranging from 300 °C to 350 °C, with a heating rate of 10 °C/min and a holding time of 1 hour at the pyrolysis temperature. Gas Chromatography-Mass Spectroscopy (GC-MS) was used to analyze the bio-oil components. The obtained bio-oil has a viscosity of 1.46 cP, a density of 1.50 g/cm3, a calorific value of 16.9 MJ/kg, and a molecular weight of 1996.64. GC-MS analysis showed that the bio-oil contained phenol (40.01%), ethyl ester (37.60%), 2-methoxy-phenol (7.02%), furfural (5.45%), formic acid (4.02%), 1-hydroxy-2-butanone (3.89%), and 3-methyl-1,2-cyclopentanedione (2.01%).
Keywords: Bio-oil, pyrolysis, coconut shell, phenol, gas chromatography-mass spectroscopy.
9462 Loop Back Connected Component Labeling Algorithm and Its Implementation in Detecting Face
Authors: A. Rakhmadi, M. S. M. Rahim, A. Bade, H. Haron, I. M. Amin
Abstract:
In this study, a Loop Back Algorithm for connected component labeling for detecting objects in a digital image is presented. The approach uses a loop back connected component labeling algorithm that helps the system distinguish detected objects according to their labels. Unlike whole-window scanning, this technique reduces the search time for locating an object by focusing on suspected objects selected by certain predefined features. In this study, the approach was also implemented in a face detection system. Face detection is an active research area, since many devices and systems require detecting faces for various purposes. The input can be a still image or video, so the subprocesses of such a system have to be simple, efficient, and accurate to give good results.
Keywords: Image processing, connected component labeling, face detection.
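The abstract does not publish the loop back algorithm itself, so the sketch below only illustrates the underlying idea with generic 8-connected component labeling (via scipy) followed by a simple size/aspect filter on the resulting blobs; the mask and thresholds are placeholders.

```python
import numpy as np
from scipy import ndimage

# Placeholder binary mask: 1 where a skin-color (or other) detector fired, 0 elsewhere.
mask = np.zeros((120, 160), dtype=np.uint8)
mask[30:70, 40:80] = 1          # a candidate face blob
mask[90:100, 120:140] = 1       # a small spurious blob

# Label 8-connected components; each blob receives a distinct integer label.
labels, n = ndimage.label(mask, structure=np.ones((3, 3)))

# Keep only components whose size and aspect ratio are plausible for a face.
for sl in ndimage.find_objects(labels):
    h, w = sl[0].stop - sl[0].start, sl[1].stop - sl[1].start
    if h * w > 400 and 0.5 < w / h < 2.0:
        print("candidate face box:", sl)
```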
9461 Implementation of a Serializer to Represent PHP Objects in the Extensible Markup Language
Authors: Lidia N. Hernández-Piña, Carlos R. Jaimez-González
Abstract:
Interoperability in distributed systems is an important feature that refers to the communication of two applications written in different programming languages. This paper presents a serializer and a de-serializer of PHP objects to and from XML, implemented as an independent library written in PHP. The XML generated by this serializer is independent of the programming language and can be used by the existing Web Objects in XML (WOX) serializers and de-serializers, which allows interoperability with other object-oriented programming languages.
Keywords: Interoperability, PHP object serialization, PHP to XML, web objects in XML, WOX.
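As a loose illustration of language-neutral object-to-XML serialization (written in Python rather than PHP, and not following the WOX format or the authors' library), a sketch might map each object field to an element carrying a type attribute:

```python
import xml.etree.ElementTree as ET

def to_xml(obj, tag=None):
    """Serialize a plain object's fields into a language-neutral XML element."""
    elem = ET.Element(tag or type(obj).__name__)
    for name, value in vars(obj).items():
        child = ET.SubElement(elem, name, attrib={"type": type(value).__name__})
        child.text = str(value)
    return elem

class Book:
    def __init__(self, title, year):
        self.title, self.year = title, year

print(ET.tostring(to_xml(Book("Design Patterns", 1994)), encoding="unicode"))
# <Book><title type="str">Design Patterns</title><year type="int">1994</year></Book>
```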
9460 Chaotic Properties of Hemodynamic Response in Functional Near Infrared Spectroscopic Measurement of Brain Activity
Authors: Ni Ni Soe, Masahiro Nakagawa
Abstract:
Functional near infrared spectroscopy (fNIRS) is a practical, non-invasive optical technique to detect characteristics of the hemoglobin density dynamics during functional activation of the cerebral cortex. In this paper, fNIRS measurements were made over the motor cortex at position C4 of the international 10-20 system. Three subjects, aged 23-30 years, participated in the experiment. The aim of this paper was to evaluate the effects of different motor activation tasks on the hemoglobin density dynamics of the fNIRS signal. The chaotic concept, based on deterministic dynamics, is an important feature in biological signal analysis. This paper employs chaotic properties, a method of nonlinear analysis, to analyze and quantify the chaotic behavior in the time series of the hemoglobin dynamics for the various motor imagery tasks in the fNIRS signal. Hemoglobin density in the human cortex usually changes slowly in time, and noise caused by various factors is inevitably included in the signal, so principal component analysis (PCA) is utilized to remove the high-frequency components. The phase space is reconstructed, and the Lyapunov spectrum and Lyapunov dimensions are evaluated. From the experimental results, it can be concluded that the signals measured by fNIRS are chaotic.
Keywords: Chaos, hemoglobin, Lyapunov spectrum, motor imagery, near infrared spectroscopy (NIRS), principal component analysis (PCA).
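A minimal sketch of the preprocessing steps named here, phase-space reconstruction by time-delay embedding and PCA-based removal of high-frequency noise, is given below on a synthetic signal; the Lyapunov spectrum computation itself is not reproduced, and the embedding dimension and delay are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

def delay_embed(x, dim=5, tau=4):
    """Reconstruct the phase space of a scalar series by time-delay embedding."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Placeholder hemoglobin-density time series (real data would come from fNIRS).
t = np.linspace(0, 60, 3000)
signal = np.sin(0.5 * t) + 0.2 * np.random.default_rng(1).normal(size=t.size)

embedded = delay_embed(signal)
# PCA denoising: keep the dominant components and discard high-frequency noise.
pca = PCA(n_components=2)
clean = pca.inverse_transform(pca.fit_transform(embedded))
print("variance retained: %.1f%%" % (100 * pca.explained_variance_ratio_.sum()))
```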
9459 An Approach for Reducing the Computational Complexity of LAMSTAR Intrusion Detection System using Principal Component Analysis
Authors: V. Venkatachalam, S. Selvan
Abstract:
The security of computer networks plays a strategic role in modern computer systems. Intrusion Detection Systems (IDS) act as the 'second line of defense' placed inside a protected network, looking for known or potential threats in network traffic and/or audit data recorded by hosts. We developed an Intrusion Detection System using a LAMSTAR neural network to learn patterns of normal and intrusive activities and to classify observed system activities, and compared the performance of the LAMSTAR IDS with other classification techniques using 5 classes of KDDCup99 data. The LAMSTAR IDS gives better performance at the cost of high computational complexity, training time, and testing time when compared to the other classification techniques (binary tree, RBF, and Gaussian mixture classifiers). We further reduced the computational complexity of the LAMSTAR IDS by reducing the dimension of the data using principal component analysis, which in turn reduces the training and testing time with almost the same performance.
Keywords: Binary Tree Classifier, Gaussian Mixture, Intrusion Detection System, LAMSTAR, Radial Basis Function.
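The sketch below only illustrates the dimensionality-reduction idea, measuring how PCA affects training and testing time; LAMSTAR is not available in common libraries, so a small MLP stands in for it, and the KDDCup99-style data here are synthetic placeholders.

```python
import time
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Placeholder for KDDCup99-style data: 41 numeric features, 5 traffic classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 41))
y = rng.integers(0, 5, size=2000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for n_comp in (None, 10):        # None = all 41 features, 10 = PCA-reduced
    steps = [StandardScaler()]
    if n_comp:
        steps.append(PCA(n_components=n_comp))
    steps.append(MLPClassifier(hidden_layer_sizes=(32,), max_iter=200))
    model = make_pipeline(*steps)
    t0 = time.perf_counter(); model.fit(X_tr, y_tr); train_t = time.perf_counter() - t0
    t0 = time.perf_counter(); acc = model.score(X_te, y_te); test_t = time.perf_counter() - t0
    print(f"components={n_comp}: train={train_t:.2f}s, test={test_t:.2f}s, acc={acc:.2f}")
```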
9458 An Automatic Pipeline Monitoring System Based on PCA and SVM
Abstract:
This paper proposes a novel system for monitoring the health of underground pipelines. Some of these pipelines transport dangerous contents, and any damage might have catastrophic consequences. However, most of this damage is unintentional and is usually a result of surrounding construction activities; in order to prevent such damage, monitoring systems are indispensable. This paper focuses on acoustically recognizing road cutters, since they precede most construction activities in modern cities. Acoustic recognition can be achieved by installing a distributed computing sensor network along the pipelines and using smart sensors to "listen" for potential threats and raise an alarm when a real threat is detected. For efficient pipeline monitoring, a novel monitoring approach is proposed. Principal Component Analysis (PCA) was studied and applied: the eigenvalues were regarded as a signature that characterizes a sound sample and were thus used as the feature vector for sound recognition, while the denoising ability of PCA makes the approach robust to noise interference. A one-class SVM was used as the classifier. On-site experiments show that the proposed PCA- and SVM-based acoustic recognition system is effective, with a low tendency to raise false alarms.
Keywords: One-class SVM, pipeline monitoring system, principal component analysis, sound recognition, third party damage.
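One plausible reading of "eigenvalues as a sound signature" is sketched below: the eigenvalues of the frame covariance matrix of each clip form the feature vector, and a one-class SVM is trained on cutter sounds only; the audio here is synthetic and the frame length and feature count are assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def eig_signature(clip, frame=256):
    """Eigenvalues of the frame covariance matrix as a compact sound signature."""
    n = len(clip) // frame
    frames = clip[: n * frame].reshape(n, frame)
    cov = np.cov(frames, rowvar=False)
    eig = np.sort(np.linalg.eigvalsh(cov))[::-1]
    return np.log10(eig[:10] + 1e-12)        # top 10 log-eigenvalues

rng = np.random.default_rng(0)
# Placeholder audio: "cutter" clips share a tonal component, "other" clips do not.
cutter = [np.sin(0.3 * np.arange(4096)) + 0.3 * rng.normal(size=4096) for _ in range(20)]
other = [rng.normal(size=4096) for _ in range(5)]

X_train = np.array([eig_signature(c) for c in cutter])
clf = OneClassSVM(gamma="scale", nu=0.1).fit(X_train)   # learn the cutter sound class
for clip in other:
    print("alarm" if clf.predict([eig_signature(clip)])[0] == 1 else "no alarm")
```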
9457 The Mechanism Underlying Empathy-Related Helping Behavior: An Investigation of Empathy-Attitude-Action Model
Authors: Wan-Ting Liao, Angela K. Tzeng
Abstract:
Empathy has been an important issue in psychology and education, as well as in cognitive neuroscience. Empathy has two major components, cognitive and emotional: the cognitive component refers to the ability to understand others' perspectives, thoughts, and actions, whereas the emotional component refers to understanding how others feel. Empathy can be induced, attitude can then be changed, and with enough attitude change, helping behavior can occur. This finding leads us to two questions: is attitude change really necessary for prosocial behavior, and what roles do cognitive and affective empathy play? For the second question, participants with different psychopathic personality (PP) traits are critical, because high-PP people have been found to suffer only an affective empathy deficit; their cognitive empathy shows no significant difference from the control group. 132 college students voluntarily participated in the current three-stage study. Stage 1 collected basic information, including the Interpersonal Reactivity Index (IRI), the Psychopathic Personality Inventory-Revised (PPI-R), an Attitude Scale, a Visual Analogue Scale (VAS), and demographic data. Stage 2 was for empathy induction with three controversial scenarios, namely domestic violence, depression with a suicide attempt, and an ex-offender. Participants read all three stories and then rewrote the stories from one of two perspectives (empathetic vs. objective). They then completed the VAS and Attitude Scale one more time to record their post-induction attitude and emotional status. Three IVs were introduced for data analysis: PP (high vs. low), responsibility (whether or not the character is responsible for what happened), and perspective-taking (empathic vs. objective). Stage 3 was for the action: participants were instructed to freely use the 17 tokens they received as donations, and they were debriefed and interviewed at the end of the experiment. The major findings were that people with higher empathy tend to take more action in helping, and that attitude change is not necessary for prosocial behavior. The controversy of the scenarios and how familiar participants are with the target groups play very important roles. Finally, people with high PP tend to show more public prosocial behavior due to their affective empathy deficit. Pre-existing values and beliefs, as well as recent dramatic social events, seem to have a big impact and may reduce the effect of the independent variables (IVs) in our paradigm.
Keywords: Affective empathy, attitude, cognitive empathy, prosocial behavior, psychopathic traits.
9456 Combined Feature Based Hyperspectral Image Classification Technique Using Support Vector Machines
Authors: K. Kavitha, S. Arivazhagan
Abstract:
A spatial classification technique incorporating a state-of-the-art feature extraction algorithm is proposed in this paper for classifying the heterogeneous classes present in hyperspectral images. Classification accuracy can be improved only if both the feature extraction and the classifier selection are appropriate. As the classes in hyperspectral images are assumed to have different textures, textural classification is adopted. Run-length feature extraction is employed along with principal components and independent components. A hyperspectral image of the Indiana site taken by AVIRIS is used for the experiment. Among the original 220 bands, a subset of 120 bands is selected. The Gray Level Run Length Matrix (GLRLM) is calculated for forty of the selected bands, and from the GLRLMs the run-length features for individual pixels are calculated. Principal components are calculated for another forty bands, and independent components for the remaining forty bands. As principal and independent components have the ability to represent the textural content of pixels, they are treated as features. The combination of run-length features, principal components, and independent components forms the combined features used for classification. An SVM with a binary hierarchical tree is used to classify the hyperspectral image. Results are validated with ground truth and accuracies are calculated.
Keywords: Multi-class, Run Length features, PCA, ICA, classification and Support Vector Machines.
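A simplified sketch of the combined-feature idea is shown below, assuming a synthetic cube: PCA features from one band group and ICA features from another are concatenated and fed to an SVM. The GLRLM run-length features are omitted for brevity, and a standard multi-class SVC stands in for the binary hierarchical tree SVM.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Placeholder hyperspectral cube: 50x50 pixels, 80 bands, 4 ground-truth classes.
rng = np.random.default_rng(0)
cube = rng.normal(size=(50, 50, 80))
labels = rng.integers(0, 4, size=(50, 50))
pixels = cube.reshape(-1, 80)

# Spectral features from two band groups (run-length texture features omitted here).
pca_feat = PCA(n_components=5).fit_transform(pixels[:, :40])                       # first 40 bands
ica_feat = FastICA(n_components=5, random_state=0).fit_transform(pixels[:, 40:])   # next 40 bands
combined = np.hstack([pca_feat, ica_feat])                                         # combined feature vector

X_tr, X_te, y_tr, y_te = train_test_split(combined, labels.ravel(),
                                          test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)   # one-vs-one SVM instead of a binary hierarchical tree
print("accuracy:", clf.score(X_te, y_te))
```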
9455 Neuromarketing: Discovering the Somatic Marker in the Consumer's Brain
Authors: Mikel Alonso López, María Francisca Blasco López, Víctor Molero Ayala
Abstract:
The present study explains the somatic marker theory of Antonio Damasio, which indicates that when making a decision, stored images of possible future scenarios (future memory) allow people to feel for a moment what would happen if they made a particular choice, and how that outcome is emotionally marked. This process can be conscious or unconscious. The development of new neuromarketing techniques such as functional magnetic resonance imaging (fMRI) brings a greater understanding of how the brain functions and of consumer behavior. The results observed in different studies using fMRI suggest that the somatic marker and future memories influence the decision-making process, adding a positive or negative emotional component to the options. This would mean that all decisions involve a present emotional component, with a rational cost-benefit analysis that can be performed later.
Keywords: Emotions, decision making, somatic marker, consumer's brain.
9454 Hydrochemical Contamination Profiling and Spatial-Temporal Mapping with the Support of Multivariate and Cluster Statistical Analysis
Authors: S. Barbosa, M. Pinto, J. A. Almeida, E. Carvalho, C. Diamantino
Abstract:
The aim of this work was to test a methodology able to generate spatial-temporal maps that synthesize simultaneously the trends of distinct hydrochemical indicators in an old radium-uranium tailings dam deposit. Dimensionality reduction derived from principal component analysis and subsequent data aggregation derived from cluster analysis allow the identification of distinct hydrochemical behavioral profiles and the generation of synthetic evolutionary hydrochemical maps.
Keywords: Contamination plume migration, K-means of PCA scores, groundwater and mine water monitoring, spatial-temporal hydrochemical trends.
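A minimal sketch of the "K-means of PCA scores" step named in the keywords is shown below; the monitoring table and indicator values are synthetic placeholders, and the numbers of components and clusters are arbitrary assumptions.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Placeholder monitoring table: rows = (well, sampling date), columns = hydrochemical
# indicators such as pH, sulphate, uranium, radium (values are synthetic).
rng = np.random.default_rng(0)
samples = rng.normal(size=(120, 8))

# Reduce the correlated indicators to a few principal-component scores ...
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(samples))
# ... then aggregate samples into hydrochemical behavioral profiles with K-means.
profiles = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(profiles))   # number of samples assigned to each profile
```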
9453 A Robust Method for Encrypted Data Hiding Technique Based on Neighborhood Pixels Information
Authors: Ali Shariq Imran, M. Younus Javed, Naveed Sarfraz Khattak
Abstract:
This paper presents a novel method for data hiding based on neighborhood pixel information to calculate the number of bits that can be used for substitution, together with a modified least significant bits (LSB) technique for data embedding. The modified solution is independent of the nature of the data to be hidden and gives correct results with unnoticeable image degradation. To find the number of bits that can be used for data hiding, the technique uses the green component of the image, as it is less sensitive to the human eye, making it practically impossible for a human observer to tell whether the image carries hidden data. The application further encrypts the data using a custom-designed algorithm before embedding the bits into the image for additional security. The overall process consists of three main modules, namely embedding, encryption, and extraction.
Keywords: Data hiding, image processing, information security, steganography.
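For illustration only, the sketch below shows plain fixed-depth LSB substitution in the green channel; the paper's neighborhood-based bit allocation and the custom encryption step are not reproduced, and the image and payload are synthetic.

```python
import numpy as np

def embed_lsb_green(image, payload_bits, depth=2):
    """Hide bits in the lowest `depth` bits of the green channel (fixed-depth LSB)."""
    out = image.copy()
    green = out[:, :, 1].ravel()
    for i in range(0, len(payload_bits), depth):
        value = int("".join(map(str, payload_bits[i:i + depth])), 2)
        idx = i // depth
        green[idx] = ((green[idx] >> depth) << depth) | value
    out[:, :, 1] = green.reshape(out.shape[:2])
    return out

def extract_lsb_green(image, n_bits, depth=2):
    """Recover the hidden bits from the green channel."""
    green = image[:, :, 1].ravel()
    bits = []
    for idx in range(n_bits // depth):
        value = int(green[idx]) & ((1 << depth) - 1)
        bits.extend(int(b) for b in format(value, f"0{depth}b"))
    return bits

img = np.random.default_rng(0).integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
secret = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed_lsb_green(img, secret)
assert extract_lsb_green(stego, len(secret)) == secret
```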
9452 Effect of Scarp Topography on Seismic Ground Motion
Authors: Haiping Ding, Rongchu Zhu, Zhenxia Song
Abstract:
Local irregular topography has a great impact on earthquake ground motion. For scarp topography, the extent and scope of the terrain's influence on the ground motion above and below the scarp are discussed using numerical simulation for different vertically incident SV waves. The results show that: (1) the amplification factor of the region above the scarp is greater than that of the free surface, while the amplification factor of the region below the scarp is less than that of the free surface; (2) when the slope angle increases, the amplification factors of the upper side also increase for the x component, while those of the lower side decrease; for the z component, both the upper- and lower-side amplification factors increase; (3) when the slope angle changes, the influence scope on the lower side is almost unchanged, but on the upper side it becomes slightly larger with increasing slope angle; (4) due to the presence of the scarp, a z-component ground motion appears at the surface; its amplification factor increases for larger slope angles, and the peaks of the surface responses are related to the incident waves, whereas the input wave has little effect on the x-component amplification factors.
Keywords: Scarp topography, ground motion, amplification factor, vertically incident wave.
9451 Face Localization and Recognition in Varied Expressions and Illumination
Authors: Hui-Yu Huang, Shih-Hang Hsu
Abstract:
In this paper, we propose a robust scheme for face alignment and recognition under various influences. For face representation, illumination and variable expressions are important factors affecting the accuracy of facial localization and face recognition. To address these factors, we propose a robust two-phase approach. In the first phase, face images are preprocessed by the proposed illumination normalization method, and the locations of facial features can be fitted more efficiently and quickly based on the proposed image blending. In addition, based on template matching, we improve the active shape model (called IASM) to locate the face shape more precisely, which raises the recognition rate in the next phase. The second phase performs feature extraction using principal component analysis and face recognition using support vector machine classifiers. The results show that the proposed method achieves good facial localization and face recognition under varied illumination and local distortion.
Keywords: Gabor filter, improved active shape model (IASM), principal component analysis (PCA), face alignment, face recognition, support vector machine (SVM).
9450 Face Recognition using Radial Basis Function Network based on LDA
Authors: Byung-Joo Oh
Abstract:
This paper describes a method to improve the robustness of a face recognition system based on the combination of two compensating classifiers. The face images are preprocessed by appearance-based statistical approaches such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). LDA features of the face image are taken as the input of the Radial Basis Function Network (RBFN). The proposed approach has been tested on the ORL database. The experimental results show that the LDA+RBFN algorithm achieves a recognition rate of 93.5%.
Keywords: Face recognition, linear discriminant analysis, radial basis function network.
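A rough sketch of the PCA-LDA-RBFN chain is shown below, assuming ORL-shaped placeholder data; the RBF network is built here from K-means centers with a Gaussian hidden layer and a logistic output layer, which is a generic RBFN construction and not necessarily the authors' exact network.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import pairwise_distances

# Placeholder face data: 40 subjects x 10 images of 32x32 pixels (ORL-like shape).
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 32 * 32))
y = np.repeat(np.arange(40), 10)

# PCA removes redundancy, then LDA projects onto class-discriminative directions.
X_pca = PCA(n_components=60).fit_transform(X)
X_lda = LinearDiscriminantAnalysis(n_components=39).fit_transform(X_pca, y)

# A minimal RBF network: K-means centers, Gaussian hidden layer, linear output layer.
centers = KMeans(n_clusters=80, n_init=10, random_state=0).fit(X_lda).cluster_centers_
sigma = np.median(pairwise_distances(centers))
hidden = np.exp(-pairwise_distances(X_lda, centers) ** 2 / (2 * sigma ** 2))
rbfn_out = LogisticRegression(max_iter=1000).fit(hidden, y)
print("training accuracy:", rbfn_out.score(hidden, y))
```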
9449 Predicting Application Layer DDoS Attacks Using Machine Learning Algorithms
Authors: S. Umarani, D. Sharmila
Abstract:
A Distributed Denial of Service (DDoS) attack is a major threat to cyber security. It originates from the network layer or the application layer of compromised/attacker systems connected to the network. The impact of such an attack ranges from a simple inconvenience in using a particular service to major failures of the targeted server. When there is heavy traffic flow to a target server, it is necessary to distinguish legitimate accesses from attacks. In this paper, a novel method is proposed to detect DDoS attacks from traces of traffic flow. An access matrix is created from the traces. As the access matrix is multi-dimensional, Principal Component Analysis (PCA) is used to reduce the attributes used for detection. Two classifiers, Naive Bayes and K-Nearest Neighbor, are used to classify the traffic as normal or abnormal. The performance of the classifiers with PCA-selected attributes and with the actual attributes of the access matrix is compared via the detection rate and False Positive Rate (FPR).
Keywords: Distributed Denial of Service (DDoS) attack, application layer DDoS, DDoS detection, K-Nearest Neighbor classifier, Naive Bayes classifier, Principal Component Analysis.
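The comparison described here can be sketched as below, assuming a synthetic access matrix (rows = hosts, columns = per-URL request counts); the real traces, attribute definitions, and PCA dimension are not published in the abstract.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Placeholder access matrix; label 1 marks attack traffic, 0 marks normal traffic.
rng = np.random.default_rng(0)
X = np.vstack([rng.poisson(2, size=(500, 30)), rng.poisson(8, size=(100, 30))]).astype(float)
y = np.array([0] * 500 + [1] * 100)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

for use_pca in (False, True):
    tr, te = X_tr, X_te
    if use_pca:
        pca = PCA(n_components=5).fit(X_tr)
        tr, te = pca.transform(X_tr), pca.transform(X_te)
    for clf in (GaussianNB(), KNeighborsClassifier(n_neighbors=5)):
        pred = clf.fit(tr, y_tr).predict(te)
        tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
        print(f"PCA={use_pca} {type(clf).__name__}: "
              f"detection={tp / (tp + fn):.2f} FPR={fp / (fp + tn):.2f}")
```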
9448 A New Analytical Approach to Reconstruct Residual Stresses Due to Turning Process
Authors: G.H. Farrahi, S.A. Faghidian, D.J. Smith
Abstract:
A thin layer with high tensile residual stresses can be found on the component surface due to turning operations, which can dangerously affect the fatigue performance of the component. In this paper, an analytical approach is presented to reconstruct the residual stress field from a limited, incomplete set of measurements. The Airy stress function is used as the primary unknown to directly solve the equilibrium equations while satisfying the boundary conditions. This new method offers the flexibility to impose the physical conditions that govern the behavior of residual stress and thereby achieve a meaningful, complete stress field. The analysis is also coupled with a least squares approximation and a regularization method to stabilize the inverse problem. The power of the new method is demonstrated by analyzing experimental measurements and achieving good agreement between the model prediction and the results obtained from residual stress measurement.
Keywords: Residual stress, limited measurements, inverse problems, turning process.
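The Airy stress function formulation itself is not reproduced here; the sketch below only illustrates the regularized least-squares ingredient, fitting a polynomial basis to a handful of synthetic depth-stress measurements with Tikhonov regularization.

```python
import numpy as np

# Sparse residual-stress measurements at a few depths below the turned surface
# (synthetic values in MPa; real data would come from e.g. hole drilling or XRD).
z_meas = np.array([0.00, 0.02, 0.05, 0.10, 0.20])       # depth, mm
s_meas = np.array([450.0, 300.0, 80.0, -60.0, -20.0])    # residual stress, MPa

# Basis expansion sigma(z) = sum_k c_k * z**k, fitted with Tikhonov regularization:
# minimize ||A c - s||^2 + lam * ||c||^2, solved as an augmented least-squares problem.
degree, lam = 6, 1e-3
A = np.vander(z_meas, degree + 1, increasing=True)
A_aug = np.vstack([A, np.sqrt(lam) * np.eye(degree + 1)])
s_aug = np.concatenate([s_meas, np.zeros(degree + 1)])
coef, *_ = np.linalg.lstsq(A_aug, s_aug, rcond=None)

z_grid = np.linspace(0, 0.2, 50)
sigma = np.vander(z_grid, degree + 1, increasing=True) @ coef   # reconstructed profile
print("surface stress estimate: %.0f MPa" % sigma[0])
```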
9447 Redundancy Component Matrix and Structural Robustness
Authors: Xinjian Kou, Linlin Li, Yongju Zhou, Jimian Song
Abstract:
We introduce the redundancy matrix, which expresses clearly the geometrical/topological configuration of a structure. With this matrix, the redundancy of the structure is resolved into redundant components assigned to each member or rigid joint. The values of the diagonal elements of the matrix indicate the importance of the corresponding members or rigid joints, and the geometrical correlations are shown by the off-diagonal elements. If a member or rigid joint fails, the reassignment of the redundant components can be calculated with the recursive method given in the paper. By combining reliability indexes with the redundancy components, we define an index of structural robustness. To further explain the properties of the redundancy matrix, we cite several examples of statically indeterminate structures, including two trusses and a rigid frame. With these examples, some simple results and the properties of the matrix are discussed. The examples also illustrate that the redundancy matrix and the related concepts are valuable in structural safety analysis.
Keywords: Structural robustness, structural reliability, redundancy component, redundancy matrix.
9446 Statistical Analysis of Failure Cases in Aerospace
Authors: J. H. Lv, W. Z. Wang, S. W. Liu
Abstract:
The major concern in the aviation industry is flight safety. Although great effort has been put into the development of material and system reliability, failures leading to fatal accidents still occur. Due to the complexity of aviation systems and the interaction among failing components, failure analysis of the related equipment is difficult. This study surveys failure cases in aviation extracted from failure analysis journals, including Engineering Failure Analysis and Case Studies in Engineering Failure Analysis, in order to identify the failure-sensitive factors and failure-sensitive parts. The analysis shows that, among the failure cases, fatigue failure is the most frequent. The most frequently failed components are the disk, blade, landing gear, bearing, and fastener, and the frequently failed materials are steel, aluminum alloy, superalloy, and titanium alloy. Therefore, in order to assure safety in aviation, more attention should be paid to fatigue failures.
Keywords: Aviation industry, failure analysis, failure component, fatigue.
9445 Deactivation of Cu-Cr/γ-Alumina Catalysts for Combustion of Exhaust Gases
Authors: Krasimir Ivanov, Dimitar Dimitrov, Boyan Boyanov
Abstract:
The paper relates to a catalyst comprising a copper-chromium spinel coated on a γ-Al2O3 carrier. The effect of the preparation conditions on the active component composition and activity behavior of the catalysts is discussed. It was found that the activity for carbon monoxide, DME, formaldehyde, and methanol oxidation reaches a maximum at an active component content of 20-30 wt. %. Calcination at 500 °C appears to be optimal for the γ-alumina-supported CuO-Cr2O3 catalysts for CO, DME, formaldehyde, and methanol oxidation. A three-month industrial experiment was carried out to elucidate the changes in the catalyst composition during industrial exploitation and the main reasons for catalyst deactivation. It was concluded that the γ-alumina-supported CuO-Cr2O3 catalysts have enhanced activity toward CO, DME, formaldehyde, and methanol oxidation and are suitable for industrial application. The main reason for catalyst deactivation appears to be the deposition of iron and molybdenum, coming from the main reactor, on the active component surface.
Keywords: Catalyst deactivation, CuO-Cr2O3 catalysts, deep oxidation.
9444 Optimizing the Capacity of a Convolutional Neural Network for Image Segmentation and Pattern Recognition
Authors: Yalong Jiang, Zheru Chi
Abstract:
In this paper, we study the factors that determine the capacity of a Convolutional Neural Network (CNN) model and propose ways to evaluate and adjust the capacity of a CNN model to best match a specific pattern recognition task. Firstly, a scheme is proposed to adjust the number of independent functional units within a CNN model so that it better fits a task. Secondly, the number of independent functional units in a capsule network is adjusted to fit the training dataset. Thirdly, a method based on a Bayesian GAN is proposed to enrich the variance in the current dataset and thereby increase its complexity. Experimental results on the PASCAL VOC 2010 Person Part dataset and the MNIST dataset show that, in both conventional CNN models and capsule networks, the number of independent functional units is an important factor that determines the capacity of a network model. By adjusting the number of functional units, the capacity of a model can better match the complexity of a dataset.
Keywords: CNN, capsule network, capacity optimization, character recognition, data augmentation, semantic segmentation.
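As a rough illustration of adjusting capacity, the sketch below scales the number of convolution channels in a toy PyTorch CNN via a width multiplier and compares parameter counts; "independent functional units" are approximated here by channel counts, which is an assumption and not the authors' scheme, and no capsule network or Bayesian GAN is included.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """A toy CNN whose capacity is scaled by the number of channels per layer."""
    def __init__(self, width=16, n_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, width, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(width, 2 * width, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(2 * width * 7 * 7, n_classes)  # for 28x28 inputs

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

# Compare parameter counts (a rough proxy for capacity) at different widths.
for width in (8, 16, 32):
    model = SmallCNN(width)
    n_params = sum(p.numel() for p in model.parameters())
    out = model(torch.zeros(1, 1, 28, 28))     # sanity check on an MNIST-sized input
    print(f"width={width}: {n_params} parameters, output shape {tuple(out.shape)}")
```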
9443 Numerical Investigation of Aerodynamic Analysis on Passenger Vehicle
Authors: Cafer Görkem Pınar, İlker Coşar, Serkan Uzun, Atahan Çelebi, Mehmet Ali Ersoy, Ali Pınarbaşı
Abstract:
In this study, the aerodynamics of a 1:1 scale model of the Renault Clio MK4 SW vehicle were numerically investigated in the commercial computational fluid dynamics (CFD) package ANSYS CFX 2021 R1 under steady, subsonic, and 3-D conditions. The vehicle model used for the analysis was verified to be independent of the number of mesh elements, and the k-epsilon turbulence model was applied. Results are presented as streamlines, pressure gradient, and turbulent kinetic energy contours around the vehicle at speeds of 50 km/h and 100 km/h. In addition, the validity of the analysis was assessed by comparing the drag coefficient of the vehicle with values in the literature. Finally, the pressure gradient contours around the taillight of the Renault Clio MK4 SW were examined, and the behavior of the total force at 50 km/h and 100 km/h was interpreted.
Keywords: CFD, k-epsilon, aerodynamics, drag coefficient, taillight.
9442 Bearing Condition Monitoring with Acoustic Emission Techniques
Authors: Faisal AlShammari, Abdulmajid Addali
Abstract:
Monitoring the condition of rotating machinery, such as bearings, is important in order to improve operational reliability. Acoustic Emission (AE) and vibration analysis are among the most established techniques used for this purpose. Acoustic emission has the ability to detect the initial phase of component degradation. Moreover, it has been observed that vibration analysis is not as successful at low rotational speeds (below 100 rpm), because the energy generated in this speed region is not detectable using conventional vibration measurements. From this perspective, this paper presents a brief review of acoustic emission techniques for monitoring bearing conditions.
Keywords: Condition monitoring, stress wave analysis, low-speed bearings, bearing defect diagnosis.
9441 Dimensionality Reduction in Modal Analysis for Structural Health Monitoring
Authors: Elia Favarelli, Enrico Testi, Andrea Giorgetti
Abstract:
Autonomous structural health monitoring (SHM) of many structures and bridges has become a topic of paramount importance for maintenance and safety reasons. This paper proposes a set of machine learning (ML) tools to perform automatic feature selection and anomaly detection for a bridge from vibrational data, and compares different feature extraction schemes to increase accuracy and reduce the amount of data collected. As a case study, the Z-24 bridge is considered because of its extensive database of accelerometric data in both standard and damaged conditions. The proposed framework starts from the first four fundamental frequencies extracted through operational modal analysis (OMA) and clustering, followed by time-domain filtering (tracking). The extracted fundamental frequencies are then fed to a dimensionality reduction block implemented through two different approaches: feature selection (an intelligent multiplexer) that tries to estimate the most reliable frequencies based on statistical features (i.e., entropy, variance, kurtosis), and feature extraction (an auto-associative neural network, ANN) that combines the fundamental frequencies to extract new damage-sensitive features in a low-dimensional feature space. Finally, one-class classification (OCC) algorithms perform anomaly detection, trained with standard-condition points and tested with both normal and anomalous ones. In particular, principal component analysis (PCA), kernel principal component analysis (KPCA), and the auto-associative neural network (ANN) are presented and their performance compared. It is also shown that, by evaluating the correct features, anomalies can be detected with accuracy and an F1 score greater than 95%.
Keywords: Anomaly detection, dimensionality reduction, frequencies selection, modal analysis, neural network, structural health monitoring, vibration measurement.
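A minimal sketch of the PCA/KPCA plus one-class classification stage is given below, assuming synthetic fundamental-frequency features rather than the real Z-24 data; the auto-associative neural network branch and the intelligent multiplexer are not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA
from sklearn.preprocessing import StandardScaler
from sklearn.svm import OneClassSVM
from sklearn.pipeline import make_pipeline
from sklearn.metrics import f1_score

# Placeholder features: four tracked fundamental frequencies per observation (Hz).
rng = np.random.default_rng(0)
healthy = rng.normal([3.9, 5.0, 9.8, 10.3], 0.05, size=(300, 4))
damaged = rng.normal([3.7, 4.8, 9.5, 10.0], 0.05, size=(50, 4))

for reducer in (PCA(n_components=2), KernelPCA(n_components=2, kernel="rbf")):
    model = make_pipeline(StandardScaler(), reducer, OneClassSVM(nu=0.05, gamma="scale"))
    model.fit(healthy[:200])                        # train on standard-condition points only
    X_test = np.vstack([healthy[200:], damaged])
    y_true = np.r_[np.ones(100), -np.ones(50)]      # +1 normal, -1 anomaly
    y_pred = model.predict(X_test)
    print(type(reducer).__name__,
          "F1 (normal class): %.2f" % f1_score(y_true, y_pred, pos_label=1))
```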