Search results for: component; case based reasoning (CBR)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13874

13724 Local Curvelet Based Classification Using Linear Discriminant Analysis for Face Recognition

Authors: Mohammed Rziza, Mohamed El Aroussi, Mohammed El Hassouni, Sanaa Ghouzali, Driss Aboutajdine

Abstract:

In this paper, an efficient local appearance feature extraction method based on the multi-resolution Curvelet transform is proposed in order to further enhance the performance of the well-known Linear Discriminant Analysis (LDA) method when applied to face recognition. Each face is described by a subset of band-filtered images containing block-based Curvelet coefficients. These coefficients characterize the face texture, and a set of simple statistical measures allows us to form compact and meaningful feature vectors. The proposed method is compared with related feature extraction methods such as Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Independent Component Analysis (ICA). Two other multi-resolution transforms, the Discrete Wavelet Transform (DWT) and the Contourlet transform, were also compared against the block-based Curvelet-LDA algorithm. Experimental results on the ORL, YALE and FERET face databases show that the proposed method provides a better representation of the class information and obtains much higher recognition accuracies.
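
The abstract does not spell out the implementation, but the general pipeline it describes, block-wise statistics over multi-resolution sub-band coefficients followed by LDA, can be sketched as below. A 2-D wavelet transform stands in for the Curvelet transform, and all parameters and the toy data are illustrative assumptions.

```python
# Minimal sketch of block-wise statistical features over multi-resolution
# coefficients followed by LDA classification. A 2-D wavelet transform
# stands in for the Curvelet transform; all parameters are illustrative.
import numpy as np
import pywt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def block_stats(coeffs, block=8):
    """Mean and standard deviation of non-overlapping blocks of a coefficient map."""
    h, w = coeffs.shape
    feats = []
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            patch = coeffs[i:i + block, j:j + block]
            feats += [patch.mean(), patch.std()]
    return feats

def face_features(img, wavelet="db2", level=2):
    """Concatenate block statistics of all sub-band coefficient maps."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    maps = [coeffs[0]] + [m for detail in coeffs[1:] for m in detail]
    return np.hstack([block_stats(m) for m in maps])

# Toy example with random "face" images from two classes.
rng = np.random.default_rng(0)
X = np.array([face_features(rng.normal(loc=c, size=(64, 64)))
              for c in (0, 1) for _ in range(20)])
y = np.repeat([0, 1], 20)
clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```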

Keywords: Curvelet, Linear Discriminant Analysis (LDA), Contourlet, Discrete Wavelet Transform (DWT), block-based analysis, face recognition (FR).

13723 Efficient Real-time Remote Data Propagation Mechanism for a Component-Based Approach to Distributed Manufacturing

Authors: V. Barot, S. McLeod, R. Harrison, A. A. West

Abstract:

Manufacturing industries face a crucial change as products and processes are required to be easily and efficiently reconfigurable and reusable. To stay competitive and flexible, enterprises are also increasingly distributed globally, which requires the implementation of efficient communication strategies. A prototype system called the "Broadcaster" has been developed under the assumption that the control environment description has been engineered using the component-based system paradigm. This prototype distributes information to a number of globally distributed partners via a circular-buffer-based data processing mechanism. The work highlighted in this paper includes the implementation of this mechanism in the domain of the manufacturing industry. The proposed solution enables real-time remote propagation of machine information to a number of distributed supply chain client resources, such as HMIs, VRML-based 3D views and remote client instances, regardless of their distribution or underlying mechanisms. This approach is presented together with a set of evaluation results. The authors' main focus is the reliability and the performance of the adopted approach. Performance is evaluated in terms of the response times taken to process the data in this domain, compared with an alternative data processing implementation, the linear queue mechanism. Based on the evaluation results obtained, the authors justify the benefits of the proposed implementation and highlight further research work to be carried out.
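
For illustration, a minimal sketch of the kind of fixed-size circular (ring) buffer the Broadcaster prototype is described as using is given below; this is a generic ring buffer, not the actual Broadcaster code.

```python
# Minimal sketch of a fixed-size circular (ring) buffer of the kind the
# "Broadcaster" prototype is described as using; purely illustrative.
class RingBuffer:
    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.capacity = capacity
        self.head = 0          # next slot to write
        self.count = 0         # number of valid items

    def push(self, item):
        """Write an item, overwriting the oldest one when the buffer is full."""
        self.buf[self.head] = item
        self.head = (self.head + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def snapshot(self):
        """Return items from oldest to newest, e.g. for propagation to remote clients."""
        start = (self.head - self.count) % self.capacity
        return [self.buf[(start + i) % self.capacity] for i in range(self.count)]

rb = RingBuffer(4)
for update in range(6):          # machine state updates 0..5
    rb.push(update)
print(rb.snapshot())             # -> [2, 3, 4, 5], oldest updates dropped
```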

Keywords: Broadcaster, circular buffer, Component-based, distributed manufacturing, remote data propagation.

13722 Logic Programming and Artificial Neural Networks in Pharmacological Screening of Schinus Essential Oils

Authors: José Neves, M. Rosário Martins, Fátima Candeias, Diana Ferreira, Sílvia Arantes, Júlio Cruz-Morais, Guida Gomes, Joaquim Macedo, António Abelha, Henrique Vicente

Abstract:

Some plants of the genus Schinus have been used in folk medicine as topical antiseptics, digestives, purgatives, diuretics, analgesics or antidepressants, and also for respiratory and urinary infections. The chemical composition of the essential oils of S. molle and S. terebinthifolius has been evaluated and shows high variability according to the part of the plant studied and the geographic and climatic region. The pharmacological properties, namely the antimicrobial, anti-tumoural and anti-inflammatory activities, are conditioned by the chemical composition of the essential oils. Taking into account the difficulty of inferring the pharmacological properties of Schinus essential oils without an extensive experimental approach, this work focuses on the development of a decision support system, in terms of its knowledge representation and reasoning procedures, under a formal framework based on Logic Programming, complemented with an approach to computing centered on Artificial Neural Networks and the respective Degree-of-Confidence that one has in such an occurrence.

Keywords: Artificial neural networks, essential oils, knowledge representation and reasoning, logic programming, Schinus molle L., Schinus terebinthifolius Raddi.

13721 Business Intelligence for N=1 Analytics using Hybrid Intelligent System Approach

Authors: Rajendra M Sonar

Abstract:

The future of business intelligence (BI) is to integrate intelligence into operational systems that work in real time, analyzing small chunks of data on a continuous basis as required. This is a move away from the traditional approach of doing analysis on an ad-hoc basis, or sporadically in a passive and off-line mode, over huge amounts of data. Various AI techniques such as expert systems, case-based reasoning and neural networks play an important role in building intelligent business systems. Since BI involves various tasks and models various types of problems, hybrid intelligent techniques can be a better choice. Intelligent systems accessible through web services make it easier to integrate them into existing operational systems and to add intelligence to every business process. They can be built to be invoked in a modular and distributed way and to work in real time, and their functionality can be extended to accept external inputs in formats such as RSS. In this paper, we describe a framework that uses effective combinations of these techniques, is accessible through web services and works in real time. We have successfully developed various prototype systems and carried out a few commercial deployments in the area of personalization and recommendation on mobile devices and websites.

Keywords: Business Intelligence, Customer Relationship Management, Hybrid Intelligent Systems, Personalization and Recommendation (P&R), Recommender Systems.

13720 Implementation of TinyHash based on Hash Algorithm for Sensor Network

Authors: HangRok Lee, YongJe Choi, HoWon Kim

Abstract:

In recent years, security architectures for sensor networks have been proposed [2][4]. One of these, TinySec by Chris Karlof, Naveen Sastry and David Wagner, proposed a link-layer security architecture that takes into account the constraints of sensor networks (i.e., energy, bandwidth, computational capability, etc.). TinySec employs CBC-mode encryption and a CBC-MAC for authentication based on the Skipjack block cipher, and it is currently incorporated into TinyOS for sensor network security. This paper introduces TinyHash, which is based on a general hash algorithm. TinyHash is a module intended to replace the authentication and integrity parts of TinySec; that is, it applies a hash algorithm within the TinySec architecture. For compatibility with TinySec, the components of TinyHash are constructed with a structure similar to that of TinySec. TinyHash implements an HMAC component for authentication and a Digest component for message integrity. Additionally, we define some interfaces for the services associated with the hash algorithm.
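
The nesC components themselves are not reproduced in the abstract, but the HMAC construction that a TinyHash-style module provides for message authentication can be illustrated with Python's standard library; the key, payload and choice of SHA-1 are illustrative only.

```python
# Illustration of the HMAC construction that a TinyHash-style module would
# provide for message authentication; Python's standard library stands in
# for the nesC components described in the paper.
import hashlib
import hmac

key = b"shared-link-layer-key"       # illustrative key
packet = b"sensor reading: 23.5 C"   # illustrative payload

# HMAC(key, msg) = H((key ^ opad) || H((key ^ ipad) || msg))
tag = hmac.new(key, packet, hashlib.sha1).digest()

# Receiver side: recompute the tag and compare in constant time.
ok = hmac.compare_digest(tag, hmac.new(key, packet, hashlib.sha1).digest())
print("authentic:", ok, "tag length:", len(tag), "bytes")
```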

Keywords: Sensor network security, nesC, TinySec, TinyOS, hash, HMAC, integrity.

13719 Innovation in Traditional Game: A Case Study of Trainee Teachers' Learning Experiences

Authors: Malathi Balakrishnan, Cheng Lee Ooi, Chander Vengadasalam

Abstract:

The purpose of this study is to explore trainee teachers' learning experience of innovating traditional games during a traditional game carnival. It explores issues arising from multiple case studies of trainee teachers' learning experiences in innovating traditional games. A qualitative methodology was adopted, through observations, semi-structured interviews and content analysis of reflective journals on the trainee teachers' experiences of creating and implementing innovative traditional games. Twelve groups of 36 trainee teachers who had registered for the Sports and Physical Education Management course were the participants in this research during the traditional game carnival. Semi-structured interviews were administered after the trainee teachers' experience of creating innovative traditional games. Reflective journals were collected after the carnival day and their content was analyzed. Inductive data analysis was used to evaluate the various data sources, and all the collected data were then evaluated through the NVivo data analysis process. Inductive reasoning was interpreted based on Self-Determination Theory (SDT). The findings showed that the trainee teachers had positive game participation experiences, knowledge about traditional games and positive motivation to innovate the games. The data also revealed the influence of themes such as cultural significance and creativity. It can be concluded from the findings that the organized game carnival, as a coursework requirement of the Institute of Teacher Training Malaysia, was able to enhance trainee teachers' innovative thinking skills. SDT, as a multidimensional approach to motivation, was utilized; therefore, trainee teachers may gain richer learning experiences when it is applied.

Keywords: Learning experiences, innovation, traditional games, trainee teachers.

13718 Performance Analysis of the Time-Based and Periodogram-Based Energy Detector for Spectrum Sensing

Authors: Sadaf Nawaz, Adnan Ahmed Khan, Asad Mahmood, Chaudhary Farrukh Javed

Abstract:

Classically, an energy detector is implemented in the time domain (TD). However, a frequency-domain (FD) based energy detector has demonstrated improved performance. This paper presents a comparison between the two approaches in order to analyze their pros and cons. A detailed performance analysis of the classical TD energy detector and the periodogram-based detector is performed. Exact and approximate mathematical expressions for the probability of false alarm (Pf) and the probability of detection (Pd) are derived for both approaches. The derived expressions naturally lead to an analytical as well as intuitive explanation of the improvement in Pf and Pd in different scenarios. Our analysis suggests that the improvement depends on the buffer size: Pf is improved in the FD-based detector, whereas Pd is enhanced in the TD-based energy detector. Finally, Monte Carlo simulation results confirm the conclusions reached by the derived expressions.
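
A minimal numerical sketch of the two detectors being compared, a time-domain energy statistic and a periodogram-based one, is shown below; the toy signal and parameters are illustrative and are not the closed-form expressions derived in the paper.

```python
# Illustrative comparison of a time-domain energy detector and a
# periodogram-based detector; signal and parameters are toy values,
# not the closed-form expressions derived in the paper.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(1)
N = 1024                                    # buffer size (samples)
noise = rng.normal(scale=1.0, size=N)
signal = 0.5 * np.sin(2 * np.pi * 0.1 * np.arange(N))
x = signal + noise                          # received samples under H1

# Time-domain statistic: average energy of the buffer.
td_stat = np.mean(x ** 2)

# Frequency-domain statistic: total power of the periodogram.
freqs, pxx = periodogram(x, fs=1.0)
fd_stat = np.sum(pxx) * (freqs[1] - freqs[0])

print(f"TD statistic: {td_stat:.3f}, FD statistic: {fd_stat:.3f}")
# Either statistic is compared against a threshold chosen for a target Pf;
# Monte Carlo runs over many noise realizations would estimate Pf and Pd.
```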

Keywords: Cognitive radio, energy detector, periodogram, spectrum sensing.

13717 Reliability-Based Life-Cycle Cost Model for Engineering Systems

Authors: Reza Lotfalian, Sudarshan Martins, Peter Radziszewski

Abstract:

The effect of reliability on life-cycle cost, including the initial and maintenance costs of a system, is studied. The failure probability of a component is used to calculate the average maintenance cost during the operation cycle of the component. The standard deviation of the life-cycle cost is also calculated as an error measure for the average life-cycle cost. As a numerical example, the model is used to study the average life-cycle cost of an electric motor.
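
The abstract gives no formulas, so the sketch below is only one hedged reading of such a model: initial cost plus expected repair cost over the operating cycle, with the spread of the life-cycle cost estimated by Monte Carlo. All costs and probabilities are invented.

```python
# Hedged sketch of a reliability-based life-cycle cost estimate: initial cost
# plus expected repair cost over an operating cycle, with the standard
# deviation of the total cost estimated by Monte Carlo. All numbers are toy values.
import numpy as np

initial_cost = 500.0      # purchase/installation cost of the component (e.g. a motor)
repair_cost = 200.0       # cost per failure event
p_fail_year = 0.08        # failure probability per year (illustrative)
years = 10                # operating cycle length

# Expected life-cycle cost (failures assumed independent year to year).
expected_lcc = initial_cost + years * p_fail_year * repair_cost

# Monte Carlo estimate of the spread of the life-cycle cost.
rng = np.random.default_rng(42)
failures = rng.binomial(n=years, p=p_fail_year, size=100_000)
lcc_samples = initial_cost + failures * repair_cost
print(f"expected LCC: {expected_lcc:.1f}")
print(f"simulated mean: {lcc_samples.mean():.1f}, std: {lcc_samples.std():.1f}")
```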

Keywords: Initial Cost, Life-cycle cost, Maintenance Cost, Reliability.

13716 Blind Source Separation based on the Estimation for the Number of the Blind Sources under a Dynamic Acoustic Environment

Authors: Takaaki Ishibashi

Abstract:

Independent component analysis can estimate unknown source signals from their mixtures under the assumption that the source signals are statistically independent. However, in a real environment, the separation performance often deteriorates because the number of source signals differs from the number of sensors. In this paper, we propose a method for estimating the number of sources based on the joint distribution of the observed signals under a two-sensor configuration. Several simulation results show that the number of sources coincides with the number of peaks in the histogram of this distribution. The proposed method can estimate the number of sources even when it is larger than the number of observed signals, and it has been verified by several experiments.
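
A hedged sketch of the idea, counting peaks in a histogram derived from the joint distribution of two observed mixtures, is given below. The particular statistic used (the direction angle of each two-sensor sample) is an assumption for illustration, not necessarily the paper's exact distribution.

```python
# Illustrative sketch: estimate the number of sources from the histogram of
# the direction (mixing angle) of two-sensor observations. The statistic and
# the sparse source model are assumptions made for illustration.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
n = 20_000
active = rng.random((3, n)) < 0.2             # sparse activity pattern
sources = rng.laplace(size=(3, n)) * active   # three sparse sources
A = np.array([[1.0, 0.8, 0.3],
              [0.2, 0.9, 1.0]])               # illustrative 2 x 3 mixing matrix
x1, x2 = A @ sources                          # two observed mixtures

keep = np.hypot(x1, x2) > 1e-3                # drop silent frames
angles = np.arctan2(x2[keep], x1[keep]) % np.pi
hist, _ = np.histogram(angles, bins=180, range=(0, np.pi))
peaks, _ = find_peaks(hist, height=0.2 * hist.max(), distance=5)
print("estimated number of sources:", len(peaks))   # ideally 3 here
```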

Keywords: blind source separation, independent component analysis, estimation of the number of blind sources, voice activity detection, target extraction.

13715 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information. It has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies and specific procedures used in digital investigation are not keeping up with criminal developments, and criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime. The goal of digital forensics and digital investigation is to provide objective data and conduct an assessment that will assist in developing a plausible theory that can be presented as evidence in court. This research paper aims at developing a multi-agent framework for digital investigations using specific intelligent software agents (ISAs). The agents communicate to address particular tasks jointly and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The proposed framework is implemented using the Java Agent Development Framework, Eclipse, a Postgres repository, and a rule engine for agent reasoning, and it was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISAs and VMs. There was a significant reduction in the time taken for the Hash Set Agent to execute. As a result of loading the agents, 5% of the time was lost, as the File Path Agent prescribed deleting 1,510, while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing for further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
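
The MADIK-style agents are not reproduced here, but the case-based reasoning (CBR) retrieval step the framework relies on, matching a new investigation against stored cases by similarity, can be sketched generically; the features and cases below are invented.

```python
# Generic sketch of the retrieval step of case-based reasoning (CBR):
# a new investigation is matched against stored cases by feature similarity.
# The features and cases are invented for illustration; this is not the
# MADIK framework itself.
import numpy as np

# Stored cases: (feature vector, label). Features might encode artefact counts,
# file types, timeline spread, etc. (illustrative).
case_base = [
    (np.array([120, 3, 0.9]), "data exfiltration"),
    (np.array([5, 40, 0.1]), "malware infection"),
    (np.array([60, 10, 0.5]), "insider misuse"),
]

def retrieve(query, k=1):
    """Return the k most similar past cases by Euclidean distance."""
    ranked = sorted(case_base, key=lambda c: np.linalg.norm(c[0] - query))
    return ranked[:k]

new_case = np.array([100, 5, 0.8])
for features, label in retrieve(new_case, k=2):
    print(label, "distance:", round(float(np.linalg.norm(features - new_case)), 2))
```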

Keywords: Artificial intelligence, computer science, criminal investigation, digital forensics.

13714 Synergy in Vertical Transformations of Expert Designers

Authors: G. Haupt

Abstract:

Existing literature on design reasoning seems to give one-sided accounts of expert design behaviour based on internal processing. In the same way, ecological theories seem to focus one-sidedly on external elements, resulting in a lack of a unifying design cognition theory. Although current extended design cognition studies acknowledge the intellectual interaction between internal and external resources, there still seems to be insufficient understanding of the complexities involved in such interactive processes. As such, this paper proposes a novel multi-directional model for design researchers to map the complex and dynamic conduct-controlling behaviour in which both the computational and ecological perspectives are integrated in a vertical manner. A clear distinction between identified intentional and emerging physical drivers, and the relationships between them during the early phases of experts' design process, is demonstrated by presenting a case study in which the model was employed.

Keywords: External representation, early phases, extended design cognition, internal processes and external drivers, conduct controlling behaviour.

13713 A Collaborative Platform for Multilingual Ontology Development

Authors: Ahmed Tawfik, Fausto Giunchiglia, Vincenzo Maltese

Abstract:

Ontologies provide a common understanding of a specific domain of interest that can be communicated between people and used as background knowledge for automated reasoning in a wide range of applications. In this paper, we address the design of multilingual ontologies following well-defined knowledge engineering methodologies with the support of novel collaborative development approaches. In particular, we present a collaborative platform which allows ontologies to be developed incrementally in multiple languages. This is made possible via an appropriate mapping between language independent concepts and one lexicalization per language (or a lexical gap in case such lexicalization does not exist). The collaborative platform has been designed to support the development of the Universal Knowledge Core, a multilingual ontology currently in English, Italian, Chinese, Mongolian, Hindi and Bangladeshi. Its design follows a workflow-based development methodology that models resources as a set of collaborative objects and assigns customizable workflows to build and maintain each collaborative object in a community driven manner, with extensive support of modern web 2.0 social and collaborative features.

Keywords: Knowledge Diversity, Knowledge Representation, Ontology Development.

13712 Multi-Objective Optimization in End Milling of Al-6061 Using Taguchi Based G-PCA

Authors: M. K. Pradhan, Mayank Meena, Shubham Sen, Arvind Singh

Abstract:

In this study, a multi-objective optimization of the end milling of Al 6061 alloy is presented to provide better surface quality and a higher Material Removal Rate (MRR). The input parameters considered for the analysis are spindle speed, depth of cut and feed. The experiments were planned as per Taguchi's design of experiments, with an L27 orthogonal array. Grey Relational Analysis (GRA) has been used to transform the multiple quality responses into a single response, and the weights of the performance characteristics are determined by employing Principal Component Analysis (PCA), so that their relative importance can be properly and objectively described. The results reveal that Taguchi-based G-PCA can effectively acquire the optimal combination of cutting parameters.
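
The grey relational step that converts the multiple responses (surface roughness to be minimized, MRR to be maximized) into a single grade, with weights taken from PCA, can be sketched roughly as follows; the experimental values are invented, and the weighting scheme used here (squared loadings of the first principal component) is a common convention assumed for illustration.

```python
# Rough sketch of Grey Relational Analysis (GRA) with PCA-derived weights for
# two responses: surface roughness (smaller-the-better) and MRR
# (larger-the-better). The experimental values are invented for illustration.
import numpy as np

# Rows: experimental runs; columns: [surface roughness, MRR]
y = np.array([[1.8, 120.0],
              [1.2, 150.0],
              [2.4, 200.0],
              [0.9,  90.0]])

# Step 1: normalise to [0, 1] (column 0 smaller-the-better, column 1 larger-the-better).
norm = np.empty_like(y)
norm[:, 0] = (y[:, 0].max() - y[:, 0]) / (y[:, 0].max() - y[:, 0].min())
norm[:, 1] = (y[:, 1] - y[:, 1].min()) / (y[:, 1].max() - y[:, 1].min())

# Step 2: grey relational coefficients with distinguishing coefficient 0.5.
delta = 1.0 - norm
zeta = 0.5
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Step 3: weights from PCA on the coefficients (squared loadings of the first PC).
cov = np.cov(grc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
w = eigvecs[:, -1] ** 2          # squared loadings of the first PC sum to 1
grade = grc @ w                  # grey relational grade per run

print("weights:", np.round(w, 3))
print("best run:", int(np.argmax(grade)) + 1, "grades:", np.round(grade, 3))
```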

Keywords: Material Removal Rate, Surface Roughness, Taguchi Method, Grey Relational Analysis, Principal Component Analysis.

13711 Support Vector Fuzzy Based Neural Networks For Exchange Rate Modeling

Authors: Chokri Slim

Abstract:

A novel fuzzy neural network combined with a support vector learning mechanism, called the support-vector-based fuzzy neural network (SVBFNN), is proposed. The SVBFNN combines the capability of support vector learning to minimize the empirical risk (training error) and the expected risk (testing error) in high-dimensional data spaces with the efficient human-like reasoning of fuzzy neural networks.

Keywords: Neural network, fuzzy inference, machine learning, fuzzy modeling and rule extraction, support vector regression.

13710 RTCoord: A Methodology to Design WSAN Applications

Authors: J. Barbarán, M. Díaz, I. Esteve, D. Garrido, L. Llopis, B. Rubio

Abstract:

Wireless Sensor and Actor Networks (WSANs) constitute an emerging and pervasive technology that is attracting increasing interest in the research community for a wide range of applications. WSANs have two important requirements: coordination interactions and real-time communication to perform correct and timely actions. This paper introduces a methodology that facilitates the task of the application programmer by focusing on the coordination and real-time requirements of WSANs. The proposed methodology uses a real-time component model, UM-RTCOM, which supports the design and implementation of WSAN applications under the component-oriented paradigm. This helps us to develop software components that offer features such as reusability and adaptability, which are very suitable for WSANs since they are highly dynamic environments with rapidly changing conditions. In addition, a high-level coordination model based on tuple channels (TC-WSAN) is integrated into the methodology by providing a component-based specification of this model in UM-RTCOM; this allows us to satisfy both sensor-actor and actor-actor coordination requirements in WSANs. Finally, we present the design and implementation of an application that shows how the methodology can easily be used to develop WSAN applications.

Keywords: Sensor networks, real time and embedded systems.

13709 Diagnosis of Ovarian Cancer with Proteomic Patterns in Serum using Independent Component Analysis and Neural Networks

Authors: Simone C. F. Neves, Lúcio F. A. Campos, Ewaldo Santana, Ginalber L. O. Serra, Allan K. Barros

Abstract:

We propose a method for the discrimination and classification of ovarian tissue as benign, malignant or normal using independent component analysis and neural networks. The method was tested on a proteomic pattern set from a database using radial basis function and probabilistic neural networks. The best performance was obtained with the probabilistic neural networks, resulting in a 99% success rate, with 98% specificity and 100% sensitivity.

Keywords: Ovarian cancer, proteomic patterns in serum, independent component analysis, neural networks.

13708 Modeling the Impact of Controls on Information System Risks

Authors: M. Ndaw, G. Mendy, S. Ouya

Abstract:

Information system risk management helps to reduce or eliminate risk by implementing appropriate controls. In this paper, we propose a model for quantifying the impact of controls on information system risks by automating the residual criticality estimation step of FMECA, which is based on inductive reasoning. For this, we defined three equations based on the type and maturity of controls. For testing, the values obtained with the model were compared with the values estimated by interlocutors during different working sessions, and the results are satisfactory. This model allows an optimal assessment of control maturity and facilitates the risk analysis of information systems.
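
The paper's three equations are not given in the abstract, so the snippet below is purely illustrative: it simply scales an initial FMECA criticality by a reduction factor that grows with control maturity and differs by control type. The taxonomy, maturity scale and factors are all assumptions.

```python
# Purely illustrative sketch: the abstract's three equations are not given, so
# this simply scales an FMECA criticality by a reduction factor that grows with
# control maturity, differentiated by control type. All factors are assumptions.
def residual_criticality(criticality, control_type, maturity):
    """criticality: initial FMECA criticality (e.g. severity x occurrence x detection).
    control_type: 'preventive', 'detective' or 'corrective' (assumed taxonomy).
    maturity: 0 (non-existent) .. 5 (optimised), as in common maturity scales."""
    max_reduction = {"preventive": 0.8, "detective": 0.6, "corrective": 0.4}[control_type]
    reduction = max_reduction * (maturity / 5.0)
    return criticality * (1.0 - reduction)

print(residual_criticality(criticality=120, control_type="preventive", maturity=4))  # 43.2
```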

Keywords: Information System, Risk, Control, FMECA.

13707 Real Time Acquisition and Analysis of Neural Response for Rehabilitative Control

Authors: Dipali Bansal, Rashima Mahajan, Shweta Singh, Dheeraj Rathee, Sujit Roy

Abstract:

Non-invasive brain-computer interfaces such as electroencephalography (EEG), which directly taps neurological signals, are being widely explored these days to connect paralytic and elderly patients with the external environment. However, in India the research is confined to laboratory settings and is not reaching the masses for rehabilitation purposes. An attempt has been made in this paper to analyze EEG signals acquired in real time using the cost-effective and portable EMOTIV headset unit. Signal processing of the real-time acquired EEG is done using the EEGLAB toolbox in MATLAB and the EDF Browser application software. The Independent Component Analysis algorithm of EEGLAB is used to identify deliberate eye blinks in the acquired neural signal. Time-frequency transforms and data statistics obtained using EEGLAB, along with the component activation results of EDF Browser, clearly indicate a voluntary eye blink in the AF3 channel. The spectral analysis indicates a dominant frequency component at 1.536 Hz, representing the delta wave component of the EEG during the voluntary eye blink. An algorithm is further designed to generate an active-high signal based on a deliberate eye blink, which can be used for a plethora of control applications in rehabilitation.
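
The blink-triggered control idea, detecting a large voluntary eye-blink deflection in a frontal channel and raising an active-high signal, can be sketched with a simple amplitude threshold; the synthetic signal and threshold below are illustrative and do not reproduce the EEGLAB/ICA processing described.

```python
# Illustrative sketch of turning a detected eye blink in a frontal EEG channel
# (e.g. AF3) into an active-high control signal. The synthetic signal and the
# amplitude threshold are toy values; the paper itself uses EEGLAB and ICA.
import numpy as np

fs = 128                                     # sampling rate in Hz (EMOTIV-like)
t = np.arange(0, 4, 1 / fs)
eeg = 10 * np.random.default_rng(0).normal(size=t.size)   # background activity (uV)
eeg[int(1.5 * fs):int(1.7 * fs)] += 120      # large frontal deflection = voluntary blink

threshold = 80                               # uV, chosen for illustration
active_high = (np.abs(eeg) > threshold).astype(int)

blink_samples = np.flatnonzero(active_high)
print("blink detected between",
      round(blink_samples[0] / fs, 2), "s and", round(blink_samples[-1] / fs, 2), "s")
# 'active_high' could then drive a rehabilitation control application.
```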

Keywords: Brain Computer Interface, EDF Browser, EEG, EEGLAB, EMOTIV, real-time acquisition.

13706 Template-Based Object Detection through Partial Shape Matching and Boundary Verification

Authors: Feng Ge, Tiecheng Liu, Song Wang, Joachim Stahl

Abstract:

This paper presents a novel template-based method to detect objects of interest in real images by shape matching. To locate a target object that has a similar shape to a given template boundary, the proposed method integrates three components: contour grouping, partial shape matching, and boundary verification. In the first component, low-level image features, including edges and corners, are grouped into a set of perceptually salient closed contours using an extended ratio-contour algorithm. In the second component, we develop a partial shape matching algorithm to identify the fractions of detected contours that partly match given template boundaries. Specifically, we represent template boundaries and detected contours using landmarks, and apply a greedy algorithm to search for matched landmark subsequences. For each matched fraction between a template and a detected contour, we estimate an affine transform that maps the whole template onto a hypothetic boundary. In the third component, we provide an efficient algorithm based on oriented edge lists to determine the target boundary from the hypothetic boundaries by checking each of them against the image edges. We evaluate the proposed method on recognizing and localizing 12 template leaves in a data set of real images with cluttered backgrounds, illumination variations, occlusions, and image noise. The experiments demonstrate the high performance of the proposed method.
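
The affine estimation step, mapping matched template landmarks onto detected contour landmarks to hypothesize a full boundary, reduces to a least-squares fit; a minimal sketch with invented points is shown below.

```python
# Minimal sketch of estimating the affine transform that maps matched template
# landmarks onto detected contour landmarks (least squares), as used to project
# the whole template into a hypothetic boundary. The points are invented.
import numpy as np

template_pts = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
contour_pts = np.array([[2.1, 1.0], [4.0, 1.2], [3.8, 3.1], [1.9, 2.9]])  # noisy matches

# Solve [x y 1] @ M = [x' y'] for the 3x2 affine parameter matrix M.
A = np.hstack([template_pts, np.ones((len(template_pts), 1))])
M, *_ = np.linalg.lstsq(A, contour_pts, rcond=None)

def apply_affine(points, M):
    """Transform Nx2 points with the fitted affine matrix."""
    return np.hstack([points, np.ones((len(points), 1))]) @ M

hypothetic_boundary = apply_affine(template_pts, M)
print(np.round(hypothetic_boundary, 2))   # should lie close to contour_pts
```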

Keywords: Object detection, shape matching, contour grouping.

13705 Statistical Wavelet Features, PCA, and SVM Based Approach for EEG Signals Classification

Authors: R. K. Chaurasiya, N. D. Londhe, S. Ghosh

Abstract:

The study of the electrical signals produced by the neural activity of the human brain is called electroencephalography. In this paper, we propose an automatic and efficient EEG signal classification approach, which is used to classify an EEG signal into two classes: epileptic seizure or not. In the proposed approach, we start by extracting features with the Discrete Wavelet Transform (DWT), which decomposes the EEG signals into sub-bands. These features, extracted from the detail and approximation coefficients of the DWT sub-bands, are used as input to Principal Component Analysis (PCA). The classification is based on reducing the feature dimension using PCA and deriving the support vectors using a Support Vector Machine (SVM). The experiments are performed on a real, standard dataset, and a very high level of classification accuracy is obtained.
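
A compact sketch of the described pipeline, DWT sub-band statistics feeding PCA and then an SVM, is given below using common libraries; the synthetic "EEG" segments and all parameter choices are illustrative.

```python
# Sketch of the described pipeline: DWT sub-band features -> PCA -> SVM.
# The synthetic "EEG" segments and all parameter choices are illustrative.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def dwt_features(segment, wavelet="db4", level=4):
    """Statistics of approximation and detail coefficients of each DWT sub-band."""
    coeffs = pywt.wavedec(segment, wavelet, level=level)
    return np.hstack([[c.mean(), c.std(), np.abs(c).max()] for c in coeffs])

rng = np.random.default_rng(0)
n, length = 100, 512
labels = rng.integers(0, 2, size=n)                   # 0 = normal, 1 = "seizure-like"
segments = [rng.normal(size=length) * (1 + 4 * y) for y in labels]  # class 1 has higher amplitude
X = np.array([dwt_features(s) for s in segments])

model = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
model.fit(X, labels)
print("training accuracy:", model.score(X, labels))
```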

Keywords: Discrete Wavelet Transform, Electroencephalogram, Pattern Recognition, Principal Component Analysis, Support Vector Machine.

13704 A Taguchi Approach to Investigate Impact of Factors for Reusability of Software Components

Authors: Parvinder S. Sandhu, Pavel Blecharz, Hardeep Singh

Abstract:

A quantitative investigation of the contribution of different factors towards measuring the reusability of software components could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components from existing legacy systems, which can save the cost of developing software from scratch. But the issue of the relative significance of the contributing factors has remained relatively unexplored. In this paper, we have used Taguchi's approach to analyze the significance of different structural attributes, or factors, in deciding the reusability level of a particular component. The results obtained show that complexity is the most important factor in deciding the reusability of function-oriented software. In the case of object-oriented software, coupling and complexity collectively play a significant role in achieving high reusability.

Keywords: Taguchi approach, reusability, software components, structural attributes.

13703 Evaluation Process for the Hardware Safety Integrity Level

Authors: Sung Kyu Kim, Yong Soo Kim

Abstract:

Safety instrumented systems (SISs) are becoming increasingly complex, and the proportion of programmable electronic parts is growing. The IEC 61508 global standard was established to ensure the functional safety of SISs, but it is expressed in highly macroscopic terms. This study introduces an evaluation process for hardware safety integrity levels based on failure modes, effects, and diagnostic analysis (FMEDA). FMEDA is widely used to evaluate safety levels, and it provides the information on failure rates and failure mode distributions necessary to calculate a diagnostic coverage factor for a given component. In our evaluation process, the components of the SIS subsystem are first defined in terms of failure modes and effects. Then, a failure rate and a failure mechanism distribution are assigned to each component, and the safety mode and detectability of each failure mode are determined. Finally, the hardware safety integrity level is evaluated based on the calculated results.
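
Once an FMEDA has split the component failure rate into safe, dangerous-detected and dangerous-undetected parts, the standard IEC 61508 quantities that feed the hardware safety integrity level, diagnostic coverage and safe failure fraction, follow directly; the failure rates below are invented.

```python
# Illustration of the quantities an FMEDA yields for hardware safety integrity
# evaluation under IEC 61508: diagnostic coverage (DC) and safe failure
# fraction (SFF). The failure rates (in FIT, failures per 1e9 h) are invented.
lambda_safe = 300.0          # safe failures
lambda_dd = 450.0            # dangerous failures detected by diagnostics
lambda_du = 50.0             # dangerous undetected failures

lambda_dangerous = lambda_dd + lambda_du
dc = lambda_dd / lambda_dangerous
sff = (lambda_safe + lambda_dd) / (lambda_safe + lambda_dangerous)

print(f"DC  = {dc:.2%}")     # 90.00%
print(f"SFF = {sff:.2%}")    # 93.75%
# The SFF, together with the hardware fault tolerance, then maps to the
# achievable safety integrity level via the IEC 61508 architectural constraints.
```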

Keywords: Safety instrumented system; Safety integrity level; Failure modes, effects, and diagnostic analysis; IEC 61508.

13702 Prediction of Reusability of Object Oriented Software Systems using Clustering Approach

Authors: Anju Shri, Parvinder S. Sandhu, Vikas Gupta, Sanyam Anand

Abstract:

In the literature, there are metrics for identifying the quality of reusable components, but a framework that makes use of these metrics to precisely predict the reusability of software components still needs to be worked out. These reusability metrics, if identified in the design phase or even in the coding phase, can help us to reduce rework by improving the quality of reuse of the software component and hence improve productivity due to a probabilistic increase in the reuse level. As the CK metric suite is the most widely used set of metrics for extracting the structural features of object-oriented (OO) software, in this study a tuned CK metric suite, i.e. WMC, DIT, NOC, CBO and LCOM, is used to obtain the structural analysis of OO-based software components. An algorithm has been proposed in which the tuned metric values of the OO software component are given as inputs to a K-means clustering system, and a decision tree is formed with 10-fold cross-validation of the data to evaluate the component in terms of a linguistic reusability value. The developed reusability model has produced high-precision results as desired.
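
The clustering step described, feeding tuned CK metric values (WMC, DIT, NOC, CBO, LCOM) into K-means to group components, can be sketched as follows; the metric values and the number of clusters are invented.

```python
# Sketch of clustering OO components by their CK metric values (WMC, DIT, NOC,
# CBO, LCOM) using K-means; the metric values and number of clusters are invented.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row: [WMC, DIT, NOC, CBO, LCOM] for one class/component (illustrative).
ck = np.array([
    [12, 2, 1, 4, 10],
    [45, 5, 0, 15, 80],
    [8, 1, 3, 2, 5],
    [50, 6, 1, 18, 95],
    [10, 2, 2, 3, 8],
])

X = StandardScaler().fit_transform(ck)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster assignments:", km.labels_)
# Each cluster would then be mapped to a linguistic reusability value
# (e.g. "high"/"low") via the decision tree described in the paper.
```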

Keywords: CK metrics, decision tree, K-means, reusability.

13701 A Diagnostic Fuzzy Rule-Based System for Congenital Heart Disease

Authors: Ersin Kaya, Bulent Oran, Ahmet Arslan

Abstract:

In this study, a fuzzy rule-based classifier is used for the diagnosis of congenital heart disease. Congenital heart diseases are defined as structural or functional heart diseases. Medical data sets were obtained from the Pediatric Cardiology Department at Selcuk University, covering the years 2000 to 2003. First, fuzzy rules were generated from the medical data, and the weights of the fuzzy rules were then calculated. Two different reasoning methods, the "weighted vote method" and the "single winner method", were used in this study, and the results of the fuzzy classifiers were compared.
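
The two reasoning methods compared, "weighted vote" and "single winner", differ only in how rule activations are aggregated; a minimal generic sketch with invented rules illustrates the difference.

```python
# Generic sketch of the two fuzzy reasoning methods compared in the paper:
# "single winner" picks the class of the most strongly activated rule, while
# "weighted vote" sums activations per class. Rules and weights are invented.
rules = [
    # (rule weight, class label, membership degree of the current patient in the rule's antecedent)
    (0.9, "CHD", 0.6),
    (0.7, "healthy", 0.8),
    (0.8, "CHD", 0.5),
]

activations = [(w * mu, label) for w, label, mu in rules]

# Single winner: class of the rule with the largest activation.
single_winner = max(activations)[1]

# Weighted vote: class with the largest summed activation.
votes = {}
for a, label in activations:
    votes[label] = votes.get(label, 0.0) + a
weighted_vote = max(votes, key=votes.get)

print("single winner:", single_winner)   # healthy (0.56)
print("weighted vote:", weighted_vote)   # CHD (0.54 + 0.40 = 0.94)
```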

Keywords: Congenital heart disease, fuzzy rule-based classifiers, classification.

13700 The Use of Degradation Measures to Design Reliability Test Plans

Authors: Stephen V. Crowder, Jonathan W. Lane

Abstract:

With short production development times, there is an increased need to demonstrate product reliability relatively quickly with minimal testing. In such cases there may be few, if any, observed failures. Thus it may be difficult to assess reliability using the traditional reliability test plans that measure only time (or cycles) to failure. For many components, degradation measures will contain important information about performance and reliability. These measures can be used to design a minimal test plan, in terms of the number of units placed on test and the duration of the test, necessary to demonstrate a reliability goal. In this work we present a case study involving an electronic component subject to degradation. The data, consisting of 42 degradation paths of cycles to failure, are first used to estimate a reliability function. Bootstrapping techniques are then used to perform power studies and develop a minimal reliability test plan for future production of this component.
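
The bootstrap step, resampling the observed cycles-to-failure to quantify the uncertainty of an estimated reliability quantity, can be sketched as follows; the failure data and mission length are synthetic.

```python
# Sketch of bootstrapping degradation-based failure data to quantify the
# uncertainty of an estimated reliability quantity (here the probability of
# surviving a mission length). The 42 cycles-to-failure values are synthetic.
import numpy as np

rng = np.random.default_rng(7)
cycles_to_failure = rng.weibull(2.0, size=42) * 10_000   # synthetic failure cycles
mission = 5_000                                          # cycles to be demonstrated

def reliability(sample, mission):
    """Empirical probability of surviving the mission length."""
    return np.mean(sample > mission)

point_estimate = reliability(cycles_to_failure, mission)
boot = [reliability(rng.choice(cycles_to_failure, size=42, replace=True), mission)
        for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"R({mission}) = {point_estimate:.2f}, 95% bootstrap interval: [{lo:.2f}, {hi:.2f}]")
# Such intervals drive the power studies used to size a minimal reliability test plan.
```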

Keywords: Degradation Measure, Time to Failure Distribution, Bootstrap.

13699 An Experimental Comparison of Unsupervised Learning Techniques for Face Recognition

Authors: Dinesh Kumar, C.S. Rai, Shakti Kumar

Abstract:

Face recognition has always been a fascinating research area. It has drawn the attention of many researchers because of its potential applications, such as security systems, entertainment and criminal identification. Many supervised and unsupervised learning techniques have been reported so far. Principal Component Analysis (PCA), Self-Organizing Maps (SOM) and Independent Component Analysis (ICA) are three of the many techniques proposed by different researchers for face recognition and are known as unsupervised techniques. This paper proposes the integration of two of these techniques, SOM and PCA, for dimensionality reduction and feature selection. Simulation results show that, although the individual techniques SOM and PCA give excellent performance by themselves, their combination can also be utilized for face recognition. Experimental results also indicate that, for the given face database and the classifier used, SOM performs better than the other unsupervised learning techniques. A comparison of the two proposed SOM methodologies, local and global processing, shows the superiority of the latter, but at the cost of more computational time.

Keywords: Face Recognition, Principal Component Analysis, Self Organizing Maps, Independent Component Analysis

13698 Human Walking Vertical Force and Vertical Vibration of Pedestrian Bridge Induced by Its Higher Components

Authors: M. Yoneda

Abstract:

The purpose of this study is to identify the human walking vertical force by using the FFT power spectral density of experimental acceleration data of the human body. An experiment on human walking is carried out on a stationary floor, paying particular attention to the higher components of the dynamic vertical walking force. Based on measured acceleration data of the human lumbar region, not only in-phase components with frequencies of 2fw and 3fw, but also opposite-phase components with frequencies of 0.5fw, 1.5fw and 2.5fw, where fw is the walking rate, are observed. The vertical vibration of a pedestrian bridge induced by the higher components of the human walking vertical force is also discussed in this paper. A full-scale measurement of an existing pedestrian bridge with a center span length of 33 m is carried out, focusing on the resonance phenomenon due to the higher components of the human walking vertical force. The dynamic response characteristics excited by these higher vertical components of human walking are revealed from the dynamic design viewpoint of pedestrian bridges.
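
The harmonic decomposition described, reading off components at integer and half-integer multiples of the walking rate fw from the measured acceleration, can be sketched with a plain FFT; the synthetic signal below is for illustration only.

```python
# Illustrative sketch: pick out harmonics of the walking rate fw (including
# half-integer components such as 0.5fw and 1.5fw) from an acceleration record
# using the FFT power spectrum. The synthetic signal is for illustration only.
import numpy as np

fs, duration, fw = 200.0, 20.0, 2.0                 # Hz, s, walking rate (steps/s)
t = np.arange(0, duration, 1 / fs)
accel = (1.00 * np.sin(2 * np.pi * fw * t)          # fundamental
         + 0.30 * np.sin(2 * np.pi * 2 * fw * t)    # in-phase higher component
         + 0.10 * np.sin(2 * np.pi * 0.5 * fw * t)  # half-integer (opposite-phase) component
         + 0.05 * np.random.default_rng(0).normal(size=t.size))

spectrum = np.abs(np.fft.rfft(accel)) * 2 / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for k in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0):
    idx = np.argmin(np.abs(freqs - k * fw))
    print(f"{k:>3} fw ({k * fw:4.1f} Hz): amplitude ~ {spectrum[idx]:.3f}")
```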

Keywords: Simplified method, Human walking vertical force, Higher component, Pedestrian bridge vibration.

13697 Face Recognition with PCA and KPCA using Elman Neural Network and SVM

Authors: Hossein Esbati, Jalil Shirazi

Abstract:

In this paper, in order to classify face images from the ORL database, Principal Component Analysis (PCA) and Kernel Principal Component Analysis (KPCA) feature extraction methods are used together with Elman neural network and Support Vector Machine (SVM) classification methods. The Elman network, as a recurrent neural network, is proposed for modeling storage systems and is also used to review the effect of the number of PCA components on the classification accuracy and on the classification time for the database images. Classification is carried out with various numbers of components, and the results obtained with the Elman neural network and the support vector machine are compared. At the optimum setting, a recognition accuracy of 97.41% is obtained.

Keywords: Face recognition, Principal Component Analysis, Kernel Principal Component Analysis, Neural network, Support Vector Machine.

13696 Identification of Reusable Software Modules in Function Oriented Software Systems using Neural Network Based Technique

Authors: Sonia Manhas, Parvinder S. Sandhu, Vinay Chopra, Nirvair Neeru

Abstract:

The cost of developing software from scratch can be saved by identifying and extracting reusable components from already developed and existing software systems or legacy systems [6]. But the issue of how to identify reusable components from existing systems has remained relatively unexplored. We have used a metric-based approach for characterizing a software module. In the present work, McCabe's Cyclomatic Complexity measure for complexity, the Regularity metric, the Halstead Software Science Indicator for volume, the Reuse Frequency metric and the Coupling metric values of the software component are used as input attributes to different types of neural network systems, and the reusability of the software component is calculated. The results are recorded in terms of Accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE).

Keywords: Software reusability, Neural Networks, MAE, RMSE, Accuracy.

13695 Application of Argumentation for Improving the Classification Accuracy in Inductive Concept Formation

Authors: Vadim Vagin, Marina Fomina, Oleg Morosin

Abstract:

This paper describes an argumentation approach to the problem of inductive concept formation. It is proposed to use argumentation, based on defeasible reasoning with justification degrees, to improve the quality of classification models obtained by generalization algorithms. Experimental results on both clean and noisy data are also presented.

Keywords: Argumentation, justification degrees, inductive concept formation, noise, generalization.
