Search results for: component analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9123


9093 Identifying Missing Component in the Bechdel Test Using Principal Component Analysis Method

Authors: Raghav Lakhotia, Chandra Kanth Nagesh, Krishna Madgula

Abstract:

A lot has been said and discussed regarding the rationale and significance of the Bechdel Score. It became a digital sensation in 2013, when Swedish cinemas began to showcase the Bechdel test score of a film alongside its rating. The test has drawn criticism from experts and the film fraternity regarding its use to rate the female presence in a movie. Critics consider the score too simplistic, arguing that its underlying criteria for a film to pass the test, namely 1) it features at least two women, 2) who share at least one dialogue, 3) about something other than a man, are too crude. In this research, we consider additional parameters that capture how females are represented in film, such as the number of female dialogues in a movie, the dialogue genre, and the part-of-speech tags in the dialogue; these parameters are missing from the existing criteria used to calculate the Bechdel score. The research analyzes 342 movie scripts to test the hypothesis that these extra parameters, together with the current Bechdel criteria, are significant in calculating a female representation score. The result of the Principal Component Analysis method is that female dialogue content is a key component and should be considered while measuring the representation of women in a work of fiction.
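As a rough illustration of the kind of analysis the abstract describes (the feature names and data below are placeholders, not the authors' 342-script dataset), a minimal PCA sketch might look like this:

```python
# Hypothetical sketch: PCA over per-movie script features (not the authors' data).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Placeholder feature matrix: one row per movie script.
features = ["num_female_dialogues", "num_male_dialogues",
            "female_noun_ratio", "female_verb_ratio", "bechdel_pass"]
rng = np.random.default_rng(0)
X = rng.random((342, len(features)))          # stand-in for 342 analyzed scripts

X_std = StandardScaler().fit_transform(X)     # PCA is scale-sensitive, so standardize
pca = PCA(n_components=3).fit(X_std)

print("explained variance ratio:", pca.explained_variance_ratio_)
# Loadings of the first principal component indicate which features dominate it.
for name, loading in zip(features, pca.components_[0]):
    print(f"{name:>22s}: {loading:+.3f}")
```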

Keywords: Bechdel test, dialogue genre, parts of speech tags, principal component analysis.

9092 Toward an Architecture of a Component-Based System Supporting Separation of Non-Functional Concerns

Authors: Jerzy Nogiec, Kelley Trombly-Freytag, Shangping Ren

Abstract:

The promises of component-based technology can only be fully realized when the system contains in its design a necessary level of separation of concerns. The authors propose to focus on the concerns that emerge throughout the life cycle of the system and use them as an architectural foundation for the design of a component-based framework. The proposed model comprises a set of superimposed views of the system describing its functional and non-functional concerns. This approach is illustrated by the design of a specific framework for data analysis and data acquisition and supplemented with experiences from using the systems developed with this framework at the Fermi National Accelerator Laboratory.

Keywords: Distributed system, component-based technology, separation of concerns, software development, supervisory and control, QoS

9091 Analytical Study of Component Based Software Engineering

Authors: Iqbaldeep Kaur, Parvinder S. Sandhu, Hardeep Singh, Vandana Saini

Abstract:

This paper is a survey of current component-based software technologies and a description of the promotion and inhibition factors in CBSE. The features that software components inherit are also discussed, and quality assurance issues in component-based software are catered to. Research on the quality model of component-based systems starts with the study of what components are, CBSE, its development life cycle, and the pros and cons of CBSE. Various attributes are studied and compared, keeping in view the study of various existing models for general systems and CBS. When illustrating the quality of a software component, an apt set of quality attributes for the description of the system (or its components) should be selected. Finally, the research issues that can be extended are tabularized.

Keywords: Component, COTS, Component based development, Component-based Software Engineering.

9090 Quantitative Ranking Evaluation of Wine Quality

Authors: A. Brunel, A. Kernevez, F. Leclere, J. Trenteseaux

Abstract:

Today, wine quality is evaluated only by wine experts, each with their own personal tastes, even if they may agree on some common features. Producers therefore do not have any unbiased way to independently assess the quality of their products. A tool is proposed here to evaluate wine quality by an objective ranking based upon the variables entering wine elaboration, analysed through the principal component analysis (PCA) method. Actual climatic data are compared by measuring the relative distance between each considered wine, from which the general ranking is derived.
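A minimal sketch of the general idea, assuming standardized elaboration/climate variables and a centroid as the reference point (both assumptions, not details from the paper):

```python
# Hypothetical sketch of a PCA-based ranking (placeholder data, not the authors' variables).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.random((30, 6))      # 30 wines x 6 elaboration/climate variables (stand-ins)

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

# Rank wines by their distance to the centroid in the reduced PCA space;
# any reference point could be used instead of the centroid.
dist = np.linalg.norm(scores - scores.mean(axis=0), axis=1)
ranking = np.argsort(dist)
print("wine indices from closest to farthest:", ranking)
```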

Keywords: Wine, grape, vine, weather conditions, rating, climate, principal component analysis, metric analysis.

9089 On-line Testing of Software Components for Diagnosis of Embedded Systems

Authors: Thi-Quynh Bui, Oum-El-Kheir Aktouf

Abstract:

This paper studies the dependability of component-based applications, especially embedded ones, from the diagnosis point of view. The principle of the diagnosis technique is to implement inter-component tests in order to detect and locate faulty components without redundancy. The proposed approach for diagnosing faulty components consists of two main aspects. The first concerns the execution of the inter-component tests, which requires integrating test functionality within a component; this is the subject of this paper. The second is the diagnosis process itself, which consists of the analysis of inter-component test results to determine the fault state of the whole system. Advantages of this diagnosis method, when compared to classical redundancy-based fault-tolerant techniques, are application autonomy, cost-effectiveness and better usage of system resources. Such advantages are very important for many systems and especially for embedded ones.

Keywords: Dependability, diagnosis, middleware, embedded systems, fault tolerance, inter-component testing.

9088 Semi-Automatic Artifact Rejection Procedure Based on Kurtosis, Renyi's Entropy and Independent Component Scalp Maps

Authors: Antonino Greco, Nadia Mammone, Francesco Carlo Morabito, Mario Versaci

Abstract:

Artifact rejection plays a key role in many signal processing applications. Artifacts are disturbances that can occur during signal acquisition and that can alter the analysis of the signals themselves. Our aim is to automatically remove artifacts, in particular from electroencephalographic (EEG) recordings. A technique for automatic artifact rejection, based on Independent Component Analysis (ICA) for artifact extraction and on higher-order statistics such as kurtosis and Shannon's entropy, was proposed some years ago in the literature. In this paper we try to enhance this technique by proposing a new method based on Renyi's entropy. The performance of our method was tested and compared to that of the method in the literature, and the former proved to outperform the latter.
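A hypothetical sketch of such a pipeline, with placeholder signals, thresholds and Renyi order (none of which are the authors' settings), could be:

```python
# Hypothetical sketch: ICA decomposition plus kurtosis / Renyi-entropy screening.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def renyi_entropy(x, alpha=2.0, bins=64):
    """Renyi entropy of order alpha estimated from a histogram of x."""
    p, _ = np.histogram(x, bins=bins)
    p = p[p > 0] / p.sum()
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

rng = np.random.default_rng(2)
eeg = rng.standard_normal((19, 5000))          # placeholder: 19 channels x 5000 samples
eeg[3] += 8 * (rng.random(5000) < 0.01)        # inject a spiky, blink-like artifact

ica = FastICA(n_components=19, random_state=0)
sources = ica.fit_transform(eeg.T).T           # independent components, one per row

k = kurtosis(sources, axis=1)
h = np.array([renyi_entropy(s) for s in sources])

# Flag components whose statistics deviate strongly from the across-component mean.
flagged = np.where((np.abs(k - k.mean()) > 2 * k.std()) |
                   (np.abs(h - h.mean()) > 2 * h.std()))[0]
print("components flagged as artifactual:", flagged)
```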

Keywords: Artifact, EEG, Renyi's entropy, kurtosis, independent component analysis.

9087 Fuzzy Multi-Component DEA with Shared and Undesirable Fuzzy Resources

Authors: Jolly Puri, Shiv Prasad Yadav

Abstract:

Multi-component data envelopment analysis (MC-DEA) is a popular technique for measuring the aggregate performance of decision making units (DMUs) along with their components. However, conventional MC-DEA is limited to crisp input and output data, which may not always be available in exact form; in real life problems, data may be imprecise or fuzzy. Therefore, in this paper, (i) we propose a fuzzy MC-DEA (FMC-DEA) model in which shared and undesirable fuzzy resources are incorporated, (ii) the proposed FMC-DEA model is transformed into a pair of crisp models using the α-cut approach, (iii) the fuzzy aggregate performance of a DMU and the fuzzy efficiencies of its components are defined as fuzzy numbers, and (iv) a numerical example is presented to validate the proposed approach.

Keywords: Multi-component DEA, fuzzy multi-component DEA, fuzzy resources.

9086 M-band Wavelet and Cosine Transform Based Watermark Algorithm Using Randomization and Principal Component Analysis

Authors: Tong Liu, Xuan Xu, Xiaodi Wang

Abstract:

Computational techniques derived from digital image processing are playing a significant role in the security and digital copyrights of multimedia and visual arts. This research presents a discrete M-band wavelet transform (MWT) and discrete cosine transform (DCT) based watermarking algorithm that incorporates principal component analysis (PCA). The proposed algorithm is expected to achieve higher perceptual transparency. Specifically, the developed watermarking scheme can successfully resist common signal processing attacks, such as geometric distortions and Gaussian noise. In addition, the proposed algorithm can be parameterized, thus resulting in more security. To meet these requirements, the image is transformed by a combination of MWT and DCT. In order to improve security further, we randomize the watermark image to create three code books. During watermark embedding, PCA is applied to the coefficients in the approximation sub-band. Finally, the first few component bands represent an excellent domain for inserting the watermark.

Keywords: Discrete M-band wavelet transform, discrete cosine transform, randomized watermark, principal component analysis.

9085 Performance Evaluation of Faculties of Islamic Azad University of Zahedan Branch Based-On Two-Component DEA

Authors: Ali Payan

Abstract:

The aim of this paper is to evaluate the performance of the faculties of the Islamic Azad University of Zahedan Branch based on two-component (teaching and research) decision making units (DMUs) in data envelopment analysis (DEA). Nowadays it is obvious that most systems, as DMUs, do not act as a simple input-output structure; instead, when studied more closely, they exhibit a network structure. A university is such a network, in which different sections, i.e. teaching, research, students and office work, operate as a parallel structure. They consume some inputs of the university in common and some others individually, and they produce both dependent and independent outputs. These DMUs are called two-component DMUs with network structure. In this paper, the performance of the faculties of the Zahedan branch is calculated using a relative efficiency model, and a formula to compute the relative efficiencies of the teaching and research components based on DEA is also offered.

Keywords: Data envelopment analysis, faculties of Islamic Azad University of Zahedan branch, two-component DMUs.

9084 The Giant Component in a Random Subgraph of a Weak Expander

Authors: Yilun Shang

Abstract:

In this paper, we investigate the appearance of the giant component in random subgraphs G(p) of a given large finite graph family Gn = (Vn, En), in which each edge is present independently with probability p. We show that if the graph Gn satisfies a weak isoperimetric inequality and has bounded degree, then the probability p under which G(p) has a giant component of linear order with some constant probability is bounded away from zero and one. In addition, we prove that the probability of an abnormally large order of the giant component decays exponentially. When a contact graph is modeled as Gn, our result is of special interest in the study of the spread of infectious diseases or the identification of communities in various social networks.
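As an illustrative aside (not part of the paper's proofs), edge percolation and the size of the giant component can be simulated directly; the grid graph below is only a stand-in for a bounded-degree graph:

```python
# Hypothetical sketch: edge percolation on a bounded-degree graph and the size of
# the largest component of the resulting random subgraph G(p).
import random
import networkx as nx

def largest_component_fraction(G, p, seed=0):
    """Keep each edge independently with probability p; return |giant| / |V|."""
    rng = random.Random(seed)
    H = nx.Graph()
    H.add_nodes_from(G.nodes())
    H.add_edges_from(e for e in G.edges() if rng.random() < p)
    giant = max(nx.connected_components(H), key=len)
    return len(giant) / G.number_of_nodes()

# A 2D grid is a simple bounded-degree graph used here purely for illustration.
G = nx.grid_2d_graph(100, 100)
for p in (0.3, 0.5, 0.7):
    print(p, round(largest_component_fraction(G, p), 3))
```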

Keywords: subgraph, expander, random graph, giant component, percolation.

9083 A Survey of Business Component Identification Methods and Related Techniques

Authors: Zhongjie Wang, Xiaofei Xu, Dechen Zhan

Abstract:

With the deep development of software reuse, component-related technologies have been widely applied in the development of large-scale complex applications. Component identification (CI) is one of the primary research problems in software reuse: domain business models are analyzed to obtain a set of business components with high reuse value and good reuse performance to support effective reuse. Based on the concept and classification of CI, its technical stack is briefly discussed from four views, i.e., the form of the input business models, identification goals, identification strategies, and the identification process. Various CI methods presented in the literature are then classified into four types, i.e., domain analysis based methods, cohesion-coupling based clustering methods, CRUD matrix based methods, and other methods, and these methods are compared with respect to their advantages and disadvantages. Additionally, some insufficiencies of the study of CI are discussed, and their causes are explained. Finally, the survey concludes with some promising research directions for this problem.

Keywords: Business component, component granularity, component identification, reuse performance.

9082 Multivariate Statistical Analysis of Decathlon Performance Results in Olympic Athletes (1988-2008)

Authors: Jaebum Park, Vladimir M. Zatsiorsky

Abstract:

The performance results of the athletes who competed in the 1988-2008 Olympic Games were analyzed (n = 166). The data were obtained from the IAAF official protocols. In the principal component analysis, the first three principal components explained 70% of the total variance. In the 1st principal component (with 43.1% of total variance explained) the largest factor loadings were for 100m (0.89), 400m (0.81), 110m hurdle run (0.76), and long jump (–0.72). This factor can be interpreted as the 'sprinting performance'. The loadings on the 2nd factor (15.3% of the total variance) presented a counter-intuitive throwing-jumping combination: the highest loadings were for throwing events (javelin throwing 0.76; shot put 0.74; and discus throwing 0.73) and also for jumping events (high jump 0.62; pole vaulting 0.58). On the 3rd factor (11.6% of total variance), the largest loading was for 1500 m running (0.88); all other loadings were below 0.4.

Keywords: Decathlon, principal component analysis, Olympic Games, multivariate statistical analysis.

9081 Nodal Load Profiles Estimation for Time Series Load Flow Using Independent Component Analysis

Authors: Mashitah Mohd Hussain, Salleh Serwan, Zuhaina Hj Zakaria

Abstract:

This paper presents a method to estimate load profiles from multiple power flow solutions for every minute over 24 hours per day. A method to calculate multiple solutions of a nonlinear profile is introduced. The Power System Simulation for Engineering (PSS®E) tool and Python have been used to solve the load power flow. The results of these power flow solutions have been used to estimate the load profiles for each load at the buses using Independent Component Analysis (ICA), without any knowledge of the parameters and network topology of the systems. The proposed algorithm is tested with the IEEE 69-bus test system, which represents the distribution part, and the ICA method has been programmed in MATLAB R2012b. Simulation results and estimation errors are discussed in this paper.
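A toy sketch of the ICA step on synthetic mixed load measurements (the profiles, mixing matrix and noise level are invented for illustration, not PSS®E outputs):

```python
# Hypothetical sketch: recovering individual load profiles from mixed bus
# measurements with ICA (synthetic signals, not power flow results).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
t = np.linspace(0, 24, 1440)                       # one sample per minute over 24 h

# Two synthetic "true" load profiles (residential-like and industrial-like shapes).
profiles = np.vstack([
    1.0 + 0.5 * np.sin(2 * np.pi * (t - 18) / 24),
    0.8 + 0.6 * (np.abs(t - 12) < 5).astype(float),
])

mixing = np.array([[0.7, 0.3], [0.4, 0.6], [0.2, 0.8]])   # unknown network mixing
observed = mixing @ profiles + 0.02 * rng.standard_normal((3, t.size))

ica = FastICA(n_components=2, random_state=0)
estimated = ica.fit_transform(observed.T).T        # rows are estimated load profiles
print("estimated profile matrix shape:", estimated.shape)
```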

Keywords: Electrical Distribution System, Power Flow Solution, Distribution Network, Independent Component Analysis, Newton Raphson, Power System Simulation for Engineering.

9080 A New Face Recognition Method using PCA, LDA and Neural Network

Authors: A. Hossein Sahoolizadeh, B. Zargham Heidari, C. Hamid Dehghani

Abstract:

In this paper, a new face recognition method based on PCA (Principal Component Analysis), LDA (Linear Discriminant Analysis) and neural networks is proposed. The method consists of four steps: i) preprocessing, ii) dimension reduction using PCA, iii) feature extraction using LDA, and iv) classification using a neural network. The combination of PCA and LDA is used to improve the capability of LDA when only a few image samples are available, and the neural classifier is used to reduce the number of misclassifications caused by not-linearly separable classes. The proposed method was tested on the Yale face database. Experimental results on this database demonstrate the effectiveness of the proposed method for face recognition, with fewer misclassifications in comparison with previous methods.
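A minimal sketch of the four-step pipeline using scikit-learn, with random placeholder images instead of the Yale face database and arbitrarily chosen dimensions:

```python
# Hypothetical sketch of a PCA -> LDA -> neural network pipeline
# (random placeholder data, not the Yale face database).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.random((150, 32 * 32))            # 150 face images flattened to 1024 pixels
y = rng.integers(0, 15, size=150)         # 15 subjects (labels are placeholders)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

model = make_pipeline(
    PCA(n_components=40),                              # dimension reduction
    LinearDiscriminantAnalysis(n_components=10),       # discriminative features
    MLPClassifier(hidden_layer_sizes=(50,), max_iter=500, random_state=0),
)
model.fit(X_tr, y_tr)
print("test accuracy on placeholder data:", model.score(X_te, y_te))
```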

Keywords: Face recognition, principal component analysis, linear discriminant analysis, neural networks.

9079 Multi-Objective Optimization in End Milling of Al-6061 Using Taguchi Based G-PCA

Authors: M. K. Pradhan, Mayank Meena, Shubham Sen, Arvind Singh

Abstract:

In this study, a multi-objective optimization for end milling of Al 6061 alloy is presented to provide better surface quality and a higher Material Removal Rate (MRR). The input parameters considered for the analysis are spindle speed, depth of cut and feed. The experiments were planned as per Taguchi's design of experiments, with an L27 orthogonal array. Grey Relational Analysis (GRA) has been used to transform the multiple quality responses into a single response, and the weight of each performance characteristic is determined by employing Principal Component Analysis (PCA), so that their relative importance can be properly and objectively described. The results reveal that Taguchi-based G-PCA can effectively acquire the optimal combination of cutting parameters.
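A hypothetical sketch of GRA with PCA-derived weights on synthetic responses (the MRR and surface roughness values below are invented, not the paper's L27 measurements):

```python
# Hypothetical sketch of Grey Relational Analysis with PCA-derived weights
# (synthetic responses, not the paper's experimental data).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
mrr = rng.uniform(10, 50, 27)        # material removal rate, larger-the-better
ra = rng.uniform(0.5, 3.0, 27)       # surface roughness, smaller-the-better

# Normalize each response to [0, 1] according to its desired direction.
norm_mrr = (mrr - mrr.min()) / (mrr.max() - mrr.min())
norm_ra = (ra.max() - ra) / (ra.max() - ra.min())
normalized = np.column_stack([norm_mrr, norm_ra])

# Grey relational coefficients with distinguishing coefficient zeta = 0.5.
delta = 1.0 - normalized
zeta = 0.5
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Weights taken as squared loadings of the first principal component of the GRC matrix.
pca = PCA(n_components=2).fit(grc)
weights = pca.components_[0] ** 2
weights /= weights.sum()

grade = grc @ weights                          # grey relational grade per run
print("best run index:", int(np.argmax(grade)), "weights:", np.round(weights, 3))
```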

Keywords: Material Removal Rate, Surface Roughness, Taguchi Method, Grey Relational Analysis, Principal Component Analysis.

9078 Real Time Acquisition and Analysis of Neural Response for Rehabilitative Control

Authors: Dipali Bansal, Rashima Mahajan, Shweta Singh, Dheeraj Rathee, Sujit Roy

Abstract:

Non-invasive Brain Computer Interfaces such as Electroencephalography (EEG), which directly tap neurological signals, are being widely explored these days to connect paralytic or elderly patients with the external environment. However, in India the research is confined to laboratory settings and is not reaching the masses for rehabilitation purposes. An attempt has been made in this paper to analyze EEG signals acquired in real time using the cost-effective and portable EMOTIV headset unit. Signal processing of the real-time acquired EEG is done using EEGLAB in MATLAB and the EDF Browser application software platforms. The Independent Component Analysis algorithm of EEGLAB is explored to identify deliberate eye blinks in the acquired neural signal. Time-frequency transforms and data statistics obtained using EEGLAB, along with the component activation results of EDF Browser, clearly indicate voluntary eye blinks in the AF3 channel. The spectral analysis indicates a dominant frequency component at 1.536000 Hz, representing the delta wave component of EEG during the voluntary eye blink action. An algorithm is further designed to generate an active-high signal based on a thoughtful eye blink that can be used for a plethora of control applications for rehabilitation.

Keywords: Brain Computer Interface, EDF Browser, EEG, EEGLab, EMOTIV, Real time Acquisition

9077 Comparison of Power Generation Status of Photovoltaic Systems under Different Weather Conditions

Authors: Zhaojun Wang, Zongdi Sun, Qinqin Cui, Xingwan Ren

Abstract:

Based on multivariate statistical analysis theory, this paper uses the principal component analysis method, the Mahalanobis distance analysis method and a fitting method to establish a photovoltaic health model to evaluate the health of photovoltaic panels. First of all, according to weather conditions, the photovoltaic panel variable data are classified into five categories: sunny, cloudy, rainy, foggy and overcast, and the health of photovoltaic panels in these five types of weather is studied. Secondly, a scatterplot of the relationship between the amount of electricity produced in each kind of weather and the other variables was plotted. It was found that the amount of electricity generated by photovoltaic panels has a significant nonlinear relationship with time, so the fitting method was used to fit the relationship between the amount of electricity generated and time, and a nonlinear equation was obtained. Then, the principal component analysis method was used to analyze the independent variables under the five kinds of weather conditions. According to the Kaiser-Meyer-Olkin test, three types of weather, namely overcast, foggy and sunny, meet the conditions for factor analysis, while cloudy and rainy weather do not. Through the principal component analysis method, the main components of overcast weather are temperature, AQI and PM2.5; the main component of foggy weather is temperature; and the main components of sunny weather are temperature, AQI and PM2.5. Cloudy and rainy weather require analysis of all of their variables, namely temperature, AQI, PM2.5, solar radiation intensity and time. Finally, taking the variable values in sunny weather as observed values and the main components of cloudy, foggy, overcast and rainy weather as sample data, the Mahalanobis distances between the observed values and these sample values are obtained. A comparative analysis was carried out on the degree of deviation of the Mahalanobis distance to determine the health of the photovoltaic panels under different weather conditions. It was found that the weather conditions in which the Mahalanobis distance fluctuations ranged from small to large were: foggy, cloudy, overcast and rainy.
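A small sketch of the Mahalanobis-distance step under an assumed sunny-weather baseline (the variables and numbers are placeholders, not the paper's data):

```python
# Hypothetical sketch: Mahalanobis distance of weather-condition samples from a
# sunny-weather baseline (synthetic values, not the paper's measurements).
import numpy as np

rng = np.random.default_rng(6)
# Columns: temperature, AQI, PM2.5 (the "main components" named in the abstract).
sunny = rng.normal([25.0, 60.0, 35.0], [3.0, 10.0, 8.0], size=(200, 3))
overcast = rng.normal([18.0, 90.0, 55.0], [3.0, 10.0, 8.0], size=(50, 3))

mean = sunny.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(sunny, rowvar=False))

def mahalanobis(x):
    d = x - mean
    return np.sqrt(np.einsum("ij,jk,ik->i", d, cov_inv, d))

# Larger distances indicate stronger deviation from the sunny-weather baseline.
print("mean distance, sunny   :", mahalanobis(sunny).mean().round(2))
print("mean distance, overcast:", mahalanobis(overcast).mean().round(2))
```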

Keywords: Fitting, principal component analysis, Mahalanobis distance, SPSS, MATLAB.

9076 Modelling of Heating and Evaporation of Biodiesel Fuel Droplets

Authors: Mansour Al Qubeissi, Sergei S. Sazhin, Cyril Crua, Morgan R. Heikal

Abstract:

This paper presents the application of the Discrete Component Model for heating and evaporation to multi-component biodiesel fuel droplets in direct injection internal combustion engines. This model takes into account the effects of temperature gradient, recirculation and species diffusion inside droplets. A distinctive feature of the model used in the analysis is that it is based on the analytical solutions to the temperature and species diffusion equations inside the droplets. Nineteen types of biodiesel fuels are considered. It is shown that a simplistic model, based on the approximation of biodiesel fuel by a single component or ignoring the diffusion of components of biodiesel fuel, leads to noticeable errors in predicted droplet evaporation time and time evolution of droplet surface temperature and radius.

Keywords: Heat/Mass Transfer, Biodiesel, Multi-component Fuel, Droplet, Evaporation.

9075 Implementation of SSL Using Information Security Component Interface

Authors: Jong-Whoi Shin, Chong-Sun Hwang

Abstract:

Various security APIs (Application Programming Interfaces) are being used in a variety of application areas requiring the information security function. However, these standards are not compatible, and the developer must use those APIs selectively depending on the application environment or the programming language. To resolve this problem, we propose a standard draft of the information security component, and SSL (Secure Sockets Layer) using the confidentiality and integrity component interface has been implemented to verify the validity of the proposed standard. The implemented SSL uses the lower-level SSL component when establishing the RMI (Remote Method Invocation) communication between components, as if the security algorithm had been implemented by adding one more layer on top of TCP/IP.

Keywords: Component Based Design, Application Programming Interface, Secure Socket Layer, Remote Method Invocation.

9074 Data and Spatial Analysis for Economy and Education of 28 E.U. Member-States for 2014

Authors: Alexiou Dimitra, Fragkaki Maria

Abstract:

The objective of the paper is the study of geographic, economic and educational variables and their contribution to determining the position of each member state among the EU-28 countries, based on the values of seven variables as given by Eurostat. The data analysis methods of Multiple Factorial Correspondence Analysis (MFCA), Principal Component Analysis and Factor Analysis have been used. The cross-tabulation tables of data consist of the values of seven variables for the 28 countries for 2014. The data are manipulated using the CHIC Analysis V 1.1 software package. The results of this program, using MFCA and Ascending Hierarchical Classification, are given in numerical and graphical form. For comparison, the Factor procedure of the statistical package IBM SPSS 20 has been used on the same data. The numerical and graphical results, presented with tables and graphs, demonstrate the agreement between the two methods. The most important result is the study of the relation between the 28 countries and the position of each country in groups or clouds, which are formed according to the values of the corresponding variables.

Keywords: Multiple factorial correspondence analysis, principal component analysis, factor analysis, E.U.-28 countries, statistical package IBM SPSS 20, CHIC Analysis V 1.1 Software, Eurostat.eu statistics.

9073 Kohonen Self-Organizing Maps as a New Method for Determination of Salt Composition of Multi-Component Solutions

Authors: Sergey A. Burikov, Tatiana A. Dolenko, Kirill A. Gushchin, Sergey A. Dolenko

Abstract:

The paper presents the results of clusterization by Kohonen self-organizing maps (SOM) applied to the analysis of an array of Raman spectra of multi-component solutions of inorganic salts, for determination of the types of salts present in the solution. It is demonstrated that the use of SOM is a promising method for the solution of clusterization and classification problems in the spectroscopy of multicomponent objects, as attributing a pattern to some cluster may be used for recognition of the component composition of the object.
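A compact, from-scratch SOM sketch on synthetic spectrum-like vectors (the Gaussian-band "spectra" and map size are illustrative assumptions, not the authors' Raman data):

```python
# Hypothetical sketch of a minimal Kohonen SOM clustering synthetic "spectra"
# (random Gaussian bands standing in for Raman spectra of salt solutions).
import numpy as np

rng = np.random.default_rng(7)
wavenumbers = np.linspace(0, 1, 200)

def fake_spectrum(center):
    """A single Gaussian band plus noise, as a stand-in for a Raman spectrum."""
    return np.exp(-((wavenumbers - center) ** 2) / 0.002) + 0.05 * rng.random(200)

# Two "salt types" produce bands at different positions.
data = np.array([fake_spectrum(c) for c in [0.3] * 20 + [0.7] * 20])

# Train a 1 x 4 SOM: 4 map nodes, each with a 200-dimensional weight vector.
nodes = rng.random((4, 200))
for epoch in range(100):
    lr = 0.5 * (1 - epoch / 100)                  # decaying learning rate
    for x in data[rng.permutation(len(data))]:
        winner = np.argmin(np.linalg.norm(nodes - x, axis=1))
        for j in range(4):                        # neighborhood shrinks with map distance
            h = np.exp(-((j - winner) ** 2) / 2.0)
            nodes[j] += lr * h * (x - nodes[j])

assignments = [int(np.argmin(np.linalg.norm(nodes - x, axis=1))) for x in data]
print("cluster (winning node) per spectrum:", assignments)
```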

Keywords: Kohonen self-organizing maps, clusterization, multicomponent solutions, Raman spectroscopy.

9072 An Automated Stock Investment System Using Machine Learning Techniques: An Application in Australia

Authors: Carol Anne Hargreaves

Abstract:

A key issue in stock investment is how to select representative features for stock selection. The objective of this paper is firstly to determine whether an automated stock investment system, using machine learning techniques, may be used to identify a portfolio of growth stocks that are highly likely to provide returns better than the stock market index. The second objective is to identify the technical features that best characterize whether a stock’s price is likely to go up, and to identify the most important factors and their contribution to predicting the likelihood of the stock price going up. Unsupervised machine learning techniques, such as cluster analysis, were applied to the stock data to identify a cluster of stocks that were likely to go up in price (portfolio 1). Next, the principal component analysis technique was used to select stocks that were rated high on component one and component two (portfolio 2). Thirdly, a supervised machine learning technique, the logistic regression method, was used to select stocks with a high probability of their price going up (portfolio 3). The predictive models were validated with metrics such as sensitivity (recall), specificity and overall accuracy for all models. All accuracy measures were above 70%. All portfolios outperformed the market by more than eight times. The top three stocks were selected for each of the three stock portfolios and traded in the market for one month. After one month the return for each stock portfolio was computed and compared with the stock market index returns. The returns for the three stock portfolios were 23.87% for the principal component analysis stock portfolio, 11.65% for the logistic regression portfolio and 8.88% for the K-means cluster portfolio, while the stock market return was 0.38%. This study confirms that an automated stock investment system using machine learning techniques can identify top performing stock portfolios that outperform the stock market.
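A toy sketch of the supervised step only, ranking stocks by predicted probability of a price rise (the feature names and data are placeholders, not market data or the paper's feature set):

```python
# Hypothetical sketch: ranking stocks by the predicted probability that their
# price goes up, using logistic regression on synthetic features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(8)
features = ["momentum", "volume_change", "pe_ratio", "volatility"]   # placeholders
X = rng.standard_normal((200, len(features)))                        # 200 stocks
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.standard_normal(200) > 0).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

prob_up = model.predict_proba(X)[:, 1]
top3 = np.argsort(prob_up)[::-1][:3]     # portfolio of the three highest-probability stocks
print("selected stock indices:", top3, "probabilities:", np.round(prob_up[top3], 3))
```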

Keywords: Machine learning, stock market trading, logistic regression, principal component analysis, automated stock investment system.

9071 A Specification-Based Approach for Retrieval of Reusable Business Component for Software Reuse

Authors: Meng Fanchao, Zhan Dechen, Xu Xiaofei

Abstract:

Software reuse can be considered the most realistic and promising way to improve software engineering productivity and quality. Automated assistance for software reuse involves the representation, classification, retrieval and adaptation of components. The representation and retrieval of components are important to software reuse in Component-Based Software Development (CBSD). However, current industrial component models mainly focus on implementation techniques and ignore semantic information about components, so it is difficult to retrieve the components that satisfy a user's requirements. This paper presents a method of business component retrieval based on specification matching to support software reuse in enterprise information systems. First, a reuse-oriented business component model is proposed. In our model, the business data type is represented as a sign data type based on XML, which can express the variable business data types that describe the variety of business operations. Based on this model, we propose specification match relationships at two levels: the business operation level and the business component level. At the business operation level, we use the input business data types, the output business data types and the taxonomy of business operations to evaluate the similarity between business operations. At the business component level, we propose five specification matches between business components. To retrieve reusable business components, we propose a measure of similarity degree to calculate the similarities between business components. Finally, an SQL-like business component retrieval command is proposed to help users retrieve approximate business components from a component repository.

Keywords: Business component, business operation, business data type, specification matching.

9070 Spatial Distribution and Risk Assessment of As, Hg, Co and Cr in Kaveh Industrial City, using Geostatistic and GIS

Authors: Abbas Hani

Abstract:

The concentrations of As, Hg, Co, Cr and Cd were measured for each soil sample, and their spatial patterns were analyzed by the semivariogram approach of geostatistics and geographical information system technology. Multivariate statistical approaches (principal component analysis and cluster analysis) were used to identify heavy metal sources and their spatial patterns. Principal component analysis, coupled with the correlations between heavy metals, showed that the primary inputs of As, Hg and Cd were anthropogenic, while Co and Cr were associated with pedogenic factors. Ordinary kriging was carried out to map the spatial patterns of the heavy metals. The main pollution sources identified were related to the use of urban and industrial wastewater. The results of this study are helpful for risk assessment of environmental pollution and for decision making on industrial adjustment and the remediation of soil pollution.

Keywords: Geographic Information system, Geostatistics, Kaveh, Multivariate Statistical Analysis.

9069 Interoperability in Component Based Software Development

Authors: M. Madiajagan, B. Vijayakumar

Abstract:

Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application, and data compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate the reuse of off-the-shelf components. The focus of these models is syntactic interface specification, component packaging, inter-component communications, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns: specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on an assembly of components available on a local area network or on the net. These components must be localized and identified in terms of available services and communication protocols before any request. The first part of the article introduces the basic concepts of components and middleware, while the following sections describe the different up-to-date models of communication and interaction, and the last section shows how different models can communicate among themselves.

Keywords: Interoperability, component packaging, communication technology, heterogeneous platform, component interface, middleware.

9068 Incremental Learning of Independent Topic Analysis

Authors: Takahiro Nishigaki, Katsumi Nitta, Takashi Onoda

Abstract:

In this paper, we present a method for applying Independent Topic Analysis (ITA) to a growing collection of document data. The amount of document data has been increasing since the spread of the Internet, and ITA was presented as one method to analyze such data. ITA extracts independent topics from document data by using Independent Component Analysis (ICA), a technique from signal processing; however, it is difficult to apply ITA to an increasing amount of document data, because ITA must use all of the document data, so its temporal and spatial costs are very high. Therefore, we present Incremental ITA, which extracts independent topics from an increasing amount of document data. Incremental ITA updates the independent topics when new document data are added, starting from the independent topics extracted from the previous data. Finally, we show the results of applying Incremental ITA to benchmark datasets.

Keywords: Text mining, topic extraction, independent, incremental, independent component analysis.

9067 A Quantitative Approach to Strategic Design of Component-Based Business Process Models

Authors: Eakong Atiptamvaree, Twittie Senivongse

Abstract:

A new paradigm for software design and development models software by its business process, translates the model into a process execution language, and has it run by a supporting execution engine. This process-oriented paradigm promotes modeling of software by less technical users or business analysts, as well as rapid development. Since business process models may be shared by different organizations, and sometimes even by different business domains, it is interesting to apply a technique used in traditional software component technology to design reusable business processes. This paper discusses an approach that applies a technique for software component fabrication to the design of process-oriented software units, called process components. These process components result from decomposing a business process of a particular application domain into subprocesses, with the aim that the process components can be reused in different process-based software models. The approach is quantitative because the quality of a process component design is measured from technical features of the process components. It is also strategic because the measured quality is evaluated against business-oriented component management goals. A software tool has been developed to measure how good a process component design is according to the required managerial goals and in comparison with other designs. We also discuss how we benefit from reusable process components.

Keywords: Business process model, process component, component management goals, measurement

9066 Evaluation Process for the Hardware Safety Integrity Level

Authors: Sung Kyu Kim, Yong Soo Kim

Abstract:

Safety instrumented systems (SISs) are becoming increasingly complex, and the proportion of programmable electronic parts is growing. The IEC 61508 global standard was established to ensure the functional safety of SISs, but it is expressed in highly macroscopic terms. This study introduces an evaluation process for hardware safety integrity levels through failure modes, effects, and diagnostic analysis (FMEDA). FMEDA is widely used to evaluate safety levels, and it provides the information on failure rates and failure mode distributions necessary to calculate a diagnostic coverage factor for a given component. In our evaluation process, the components of the SIS subsystem are first defined in terms of failure modes and effects. Then, the failure rate and failure mechanism distribution are assigned to each component. The safety mode and detectability of each failure mode are determined for each component. Finally, the hardware safety integrity level is evaluated based on the calculated results.
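A small sketch of the aggregation arithmetic behind such an evaluation, with invented failure rates (the component names and values are illustrative, not from the standard or the paper):

```python
# Hypothetical sketch: aggregating FMEDA-style failure rates for one subsystem and
# computing diagnostic coverage (DC) and safe failure fraction (SFF).

# Per-component failure rates in FIT (failures per 1e9 hours), split by failure class:
# safe-detected, safe-undetected, dangerous-detected, dangerous-undetected.
components = {
    "sensor":        {"sd": 50.0, "su": 30.0, "dd": 40.0, "du": 10.0},
    "logic_solver":  {"sd": 20.0, "su": 10.0, "dd": 25.0, "du": 5.0},
    "final_element": {"sd": 60.0, "su": 40.0, "dd": 35.0, "du": 15.0},
}

totals = {k: sum(c[k] for c in components.values()) for k in ("sd", "su", "dd", "du")}
lam_total = sum(totals.values())

# Diagnostic coverage: share of dangerous failures that the diagnostics detect.
dc = totals["dd"] / (totals["dd"] + totals["du"])
# Safe failure fraction: everything except dangerous-undetected failures.
sff = 1.0 - totals["du"] / lam_total

print(f"lambda_total = {lam_total:.1f} FIT, DC = {dc:.2%}, SFF = {sff:.2%}")
# The hardware safety integrity level is then read from the IEC 61508 architectural
# constraint tables using the SFF and the hardware fault tolerance of the subsystem.
```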

Keywords: Safety instrumented system; Safety integrity level; Failure modes, effects, and diagnostic analysis; IEC 61508.

9065 A New Traffic Pattern Matching for DDoS Traceback Using Independent Component Analysis

Authors: Yuji Waizumi, Tohru Sato, Yoshiaki Nemoto

Abstract:

Recently, Denial of Service (DoS) attacks and Distributed DoS (DDoS) attacks, which are a stronger form of DoS attack launched from multiple hosts, have become security threats on the Internet. It is important to identify the attack source and to block attack traffic as one of the measures against these attacks. In general, it is difficult to identify them because information about the attack source is falsified. Therefore a method of identifying the attack source by tracing the route of the attack traffic is necessary. A traceback method that uses traffic patterns, i.e. changes in the number of packets over time, as criteria for attack traceback has been proposed. The traceback method using traffic patterns can trace the attack by matching the shapes of input traffic patterns and the shape of the output traffic pattern observed at a network branch point such as a router. A traffic pattern is the shape of the traffic over time and is unfalsifiable information. However, the traceback methods proposed to date cannot achieve sufficient tracing accuracy, because they directly use traffic patterns that are influenced by non-attack traffic. In this paper, a new traffic pattern matching method using Independent Component Analysis (ICA) is proposed.
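A toy sketch of the idea, separating a synthetic mixture with ICA and matching the recovered sources to a candidate attack pattern by correlation (all signals below are invented):

```python
# Hypothetical sketch: separating an attack traffic pattern from observed mixtures
# with ICA and matching it against a candidate upstream pattern (synthetic counts).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(9)
t = np.arange(600)                                   # packet counts per time slot

attack = np.where((t > 200) & (t < 400), 80.0, 0.0)  # burst-shaped attack pattern
normal = 20.0 + 5.0 * np.sin(2 * np.pi * t / 300)    # benign background traffic

# Two observation points see different mixtures of the two patterns.
observed = np.vstack([0.9 * attack + 0.4 * normal,
                      0.2 * attack + 1.1 * normal])
observed += rng.normal(0, 1.0, observed.shape)

sources = FastICA(n_components=2, random_state=0).fit_transform(observed.T).T

# Match each recovered source against the upstream candidate by correlation;
# the source best correlated with the burst shape points toward the attack route.
for i, s in enumerate(sources):
    corr = abs(np.corrcoef(s, attack)[0, 1])
    print(f"source {i}: |correlation with attack pattern| = {corr:.2f}")
```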

Keywords: Distributed Denial of Service, Independent Component Analysis, Traffic pattern

9064 An ICA Algorithm for Separation of Convolutive Mixture of Speech Signals

Authors: Rajkishore Prasad, Hiroshi Saruwatari, Kiyohiro Shikano

Abstract:

This paper describes an Independent Component Analysis (ICA) based fixed-point algorithm for the blind separation of a convolutive mixture of speech, picked up by a linear microphone array. The proposed algorithm extracts independent sources by non-Gaussianizing the Time-Frequency Series of Speech (TFSS) in a deflationary way. The degree of non-Gaussianization is measured by negentropy. The relative performances of the algorithm under random initialization and Null Beamformer (NBF) based initialization are studied. It has been found that an NBF-based initial value gives faster convergence as well as better separation performance.
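A brief sketch of the log-cosh negentropy approximation commonly used as a non-Gaussianity measure in fixed-point ICA (the signals are illustrative; this is not the authors' TFSS processing):

```python
# Hypothetical sketch: the log-cosh negentropy approximation as a
# non-Gaussianity measure (illustrative signals only).
import numpy as np

rng = np.random.default_rng(10)

def negentropy_approx(y, a=1.0, n_gauss=100000):
    """J(y) ~ (E[G(y)] - E[G(nu)])^2 with G(u) = log(cosh(a*u))/a, nu ~ N(0, 1)."""
    y = (y - y.mean()) / y.std()                 # negentropy assumes unit variance
    g = lambda u: np.log(np.cosh(a * u)) / a
    nu = rng.standard_normal(n_gauss)
    return (g(y).mean() - g(nu).mean()) ** 2

gaussian = rng.standard_normal(20000)
speech_like = rng.laplace(size=20000)            # super-Gaussian, like speech samples

print("negentropy (Gaussian)    :", f"{negentropy_approx(gaussian):.5f}")
print("negentropy (speech-like) :", f"{negentropy_approx(speech_like):.5f}")
```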

Keywords: Blind signal separation, independent component analysis, negentropy, convolutive mixture.
