Search results for: Level Sets
3857 A Note on the Minimum Cardinality of Critical Sets of Inertias for Irreducible Zero-nonzero Patterns of Order 4
Authors: Ber-Lin Yu, Ting-Zhu Huang
Abstract:
If there exists a nonempty, proper subset S of the set of all (n+1)(n+2)/2 inertias such that S ⊆ i(A) is sufficient for any n×n zero-nonzero pattern A to be inertially arbitrary, then S is called a critical set of inertias for zero-nonzero patterns of order n. If no proper subset of S is a critical set, then S is called a minimal critical set of inertias. In [Kim, Olesky and Driessche, Critical sets of inertias for matrix patterns, Linear and Multilinear Algebra, 57 (3) (2009) 293-306], identifying all minimal critical sets of inertias for n×n zero-nonzero patterns with n ≥ 3 and determining the minimum cardinality of such a set are posed as two open questions. In this note, the minimum cardinality of all critical sets of inertias for 4×4 irreducible zero-nonzero patterns is identified.
Keywords: Zero-nonzero pattern, inertia, critical set of inertias, inertially arbitrary.
3856 Fuzzy Mathematical Morphology approach in Image Processing
Authors: Yee Yee Htun, Dr. Khaing Khaing Aye
Abstract:
Morphological operators transform the original image into another image through interaction with another image of a certain shape and size, known as the structuring element. Mathematical morphology provides a systematic approach to analyzing the geometric characteristics of signals or images, and has been applied widely to many tasks such as edge detection, object segmentation, and noise suppression. Fuzzy mathematical morphology aims to extend the binary morphological operators to grey-level images. In order to define the basic morphological operations such as fuzzy erosion, dilation, opening and closing, a general method based upon fuzzy implication and inclusion grade operators is introduced. The fuzzy morphological operations extend the ordinary morphological operations by using fuzzy sets, where the union operation is replaced by a maximum operation and the intersection operation is replaced by a minimum operation. This work consists of two parts. In the first, fuzzy set theory, fuzzy mathematical morphology based on fuzzy logic and fuzzy set theory, and fuzzy morphological operations and their properties are studied in detail. In the second, the application of fuzziness in mathematical morphology to practical work such as image processing is discussed with illustrative problems.
Keywords: Binary morphology, Fuzzy sets, Grayscale morphology, Image processing, Mathematical morphology.
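As a concrete illustration of the max/min construction described above, the following minimal sketch (our illustration in Python/NumPy, not the authors' implementation) computes grey-level dilation and erosion with a flat structuring element; fuzzy opening and closing then compose the two:

import numpy as np

def fuzzy_dilate(image, se):
    # image: 2-D array of memberships in [0, 1]
    # se: flat structuring element as (dy, dx) offsets; include (0, 0)
    h, w = image.shape
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            out[y, x] = max(image[y + dy, x + dx]          # fuzzy union = max
                            for dy, dx in se
                            if 0 <= y + dy < h and 0 <= x + dx < w)
    return out

def fuzzy_erode(image, se):
    h, w = image.shape
    out = np.zeros_like(image)
    for y in range(h):
        for x in range(w):
            out[y, x] = min(image[y + dy, x + dx]          # fuzzy intersection = min
                            for dy, dx in se
                            if 0 <= y + dy < h and 0 <= x + dx < w)
    return out

# opening: fuzzy_dilate(fuzzy_erode(img, se), se); closing: the reverse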
3855 Multi-labeled Data Expressed by a Set of Labels
Authors: Tetsuya Furukawa, Masahiro Kuzunishi
Abstract:
Collected data must be organized to be utilized efficiently, and hierarchical classification is an effective way to organize it. When data is classified into multiple categories or annotated with a set of labels, users request multi-labeled data by giving a set of labels. There are several interpretations of which data a set of labels expresses. This paper addresses that question by introducing orders on sets of labels, and shows that there are four types of order, characterized by whether the labels of the expressed data must include every label of the given set and whether they must stay within the range of that set. Two desirable properties of the orders are also discussed: that data is likewise expressed by any higher set of labels, and that different sets of labels express different data.
Keywords: Classification Hierarchies, Multi-labeled Data, Multiple Classification, Orders of Sets of Labels
3854 Heterogeneous Attribute Reduction in Noisy System based on a Generalized Neighborhood Rough Sets Model
Authors: Siyuan Jing, Kun She
Abstract:
Neighborhood Rough Sets (NRS) has proven to be an efficient tool for heterogeneous attribute reduction. However, most research has focused on complete and noiseless data, whereas most real information systems are noisy, i.e., filled with incomplete and inconsistent data. In this paper, we introduce a generalized neighborhood rough sets model, called VPTNRS, to deal with heterogeneous attribute reduction in noisy systems. We generalize the classical NRS model with a tolerance neighborhood relation and probabilistic theory. Furthermore, we use neighborhood dependency to evaluate the significance of a subset of heterogeneous attributes and construct a forward greedy algorithm for attribute reduction based on it. Experimental results show that the model deals with noisy data efficiently.
Keywords: attribute reduction, incomplete data, inconsistent data, tolerance neighborhood relation, rough sets
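A minimal sketch of the dependency-driven greedy reduction (classical neighborhood form; the tolerance-relation and probabilistic refinements of VPTNRS are omitted, and the threshold name delta is ours to illustrate):

import numpy as np

def neighborhood_dependency(X, y, attrs, delta=0.2):
    # fraction of samples whose delta-neighbourhood (on attrs) is
    # decision-consistent, i.e. the positive-region size over |U|
    Xa = X[:, attrs]
    pos = 0
    for i in range(len(X)):
        dist = np.linalg.norm(Xa - Xa[i], axis=1)
        if np.all(y[dist <= delta] == y[i]):
            pos += 1
    return pos / len(X)

def greedy_reduct(X, y, delta=0.2, eps=1e-3):
    remaining = list(range(X.shape[1]))
    reduct, best = [], 0.0
    while remaining:
        gamma, a = max((neighborhood_dependency(X, y, reduct + [a], delta), a)
                       for a in remaining)
        if gamma - best < eps:        # no significant dependency gain: stop
            break
        reduct.append(a)
        remaining.remove(a)
        best = gamma
    return reduct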
3853 Relationship between Level of Physical Activity and Exercise Imagery among Klang Valley Citizens
Authors: Kok, M.O., Omar-Fauzee, M.S., Rosli, M.H.
Abstract:
This study investigated the relationship between exercise imagery use and level of physical activity within a wide range of exercisers in Klang Valley, Malaysia. One hundred and twenty-four respondents (mean age = 28.92, SD = 9.34) completed two sets of questionnaires (Exercise Imagery Inventory and Leisure-Time Exercise Questionnaire) measuring the use of imagery and exercise frequency of participants. Exercise imagery was found to be significantly correlated with level of physical activity. Variables such as gender, age and ethnicity that may affect imagery use and exercise frequency were also assessed; among them, only ethnicity showed a significant difference in level of physical activity (p < 0.05). The findings suggest that further investigation should examine other variables, such as socioeconomic status, educational level, and self-efficacy, that may affect imagery use and frequency of physical activity among exercisers.
Keywords: Physical activity, exercise imagery, Exercise Imagery Inventory, Leisure-Time Exercise Questionnaire
3852 Piecewise Interpolation Filter for Effective Processing of Large Signal Sets
Authors: Anatoli Torokhti, Stanley Miklavcic
Abstract:
Suppose KY and KX are large sets of observed and reference signals, respectively, each containing N signals. Is it possible to construct a filter F : KY → KX that requires a priori information on only a few signals, p ≪ N, from KX, yet performs better than the known filters based on a priori information on every reference signal from KX? It is shown that the positive answer is achievable under quite unrestrictive assumptions. The device behind the proposed method is a special extension of the piecewise linear interpolation technique to the case of random signal sets. The proposed technique provides a single filter to process any signal from the arbitrarily large signal set. The filter is determined in terms of pseudo-inverse matrices, so that it always exists.
Keywords: Wiener filter, filtering of stochastic signals.
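As a toy illustration of a filter "determined in terms of pseudo-inverse matrices" (a sketch under our own simplifying assumptions, not the paper's piecewise construction), a single linear filter can be fitted from just p training pairs:

import numpy as np

def fit_filter(Y, X):
    # rows of Y: p observed signals; rows of X: matching reference signals
    # returns F with F @ y ≈ x in the least-squares sense; the
    # pseudo-inverse always exists, so F always exists
    return X.T @ np.linalg.pinv(Y.T)

# usage: F = fit_filter(Y_train, X_train); x_hat = F @ y_new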
3851 Comanche – A Compiler-Driven I/O Management System
Authors: Wendy Zhang, Ernst L. Leiss, Huilin Ye
Abstract:
Most scientific programs have large input and output data sets that require out-of-core programming or virtual memory management (VMM). Out-of-core programming is very error-prone and tedious and, as a result, is generally avoided. However, in many instances VMM is not an effective approach either, because it often results in substantial performance reduction. In contrast, compiler-driven I/O management allows a program's data sets to be retrieved in parts, called blocks or tiles. Comanche (COmpiler MANaged caCHE) is a compiler combined with a user-level runtime system that can be used to replace standard VMM for out-of-core programs. We describe Comanche and demonstrate on a number of representative problems that it substantially outperforms VMM. Significantly, our system does not require any special services from the operating system and does not require modification of the operating system kernel.
Keywords: I/O Management, Out-of-core, Compiler, Tile mapping.
3850 Holomorphic Prioritization of Sets within Decagram of Strategic Decision Making of POSM Using Operational Research (OR): Analytic Hierarchy Process (AHP) Analysis
Authors: Elias O. Tembe, Hussain A. Al-Salamin
Abstract:
There is a decagram of strategic decisions of operations and production/service management (POSM) within operational research (OR) which must be collated, namely: design, inventory, quality, location, process and capacity, layout, scheduling, maintenance, and supply chain. This paper presents a conceptual framework for the architectural configuration of this decagram of decision sets in the form of a mathematical complete graph and an abelian graph. Mathematically, a complete graph, whether undirected (UDG) or directed (DG), is one in which every pair of vertices is connected, collated, confluent, and holomorphic. No study, however, has yet prioritized the holomorphic sets of POSM within the OR field. This study utilizes the structured OR technique known as the Analytic Hierarchy Process (AHP) for organizing, sorting and prioritizing (ranking) the sets within the decagram of POSM according to their attribution (propensity), and analyzes how the prioritization applies to the real world of the 21st century.
Keywords: AHP analysis, Decagram, Decagon, Holomorphic.
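For orientation, the core AHP prioritization step is standard: derive a priority (weight) vector for the ten POSM decision areas from a pairwise-comparison matrix via its principal eigenvector, checking consistency. A generic sketch (ours, in Python):

import numpy as np

def ahp_weights(A):
    # A: n x n positive reciprocal pairwise-comparison matrix
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                            # priority vector
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)       # consistency index
    return w, ci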
3849 New Scheme in Determining nth Order Diagrams for Cross Multiplication Method via Combinatorial Approach
Authors: Sharmila Karim, Haslinda Ibrahim, Zurni Omar
Abstract:
In this paper, a new recursive strategy is proposed for determining $\frac{(n-1)!}{2}$ of the $n$th order diagrams. The generalization of the $n$th diagram for the cross multiplication method was proposed by Pavlovic and Bankier, but a specific rule for determining $\frac{(n-1)!}{2}$ of the $n$th order diagrams for a square matrix is yet to be discovered. Thus, using a combinatorial approach, $\frac{(n-1)!}{2}$ of the $n$th order diagrams are presented as $\frac{(n-1)!}{2}$ starter sets, generated by exchanging one element. The advantages of this new strategy are that the discarding process is eliminated and that the signs of the starter sets simply alternate.
Keywords: starter sets, permutation, exchanging one element, determinant
3848 REDUCER – An Architectural Design Pattern for Reducing Large and Noisy Data Sets
Authors: Apkar Salatian
Abstract:
To relieve the burden of reasoning on a point-to-point basis, many domains require large and noisy data sets to be reduced to trends for qualitative reasoning. In this paper we propose and describe a new architectural design pattern called REDUCER for reducing large and noisy data sets, which can be tailored for particular situations. REDUCER consists of two consecutive processes: Filter, which takes the original data and removes outliers, inconsistencies or noise; and Compression, which takes the filtered data and derives trends in it. In this article we also show how REDUCER has been successfully applied to three different case studies.
Keywords: Design Pattern, filtering, compression.
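A minimal sketch of the two-stage pattern (the concrete stages here — a median filter and a tolerance-based trend segmenter — are illustrative placeholders of ours, not the case-study components):

import statistics

def filter_stage(data, window=5):
    # remove outliers/noise with a sliding median
    half = window // 2
    return [statistics.median(data[max(0, i - half):i + half + 1])
            for i in range(len(data))]

def compression_stage(data, tol=0.5):
    # collapse runs staying within tol into (start, end, mean) trends
    trends, start = [], 0
    for i in range(1, len(data) + 1):
        if i == len(data) or abs(data[i] - data[start]) > tol:
            seg = data[start:i]
            trends.append((start, i - 1, sum(seg) / len(seg)))
            start = i
    return trends

trends = compression_stage(filter_stage([1.0, 9.0, 1.2, 1.1, 5.0, 5.2, 5.1]))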
3847 Novel and Different Definitions for Fuzzy Union and Intersection Operations
Authors: Aarthi Chandramohan, M. V. C. Rao
Abstract:
This paper presents three new methodologies for the basic fuzzy operations, aimed at finding new ways of computing union (maximum) and intersection (minimum) membership values by taking into account the entire membership values in a fuzzy set. The new methodologies are conceptually simple and easy from the application point of view, and are illustrated with a variety of problems, such as the Cartesian product of two fuzzy sets, max-min composition of two fuzzy sets in different product spaces, and an application to an inverted pendulum to determine the impact of the new methodologies. The results clearly indicate differences based on the nature of the fuzzy sets under consideration, and hence will be highly useful in applications where different values have a significant impact on the behavior of the system.
Keywords: Centroid, fuzzy set operations, intersection, triangular norms, triangular S-norms, union.
3846 Aerial Firefighting Aircraft Selection with Standard Fuzzy Sets using Multiple Criteria Group Decision Making Analysis
Authors: C. Ardil
Abstract:
Aircraft selection decisions can be challenging due to their multidimensional and interdisciplinary nature. They involve multiple stakeholders with conflicting objectives and numerous alternative options with uncertain outcomes. This study focuses on the analysis of aerial firefighting aircraft that can be chosen for the Air Fire Service to extinguish forest fires. To make such a selection, the characteristics of the fire zones must be considered, and the capability to manage the logistics involved in such operations, as well as the purchase and maintenance of the aircraft, must be determined. The selection of firefighting aircraft is particularly complex because they have longer fleet lives and require more demanding operation and maintenance than scheduled passenger air service. This paper aims to use the fuzzy proximity measure method to select the most appropriate aerial firefighting aircraft based on decision criteria using multiple attribute decision making analysis. Following fuzzy decision analysis, the most suitable aerial firefighting aircraft is ranked and determined for the Air Fire Service.
Keywords: Aerial firefighting aircraft selection, multiple criteria decision making, fuzzy sets, standard fuzzy sets, determinate fuzzy sets, indeterminate fuzzy sets, proximity measure method, Minkowski distance family function, Hausdorff distance function, MCDM, PMM, PMM-F
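The abstract names the ingredients (attribute weights, Minkowski-family distances) rather than the full procedure; a crisp distance-to-ideal ranking in that spirit might look like the sketch below (our simplification, not the fuzzy proximity measure method itself):

import numpy as np

def rank_alternatives(scores, weights, p=2):
    # scores: m x n benefit-type decision matrix; weights sum to 1
    norm = scores / scores.max(axis=0)       # normalise per attribute
    ideal = norm.max(axis=0)                 # positive-ideal profile
    d = (weights * np.abs(norm - ideal) ** p).sum(axis=1) ** (1 / p)
    return np.argsort(d)                     # closest to ideal ranks first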
3845 Comparative Study of Decision Trees and Rough Sets Theory as Knowledge Extraction Tools for Design and Control of Industrial Processes
Authors: Marcin Perzyk, Artur Soroczynski
Abstract:
General requirements for knowledge representation in the form of logic rules, applicable to the design and control of industrial processes, are formulated. The characteristic behavior of decision trees (DTs) and rough sets theory (RST) in extracting rules from recorded data is discussed and illustrated with simple examples. The significance of the models' drawbacks was evaluated using simulated and industrial data sets. It is concluded that the performance of DTs may be considerably poorer than that of RST in several important aspects, particularly when not only a characterization of a problem is required but also detailed and precise rules, suited to the actual, specific problems to be solved.
Keywords: Knowledge extraction, decision trees, rough sets theory, industrial processes.
3844 A Constructive Proof of the General Brouwer Fixed Point Theorem and Related Computational Results in General Non-Convex sets
Authors: Menglong Su, Shaoyun Shi, Qing Xu
Abstract:
In this paper, by introducing twice continuously differentiable mappings, we develop an interior path following method which enables us to give a constructive proof of the general Brouwer fixed point theorem and thus to solve fixed point problems in a class of non-convex sets. Under suitable conditions, a smooth path can be proven to exist. This can lead to an implementable, globally convergent algorithm. Several numerical examples are given to illustrate the results of this paper.
Keywords: interior path following method, general Brouwer fixed point theorem, non-convex sets, globally convergent algorithm
3843 A New Reliability Allocation Method Based On Fuzzy Numbers
Authors: Peng Li, Chuanri Li, Tao Li
Abstract:
Reliability allocation is quite important during the early design and development stages of a system, to apportion its specified reliability goal to subsystems. This paper improves the fuzzy reliability allocation method and gives concrete procedures for determining the factor and sub-factor sets, weight sets, judgment set, and multi-stage fuzzy evaluation. To determine the weights of the factor and sub-factor sets, modified trapezoidal fuzzy numbers are proposed to reduce errors caused by subjective factors. To decrease the fuzziness in fuzzy division, an approximation method based on linear programming is employed. To compute the explicit values of fuzzy numbers, the centroid method of defuzzification is used. An example is provided to illustrate the application of the proposed reliability allocation method based on fuzzy arithmetic.
Keywords: Reliability allocation, fuzzy arithmetic, allocation weight.
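For instance, the centroid defuzzification step has a closed form for a trapezoidal fuzzy number with support [a, d] and core [b, c]; the sketch below (ours) computes that explicit value:

def trapezoid_centroid(a, b, c, d):
    # centroid x* of the trapezoidal membership function (a, b, c, d)
    return ((d*d + c*d + c*c) - (a*a + a*b + b*b)) / (3.0 * ((d + c) - (a + b)))

trapezoid_centroid(0, 1, 2, 3)   # 1.5 for a symmetric trapezoid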
3842 Improving Image Segmentation Performance via Edge Preserving Regularization
Authors: Ying-jie Zhang, Li-ling Ge
Abstract:
This paper presents an improved image segmentation model with edge-preserving regularization based on the piecewise-smooth Mumford-Shah functional. A level set formulation is considered for minimizing the Mumford-Shah functional in segmentation, and the corresponding partial differential equations are solved by backward Euler discretization. To encourage edge-preserving regularization, a new edge indicator function is introduced in the level set framework, in which all the grid points used to locate the level set curve are considered so as to avoid blurring the edges, and a nonlinear smoothing constraint function is applied as a regularization term to smooth the image in the isophote direction instead of the gradient direction. In the implementation, strategies such as a new scheme for extending the computation of u+ and u- to the grid points and a speedup of the convergence are studied to improve the efficacy of the algorithm. The resulting algorithm has been implemented, compared with previous methods, and shown to be efficient on several cases.
Keywords: Energy minimization, image segmentation, level sets, edge regularization.
3841 A Study on Optimal Determination of Partial Transmission Ratios of Helical Gearboxes with Second-Step Double Gear-Sets
Authors: Vu Ngoc Pi
Abstract:
In this paper, a study on the application of optimization and regression techniques for the optimal calculation of the partial ratios of helical gearboxes with second-step double gear-sets, for minimal cross-sectional dimension, is introduced. From the moment-equilibrium condition of a mechanical system of three gear units and their regular resistance condition, models for calculating the partial ratios of helical gearboxes with second-step double gear-sets are given. In particular, explicit models for calculating the partial ratios are derived by regression analysis. These models allow the partial ratios to be determined accurately and simply.
Keywords: Gearbox design, optimal design, helical gearbox, transmission ratio.
3840 A Genetic Algorithm for Clustering on Image Data
Authors: Qin Ding, Jim Gasvoda
Abstract:
Clustering is the process of subdividing an input data set into a desired number of subgroups so that members of the same subgroup are similar and members of different subgroups have diverse properties. Many heuristic algorithms have been applied to the clustering problem, which is known to be NP-hard. Genetic algorithms have been used in a wide variety of fields to perform clustering; however, the technique normally has a long running time with respect to input set size. This paper proposes an efficient genetic algorithm for clustering on very large data sets, especially image data sets. The genetic algorithm uses time-efficient techniques along with preprocessing of the input data set. We test our algorithm on both artificial and real image data sets of large size. The experimental results show that our algorithm outperforms the k-means algorithm in terms of running time as well as the quality of the clustering.
Keywords: Clustering, data mining, genetic algorithm, image data.
3839 Heuristic Search Algorithm (HSA) for Enhancing the Lifetime of Wireless Sensor Networks
Authors: Tripatjot S. Panag, J. S. Dhillon
Abstract:
The lifetime of a wireless sensor network can be effectively increased by scheduling operations. Once the sensors are randomly deployed, the task at hand is to find the largest number of disjoint sets of sensors such that every set provides complete coverage of the target area. At any instant, only one of these disjoint sets is switched on, while all others are switched off. This paper proposes a heuristic search method to find the maximum number of disjoint sets that completely cover the region. A population of randomly initialized members is made to explore the solution space, and a set of heuristics is applied to guide the members to a possible solution in their neighborhood. The heuristics accelerate the convergence of the algorithm. The best solution explored by the population is recorded and continuously updated. The proposed algorithm has been tested on applications that require sensing of multiple target points, referred to as point coverage applications. Results show that the proposed algorithm outperforms existing algorithms: it always finds the optimum solution, and with fewer fitness function evaluations than existing approaches.
Keywords: Coverage, disjoint sets, heuristic, lifetime, scheduling, wireless sensor networks, WSN.
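A simple greedy baseline for the same disjoint set cover problem (our sketch for orientation; the proposed HSA instead explores the solution space with a heuristic-guided population):

def disjoint_covers(coverage):
    # coverage: sensor -> set of target points it senses
    targets = set().union(*coverage.values())
    available, covers = dict(coverage), []
    while True:
        chosen, uncovered = [], set(targets)
        for s, cov in sorted(available.items(), key=lambda kv: -len(kv[1])):
            if cov & uncovered:
                chosen.append(s)
                uncovered -= cov
        if uncovered:                 # no complete cover remains
            return covers
        covers.append(chosen)
        for s in chosen:
            del available[s]          # keep the covers disjoint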
3838 Inconsistency Discovery in Multiple State Diagrams
Authors: Mohammad N. Alanazi, David A. Gustafson
Abstract:
In this article, we introduce a new approach to analyzing UML designs to detect inconsistencies between multiple state diagrams and sequence diagrams. Super State Analysis (SSA) identifies inconsistencies in super states, single-step transitions, and sequences. Because SSA considers multiple UML state diagrams, it discovers inconsistencies that cannot be found when considering only a single UML state diagram. We introduce a transition set that captures relationship information not specifiable in UML diagrams; the SSA model uses the transition set to link the transitions of multiple state diagrams together. The analysis automatically generates three different sets, which are compared to the provided sets to detect the inconsistencies. SSA identifies five types of inconsistency: impossible super states, unreachable super states, illegal transitions, missing transitions, and illegal sequences.
Keywords: Modeling Languages, Object-Oriented Analysis, Sequence Diagrams, Software Models, State Diagrams, UML.
3837 Minimizing Mutant Sets by Equivalence and Subsumption
Authors: Samia Alblwi, Amani Ayad
Abstract:
Mutation testing is the art of generating syntactic variations of a base program and checking whether a candidate test suite can identify all the mutants that are not semantically equivalent to the base; this technique can be used to assess the quality of a test suite. One of the main obstacles to the widespread use of mutation testing is cost: even small programs (a few dozen lines of code) can give rise to a large number of mutants (up to hundreds), which has created an incentive to reduce the number of mutants while preserving their collective effectiveness. Two criteria have been used to reduce the size of mutant sets: equivalence, which partitions the set of mutants into equivalence classes modulo semantic equivalence and selects one representative per class; and subsumption, which defines a partial ordering among mutants that ranks them by effectiveness and selects the maximal elements in this ordering. In this paper, we analyze these two policies using analytical and empirical criteria.
Keywords: Mutation testing, mutant sets, mutant equivalence, mutant subsumption, mutant set minimization.
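In kill-matrix terms the two policies are easy to state. The sketch below (our illustration; subsumption conventions vary across papers) collapses mutants with identical kill sets into one representative and then keeps only those whose kill set does not strictly contain another survivor's, i.e., the hardest-to-kill representatives:

def minimize_mutants(kills):
    # kills: mutant -> set of tests that kill it; unkilled mutants
    # (candidate equivalents of the base) are set aside first
    killed = {m: frozenset(ks) for m, ks in kills.items() if ks}
    reps = {}
    for m, ks in killed.items():      # equivalence: one rep per kill set
        reps.setdefault(ks, m)
    return [m for ks, m in reps.items()
            if not any(other < ks for other in reps if other != ks)]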
3836 Lithofacies Classification from Well Log Data Using Neural Networks, Interval Neutrosophic Sets and Quantification of Uncertainty
Authors: Pawalai Kraipeerapun, Chun Che Fung, Kok Wai Wong
Abstract:
This paper proposes a novel approach to lithofacies classification based on an assessment of the uncertainty in the classification results. The proposed approach uses multiple neural networks (NN) and interval neutrosophic sets (INS) to classify the input well log data into multiple classes of lithofacies. A pair of n-class neural networks is used to predict the n degrees of truth membership and n degrees of false membership. Indeterminacy memberships, or uncertainties in the predictions, are estimated using a multidimensional interpolation method. These three memberships form the INS used to support confidence in the results of the multiclass classification. On the experimental data, our approach improves the classification performance compared to an existing technique applied only to the truth membership. In addition, our approach can provide a measure of uncertainty in multiclass classification problems.
Keywords: Multiclass classification, feed-forward backpropagation neural network, interval neutrosophic sets, uncertainty.
3835 Locating Center Points for Radial Basis Function Networks Using Instance Reduction Techniques
Authors: Rana Yousef, Khalil el Hindi
Abstract:
The behavior of Radial Basis Function (RBF) networks greatly depends on how the center points of the basis functions are selected. In this work we investigate the use of instance reduction techniques, originally developed to reduce the storage requirements of instance-based learners, for this purpose. Five instance-based reduction techniques were used to determine the sets of center points, and RBF networks were trained using these sets. The performance of the RBF networks is studied in terms of classification accuracy and training time, and compared with two reference networks: RBF networks that use all instances of the training set as center points (RBF-ALL), which achieve high classification accuracy, and Probabilistic Neural Networks (PNN), which require less training time. Results showed that RBF networks trained using center sets located by noise-filtering techniques (ALLKNN and ENN), rather than pure reduction techniques, produce the best classification accuracy; these networks require less training time than RBF-ALL and achieve higher classification accuracy than PNN. Thus, using ALLKNN and ENN to select center points gives the best combination of classification accuracy and training time. Our experiments also show that using the reduced sets to train the networks is beneficial, especially in the presence of noise in the original training sets.
Keywords: Radial basis function networks, Instance-based reduction, PNN.
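As an illustration of the noise-filtering selectors meant here, a minimal ENN (Wilson editing) pass — our sketch, with k = 3 assumed — keeps only instances whose class agrees with the majority of their k nearest neighbours and returns the survivors as candidate centers:

import numpy as np
from collections import Counter

def enn_centers(X, y, k=3):
    X, y = np.asarray(X), np.asarray(y)
    keep = []
    for i in range(len(X)):
        dist = np.linalg.norm(X - X[i], axis=1)
        nn = np.argsort(dist)[1:k + 1]        # skip the point itself
        majority = Counter(y[nn]).most_common(1)[0][0]
        if majority == y[i]:
            keep.append(i)
    return X[keep], y[keep]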
3834 Evaluating Refactoring with a Quality Index
Authors: Crt Gerlec, Marjan Hericko
Abstract:
The aim of every software product is to achieve an appropriate level of software quality. Developers and designers try to produce readable, reliable, maintainable, reusable and testable code, and several approaches have been proposed to help achieve these goals. In this paper, the refactoring technique was used to evaluate software quality with a quality index composed of different metric sets, each describing a different quality aspect.
Keywords: Refactoring, Software Metrics, Software Quality, Quality Index, Agile methodologies.
3833 Modeling the Symptom-Disease Relationship by Using Rough Set Theory and Formal Concept Analysis
Authors: Mert Bal, Hayri Sever, Oya Kalıpsız
Abstract:
Medical Decision Support Systems (MDSSs) are sophisticated, intelligent systems that can provide inference in the face of missing information and uncertainty. To model the uncertainty in such systems, various soft computing methods are used, including Bayesian networks, rough sets, artificial neural networks, fuzzy logic, inductive logic programming and genetic algorithms, as well as hybrids formed by combining these methods. In this study, symptom-disease relationships are represented by a framework modeled with formal concept analysis and rough set theory, with diseases as objects and symptoms as attributes. After a concept lattice is formed, Bayes' theorem can be used to determine the relationships between attributes and objects. A discernibility relation, which forms the basis of rough sets, can be applied to the attribute data sets in order to reduce attributes and decrease the complexity of the computation.
Keywords: Formal Concept Analysis, Rough Set Theory, Granular Computing, Medical Decision Support System.
3832 Frequent Itemset Mining Using Rough-Sets
Authors: Usman Qamar, Younus Javed
Abstract:
Frequent pattern mining is the process of finding a pattern (a set of items, subsequences, substructures, etc.) that occurs frequently in a data set. It was proposed in the context of frequent itemsets and association rule mining, and is used to find inherent regularities in data (e.g., what products are often purchased together). Its applications include basket data analysis, cross-marketing, catalog design, sale campaign analysis, Web log (click stream) analysis, and DNA sequence analysis. However, one of the bottlenecks of frequent itemset mining is that as the data grow, the time and resources required to mine them increase at an exponential rate. In this investigation a new algorithm is proposed which can be used as a pre-processor for frequent itemset mining. FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid pre-processing algorithm which utilizes entropy and rough sets to carry out record reduction and feature (attribute) selection, respectively. As a pre-processor for frequent itemset mining, FASTER can produce a speedup of 3.1 times over the original algorithm while maintaining an accuracy of 71%.
Keywords: Rough-sets, Classification, Feature Selection, Entropy, Outliers, Frequent itemset mining.
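The rough-set half of FASTER is not reproduced here, but the entropy half can be sketched as a plain information-gain ranking over discrete attributes (our illustrative reading of "feature selection using entropy"):

import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(col, y):
    h = entropy(y)
    for v in np.unique(col):
        mask = col == v
        h -= mask.mean() * entropy(y[mask])
    return h

def top_m_features(X, y, m):
    gains = [information_gain(X[:, j], y) for j in range(X.shape[1])]
    return np.argsort(gains)[::-1][:m]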
3831 Determinate Fuzzy Set Ranking Analysis for Combat Aircraft Selection with Multiple Criteria Group Decision Making
Authors: C. Ardil
Abstract:
Using the Hausdorff distance function and the Minkowski distance function, this study proposes a novel method for selecting combat aircraft for the Air Force. To this end, the proximity measure method was developed with determinate fuzzy degrees based on the relationship between attributes and combat aircraft alternatives. The combat aircraft selection attributes were identified as payloadability, maneuverability, speedability, stealthability, and survivability. Determinate fuzzy data for the combat aircraft attributes were then aggregated using the determinate fuzzy weighted arithmetic average operator. Correlation analysis of the ranking order patterns of the options was also examined. A numerical example from military aviation is used to demonstrate the applicability and effectiveness of the proposed method.
Keywords: Combat aircraft selection, multiple criteria decision making, fuzzy sets, determinate fuzzy sets, intuitionistic fuzzy sets, proximity measure method, Hausdorff distance function, Minkowski distance function, PMM, MCDM
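One named ingredient is simple to state concretely: for two interval-valued assessments, the Hausdorff distance reduces to the larger of the endpoint gaps. A small helper in that spirit (ours):

def hausdorff_interval(a, b):
    # a, b: (lower, upper) membership intervals
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))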
3830 A Family of Entropies on Interval-valued Intuitionistic Fuzzy Sets and Their Applications in Multiple Attribute Decision Making
Abstract:
The entropy of an intuitionistic fuzzy set is used to indicate the degree of fuzziness of an interval-valued intuitionistic fuzzy set (IvIFS). In this paper, we deal with the entropies of IvIFSs. First, we propose a family of entropies on IvIFSs with a parameter λ ∈ [0, 1], which generalizes two entropy measures defined independently by Zhang and by Wei for IvIFSs, and we prove that the new entropy is an increasing function with respect to the parameter λ. Furthermore, a new multiple attribute decision making (MADM) method using entropy-based attribute weights is proposed to deal with decision making situations where the alternatives' values on the attributes are expressed by IvIFSs and the attribute weight information is unknown. Finally, a numerical example is given to illustrate the applications of the proposed method.
Keywords: Interval-valued intuitionistic fuzzy sets, interval-valued intuitionistic fuzzy entropy, multiple attribute decision making
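The entropy-based weighting follows a familiar pattern: attributes whose assessments are fuzzier carry less weight. A schematic version (ours; E holds the per-alternative, per-attribute entropy values in [0, 1]):

import numpy as np

def entropy_weights(E):
    e = E.mean(axis=0)               # average fuzziness of each attribute
    return (1 - e) / (1 - e).sum()   # sharper attributes weigh more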
3829 Proposing an Efficient Method for Frequent Pattern Mining
Authors: Vaibhav Kant Singh, Vijay Shah, Yogendra Kumar Jain, Anupam Shukla, A.S. Thoke, Vinay Kumar Singh, Chhaya Dule, Vivek Parganiha
Abstract:
Data mining is the exploration of knowledge from the large sets of data generated as a result of various data processing activities. Frequent pattern mining is a very important task in data mining. Previous approaches to generating frequent sets generally adopt candidate generation and pruning techniques. This paper shows how different approaches achieve the objective of frequent mining, along with the complexities required to perform the job, and also considers a hardware approach based on cache coherence to improve the efficiency of the process. Data mining is helpful in building support systems for management, bioinformatics, biotechnology, medical science, statistics, mathematics, banking, networking and other computer-related applications. This paper proposes the use of both the upward and the downward closure property for the extraction of frequent item sets, which reduces the total number of scans required for the generation of candidate sets.
Keywords: Data Mining, Candidate Sets, Frequent Itemset, Pruning.
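The downward closure (Apriori) property invoked above — every subset of a frequent itemset is itself frequent — is what licenses candidate pruning. A compact reference sketch (ours, not the proposed method):

from itertools import combinations

def apriori(transactions, minsup):
    # transactions: list of frozensets of items
    items = {i for t in transactions for i in t}
    freq, level = {}, {frozenset([i]) for i in items}
    while level:
        counts = {c: sum(1 for t in transactions if c <= t) for c in level}
        survivors = {c: n for c, n in counts.items() if n >= minsup}
        freq.update(survivors)
        keys = list(survivors)
        level = {a | b for a in keys for b in keys
                 if len(a | b) == len(a) + 1
                 and all(frozenset(s) in survivors       # downward closure
                         for s in combinations(a | b, len(a)))}
    return freq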
3828 Mathematical Modeling to Predict Surface Roughness in CNC Milling
Authors: Ab. Rashid M.F.F., Gan S.Y., Muhammad N.Y.
Abstract:
Surface roughness (Ra) is one of the most important requirements in the machining process. In order to obtain better surface roughness, the proper setting of cutting parameters is crucial before the process takes place. This research presents the development of a mathematical model for surface roughness prediction before milling, in order to evaluate the fitness of the machining parameters: spindle speed, feed rate and depth of cut. 84 samples were run in this study using a FANUC CNC milling machine (α-T14iE). The samples were randomly divided into two data sets: the training set (m = 60) and the testing set (m = 24). ANOVA analysis showed that at least one of the population regression coefficients was not zero. The multiple regression method was used to determine the correlation between a criterion variable and a combination of predictor variables, and it was established that the surface roughness is most influenced by the feed rate. Using the multiple regression equation, the average percentage deviation was 9.8% for the testing set and 9.7% for the training set, i.e., the statistical model predicts the surface roughness with about 90.2% accuracy on the testing data set and 90.3% accuracy on the training data set.
Keywords: Surface roughness, regression analysis.
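A minimal sketch of the fitting step (ours; variable names assumed), ordinary least squares for Ra = b0 + b1·speed + b2·feed + b3·depth:

import numpy as np

def fit_roughness_model(speed, feed, depth, ra):
    # columns: intercept, spindle speed, feed rate, depth of cut
    X = np.column_stack([np.ones(len(speed)), speed, feed, depth])
    coef, *_ = np.linalg.lstsq(X, ra, rcond=None)
    return coef                       # b0, b1, b2, b3

def predict_ra(coef, speed, feed, depth):
    return coef[0] + coef[1] * speed + coef[2] * feed + coef[3] * depth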