Search results for: Candidate Sets
788 Web Log Mining by an Improved AprioriAll Algorithm
Authors: Wang Tong, He Pi-lian
Abstract:
This paper sets forth the possibility and importance of applying data mining to Web log mining and points out some problems of conventional search engines. It then offers an improved algorithm based on the original AprioriAll algorithm, which has been widely used in Web log mining. The new algorithm adds the User ID property at every step of producing the candidate set and every step of scanning the database, and uses it to decide whether an item in the candidate set should be put into the large set that will be used to produce the next candidate set. Meanwhile, in order to reduce the number of database scans, the new algorithm uses the property of the Apriori algorithm to limit the size of the candidate set as soon as it is produced. Test results show that the improved algorithm has lower time and space complexity, restrains noise better, and fits the capacity of memory.
Keywords: Candidate Sets Pruning, Data Mining, Improved Algorithm, Noise Restrain, Web Log
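Below is a minimal sketch of the level-wise candidate generation and pruning described in this abstract, simplified to unordered page sets rather than AprioriAll's full sequential patterns. The session layout, function names and support threshold are illustrative assumptions, not taken from the paper; support is counted per User ID, echoing the paper's use of the User ID property, and each candidate is pruned immediately via the Apriori property.

```python
from itertools import combinations
from collections import defaultdict

# Illustrative AprioriAll-style pass: user sessions are lists of page IDs,
# and support is counted per User ID rather than per raw transaction.
def large_itemsets(sessions_by_user, min_support):
    """sessions_by_user: {user_id: [page, page, ...]}"""
    counts = defaultdict(set)
    for user, pages in sessions_by_user.items():
        for p in set(pages):
            counts[frozenset([p])].add(user)
    large = {c for c, users in counts.items() if len(users) >= min_support}
    k = 2
    while large:
        yield large
        # candidate generation: join large (k-1)-sets, then Apriori pruning
        candidates = set()
        for a in large:
            for b in large:
                c = a | b
                if len(c) == k and all(frozenset(s) in large
                                       for s in combinations(c, k - 1)):
                    candidates.add(c)
        # one database scan per level, counting distinct User IDs
        counts = defaultdict(set)
        for user, pages in sessions_by_user.items():
            page_set = set(pages)
            for c in candidates:
                if c <= page_set:
                    counts[c].add(user)
        large = {c for c, users in counts.items() if len(users) >= min_support}
        k += 1
```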
787 Proposing an Efficient Method for Frequent Pattern Mining
Authors: Vaibhav Kant Singh, Vijay Shah, Yogendra Kumar Jain, Anupam Shukla, A.S. Thoke, Vinay Kumar Singh, Chhaya Dule, Vivek Parganiha
Abstract:
Data mining is the exploration of knowledge from the large sets of data generated as a result of various data processing activities. Frequent pattern mining is a very important task in data mining. The previous approaches applied to generate frequent sets generally adopt candidate generation and pruning techniques to satisfy the desired objective. This paper shows how the different approaches achieve the objective of frequent mining along with the complexities required to perform the job. This paper also looks at a hardware approach based on cache coherence to improve the efficiency of the above process. The process of data mining is helpful in the generation of support systems that can help in management, bioinformatics, biotechnology, medical science, statistics, mathematics, banking, networking and other computer-related applications. This paper proposes the use of both the upward and downward closure properties for the extraction of frequent item sets, which reduces the total number of scans required for the generation of candidate sets.
Keywords: Data Mining, Candidate Sets, Frequent Item Set, Pruning.
786 Structure of Covering-based Rough Sets
Authors: Shiping Wang, Peiyong Zhu, William Zhu
Abstract:
Rough set theory is a very effective tool to deal with granularity and vagueness in information systems. Covering-based rough set theory is an extension of classical rough set theory. In this paper, we first present the characteristics of the reducible element and the minimal description in covering-based rough sets through down-sets. We then establish lattices and topological spaces in covering-based rough sets through down-sets and up-sets. In this way, one can investigate covering-based rough sets from algebraic and topological points of view.
Keywords: Covering, poset, down-set, lattice, topological space, topological base.
785 Automatic Microaneurysm Quantification for Diabetic Retinopathy Screening
Authors: A. Sopharak, B. Uyyanonvara, S. Barman
Abstract:
Microaneurysm is a key indicator of diabetic retinopathy that can potentially cause damage to the retina. Early detection and automatic quantification are the keys to preventing further damage. In this paper, which focuses on automatic microaneurysm detection in images acquired through non-dilated pupils, we present a series of experiments on feature selection and automatic microaneurysm pixel classification. We found that the best feature set is a combination of 10 features: the pixel's intensity in the shade-corrected image, the pixel hue, the standard deviation of the shade-corrected image, DoG4, the area of the candidate MA, the perimeter of the candidate MA, the eccentricity of the candidate MA, the circularity of the candidate MA, the mean intensity of the candidate MA on the shade-corrected image, and the ratio of the major axis length to the minor axis length of the candidate MA. The overall sensitivity, specificity, precision, and accuracy are 84.82%, 99.99%, 89.01%, and 99.99%, respectively.
Keywords: Diabetic retinopathy, microaneurysm, naive Bayes classifier
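A minimal sketch of the classification step suggested by the abstract: a naive Bayes classifier applied to a 10-dimensional feature vector per candidate pixel. The feature matrix here is random placeholder data, only its shape mirrors the feature list above, and scikit-learn's GaussianNB is an assumed stand-in for the authors' classifier.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# Hypothetical feature matrix: one row per candidate pixel, columns roughly
# matching the 10 features listed in the abstract (intensity, hue, local
# standard deviation, DoG response, area, perimeter, eccentricity,
# circularity, mean region intensity, axis-length ratio).
rng = np.random.default_rng(0)
X_train = rng.random((500, 10))      # placeholder training features
y_train = rng.integers(0, 2, 500)    # 1 = microaneurysm pixel, 0 = background

clf = GaussianNB().fit(X_train, y_train)

X_new = rng.random((5, 10))          # features of new candidate pixels
print(clf.predict(X_new))            # predicted class per candidate
```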
784 Regular Generalized Star Star closed sets in Bitopological Spaces
Authors: K. Kannan, D. Narasimhan, K. Chandrasekhara Rao, R. Ravikumar
Abstract:
The aim of this paper is to introduce the concepts of τ1τ2-regular generalized star star closed sets and τ1τ2-regular generalized star star open sets and to study their basic properties in bitopological spaces.
Keywords: τ1τ2-regular closed sets, τ1τ2-regular open sets, τ1τ2-regular generalized closed sets, τ1τ2-regular generalized star closed sets, τ1τ2-regular generalized star star closed sets.
783 Covering-based Rough sets Based on the Refinement of Covering-element
Authors: Jianguo Tang, Kun She, William Zhu
Abstract:
Covering-based rough sets are an extension of rough sets, based on a covering instead of a partition of the universe. They are therefore more powerful than rough sets in describing some practical problems. However, by extending rough sets, covering-based rough sets can increase the roughness of each model in recognizing objects. How to obtain better approximations from the models of covering-based rough sets is an important issue. In this paper, two concepts, determinate elements and indeterminate elements in a universe, are proposed and given precise definitions. This research makes a reasonable refinement of the covering-element from a new viewpoint, and the refinement may generate better approximations in covering-based rough set models. To prove the theory above, it is applied to eight major covering-based rough set models adapted from other literature. The result is that, in all these models, the lower approximation increases effectively. Correspondingly, in all models, the upper approximation decreases, with the exception of two models in some special situations. Therefore, the roughness in recognizing objects is reduced. This research provides a new approach to the study and application of covering-based rough sets.
Keywords: Determinate element, indeterminate element, refinement of covering-element, refinement of covering, covering-based rough sets.
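For readers unfamiliar with covering approximations, here is a minimal sketch of one common pair of covering approximation operators. The paper studies eight such models and a refinement of the covering elements, none of which is reproduced here; the covering and the target set are toy examples.

```python
# One simple pair of covering approximation operators: the lower
# approximation is the union of covering blocks contained in X, the upper
# approximation is the union of blocks that intersect X.
def lower_approximation(covering, X):
    return {x for K in covering if K <= X for x in K}

def upper_approximation(covering, X):
    return {x for K in covering if K & X for x in K}

universe_cover = [frozenset({1, 2}), frozenset({2, 3}), frozenset({3, 4, 5})]
X = {1, 2, 3}
print(lower_approximation(universe_cover, X))  # {1, 2, 3}
print(upper_approximation(universe_cover, X))  # {1, 2, 3, 4, 5}
```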
782 On Fuzzy Weakly-Closed Sets
Authors: J. Mahanta, P.K. Das
Abstract:
A new class of fuzzy closed sets, namely the fuzzy weakly closed set in a fuzzy topological space, is introduced, and it is established that this class of fuzzy closed sets lies between fuzzy closed sets and fuzzy generalized closed sets. Along with the study of fundamental results on such closed sets, we define and characterize the fuzzy weakly compact space and the fuzzy weakly closed space.
Keywords: Fuzzy weakly-closed set, fuzzy weakly-closed space, fuzzy weakly-compactness, MSC: 54A40, 54D30.
781 Minimizing Mutant Sets by Equivalence and Subsumption
Authors: Samia Alblwi, Amani Ayad
Abstract:
Mutation testing is the art of generating syntactic variations of a base program and checking whether a candidate test suite can identify all the mutants that are not semantically equivalent to the base; this technique can be used to assess the quality of a test suite. One of the main obstacles to the widespread use of mutation testing is cost, as even small programs (a few dozen lines of code) can give rise to a large number of mutants (up to hundreds); this has created an incentive to reduce the number of mutants while preserving their collective effectiveness. Two criteria have been used to reduce the size of mutant sets: equivalence, which aims to partition the set of mutants into equivalence classes modulo semantic equivalence and select one representative per class; and subsumption, which aims to define a partial ordering among mutants that ranks them by effectiveness and to select the maximal elements in this ordering. In this paper, we analyze these two policies using analytical and empirical criteria.
Keywords: Mutation testing, mutant sets, mutant equivalence, mutant subsumption, mutant set minimization.
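A small sketch of the two reduction policies named in the abstract, using the usual test-based approximation of equivalence and subsumption via a kill matrix (mutant mapped to the set of tests that kill it). The mutants, tests and the strict-subset criterion are illustrative assumptions, not the paper's exact analytical setup.

```python
from collections import defaultdict

# Kill matrix: mutant -> tests that kill it. True semantic equivalence and
# subsumption are undecidable in general; this is the common test-based
# approximation.
kill = {
    "m1": frozenset({"t1"}),
    "m2": frozenset({"t2"}),
    "m3": frozenset({"t1", "t2"}),  # every test killing m1 or m2 kills m3
    "m4": frozenset({"t1"}),        # same kill set as m1 -> same class
}

# Policy 1: equivalence -- one representative per identical kill set.
classes = defaultdict(list)
for m, tests in kill.items():
    classes[tests].append(m)
representatives = [ms[0] for ms in classes.values()]

# Policy 2: subsumption -- keep the hardest-to-kill mutants, i.e. those
# whose kill set does not strictly contain another mutant's kill set.
kept_by_subsumption = [m for m, t in kill.items()
                       if not any(other != t and other < t
                                  for other in kill.values())]

print(representatives)        # ['m1', 'm2', 'm3']
print(kept_by_subsumption)    # ['m1', 'm2', 'm4']  (m3 is subsumed)
```

In practice the two policies are combined: the subsumption step above still keeps both m1 and m4, which the equivalence step would then collapse into one representative.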
780 (T1, T2)*- Semi Star Generalized Locally Closed Sets
Authors: M. Sundararaman, K. Chandrasekhara Rao
Abstract:
The aim of this paper is to continue the study of (T1, T2)-semi star generalized closed sets by introducing the concepts of (T1, T2)-semi star generalized locally closed sets and study their basic properties in bitopological spaces.
Keywords: (T1, T2)*-semi star generalized locally closed sets, T1T2-semi star generalized closed sets.
779 A New Condition for Conflicting Bifuzzy Sets Based On Intuitionistic Evaluation
Authors: Imran C.T., Syibrah M.N., Mohd Lazim A.
Abstract:
Fuzzy set theory affirms that the linguistic value for every contrary relation is complementary. Intuitionistic fuzzy sets (IFS) stress the condition that the fuzzy values of contrary relations cannot sum to more than one. However, complementarity between two contradictory phenomena does not always hold. This paper proposes a new condition for conflicting bifuzzy sets by relaxing the condition of intuitionistic fuzzy sets. We present examples using triangular fuzzy numbers to formulate the new condition for conflicting bifuzzy sets (CBFS). Positive and negative evaluations of conflicting phenomena are calculated concurrently by relaxing the condition in IFS. A hypothetical illustration shows the applicability of the new condition in CBFS for solving non-complementary contrary intuitionistic evaluations. This approach can be applied to any decision making in which conflict very much exists.
Keywords: Conflicting bifuzzy set, conflicting degree, fuzzy sets, fuzzy numbers.
778 Fuzzy Multiple Criteria Decision Making for Unmanned Combat Aircraft Selection Using Proximity Measure Method
Authors: C. Ardil
Abstract:
Intuitionistic fuzzy sets (IFS), Pythagorean fuzzy sets (PyFS), picture fuzzy sets (PFS), q-rung orthopair fuzzy sets (q-ROF), spherical fuzzy sets, T-spherical fuzzy sets, and neutrosophic sets (NS) are reviewed as multidimensional extensions of fuzzy sets that describe the opinions of decision-making experts under uncertainty more explicitly and informatively. To handle operations with standard fuzzy sets (SFS), the necessary operators, namely the weighted arithmetic mean (WAM), the weighted geometric mean (WGM), and the Minkowski distance function, are defined. The algorithm of the proposed proximity measure method (PMM) is provided as a multiple criteria group decision making (MCDM) method for use in a standard fuzzy set environment. To demonstrate the feasibility of the proposed method, the problem of selecting the best drone for an Air Force procurement request is used. The proximity measure method (PMM) based on multidimensional standard fuzzy sets (SFS) is thereby demonstrated on an unmanned combat aircraft selection problem.
Keywords: standard fuzzy sets (SFS), unmanned combat aircraft selection, multiple criteria decision making (MCDM), proximity measure method (PMM).
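A brief sketch of the three operators the abstract names (WAM, WGM, and the Minkowski distance), as they are conventionally defined. The scores, weights and ideal point below are placeholders, and the full PMM aggregation over standard fuzzy sets is not reproduced.

```python
import numpy as np

def weighted_arithmetic_mean(values, weights):
    w = np.asarray(weights, dtype=float)
    return float(np.dot(np.asarray(values, dtype=float), w) / w.sum())

def weighted_geometric_mean(values, weights):
    w = np.asarray(weights, dtype=float) / np.sum(weights)
    return float(np.prod(np.asarray(values, dtype=float) ** w))

def minkowski_distance(a, b, p=2):
    return float(np.sum(np.abs(np.asarray(a) - np.asarray(b)) ** p) ** (1.0 / p))

scores  = [0.7, 0.4, 0.9]        # criterion scores for one alternative
weights = [0.5, 0.2, 0.3]
ideal   = [1.0, 1.0, 1.0]        # positive ideal solution

print(weighted_arithmetic_mean(scores, weights))
print(weighted_geometric_mean(scores, weights))
print(minkowski_distance(scores, ideal, p=2))    # proximity to the ideal
```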
777 Investigations on Some Operations of Soft Sets
Authors: Xun Ge, Songlin Yang
Abstract:
Soft set theory was initiated by Molodtsov in 1999. In the past years, this theory has been applied to many branches of mathematics, information science and computer science. In 2003, Maji et al. introduced some operations of soft sets and gave some operational rules. Recently, some of these operational rules have been pointed out to be untrue. Furthermore, Ali et al., in their paper, introduced and discussed some new operations of soft sets. In this paper, we further investigate the operational rules given by Maji et al. and Ali et al. We obtain some sufficient and necessary conditions under which the corresponding operational rules hold and give correct forms for some operational rules. These results will help us to use the operational rules of soft sets correctly in research and applications of soft set theory.
Keywords: Soft sets, union, intersection, complement.
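To make the discussion concrete, here is a toy model of soft sets as parameter-to-subset maps with one commonly used pair of union and intersection definitions. Since the paper's point is precisely that several competing operational rules exist, these particular definitions are an illustrative choice, not the corrected rules derived in the paper.

```python
# A soft set over a universe U is modelled as a dict mapping each parameter
# to the subset of U it approximates. Below: extended union and restricted
# intersection (one common convention among several).
def soft_union(F, G):
    return {e: F.get(e, set()) | G.get(e, set()) for e in F.keys() | G.keys()}

def soft_intersection(F, G):
    return {e: F[e] & G[e] for e in F.keys() & G.keys()}

F = {"cheap": {"h1", "h2"}, "wooden": {"h2"}}
G = {"cheap": {"h2", "h3"}, "modern": {"h4"}}
print(soft_union(F, G))         # parameter-wise union over all parameters
print(soft_intersection(F, G))  # {'cheap': {'h2'}}
```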
776 A Fuzzy Nonlinear Regression Model for Interval Type-2 Fuzzy Sets
Authors: O. Poleshchuk, E. Komarov
Abstract:
This paper presents a regression model for interval type-2 fuzzy sets based on the least squares estimation technique. Unknown coefficients are assumed to be triangular fuzzy numbers. The basic idea is to determine aggregation intervals for type-1 fuzzy sets whose membership functions are the lower and upper membership functions of an interval type-2 fuzzy set. These aggregation intervals are called weighted intervals. The lower and upper membership functions of the input and output interval type-2 fuzzy sets for the developed regression models are considered as piecewise linear functions.
Keywords: Interval type-2 fuzzy sets, fuzzy regression, weighted interval.
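A very rough illustration of the least-squares machinery the model is built on, fitted separately to the lower and upper bounds of interval-valued data with made-up numbers. The paper's weighted-interval construction and triangular fuzzy coefficients are not reproduced here.

```python
import numpy as np

# Fit a line to the lower and upper bounds of interval-valued observations.
x = np.linspace(0, 1, 20)
lower = 2.0 * x + 0.5 + np.random.default_rng(1).normal(0, 0.05, 20)
upper = lower + 0.3

A = np.column_stack([x, np.ones_like(x)])        # design matrix for a line
coef_lower, *_ = np.linalg.lstsq(A, lower, rcond=None)
coef_upper, *_ = np.linalg.lstsq(A, upper, rcond=None)
print(coef_lower, coef_upper)                    # slope and intercept of each bound
```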
775 Generation of Sets of Synthetic Classifiers for the Evaluation of Abstract-Level Combination Methods
Authors: N. Greco, S. Impedovo, R. Modugno, G. Pirlo
Abstract:
This paper presents a new technique for generating sets of synthetic classifiers to evaluate abstract-level combination methods. The sets differ in terms of both the recognition rates of the individual classifiers and their degree of similarity. For this purpose, each abstract-level classifier is considered as a random variable producing one class label as the output for an input pattern. From the initial set of classifiers, new, slightly different sets are generated by applying specific operators defined for this purpose. Finally, the sets of synthetic classifiers have been used to estimate the performance of combination methods for abstract-level classifiers. The experimental results demonstrate the effectiveness of the proposed approach.
Keywords: Abstract-level Classifier, Dempster-Shafer Rule, Multi-expert Systems, Similarity Index, System Evaluation
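A sketch of an abstract-level synthetic classifier in the sense used above: a random variable that outputs the true label with probability equal to its recognition rate and a random wrong label otherwise. The perturbation operators that generate sets of similar classifiers are not reproduced; all numbers are placeholders.

```python
import numpy as np

def synthetic_classifier(true_labels, recognition_rate, n_classes, rng):
    out = np.array(true_labels)
    wrong = rng.random(len(out)) >= recognition_rate
    # replace wrongly classified patterns with a random different label
    out[wrong] = (out[wrong] + rng.integers(1, n_classes, wrong.sum())) % n_classes
    return out

rng = np.random.default_rng(42)
truth = rng.integers(0, 10, 1000)                   # 10-class problem
c1 = synthetic_classifier(truth, 0.90, 10, rng)
c2 = synthetic_classifier(truth, 0.75, 10, rng)
print((c1 == truth).mean(), (c2 == truth).mean())   # empirical recognition rates
```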
774 Minimal Critical Sets of Inertias for Irreducible Zero-nonzero Patterns of Order 3
Authors: Ber-Lin Yu, Ting-Zhu Huang
Abstract:
If there exists a nonempty, proper subset S of the set of all (n + 1)(n + 2)/2 inertias such that S ⊆ i(A) is sufficient for any n × n zero-nonzero pattern A to be inertially arbitrary, then S is called a critical set of inertias for zero-nonzero patterns of order n. If no proper subset of S is a critical set, then S is called a minimal critical set of inertias. In [3], Kim, Olesky and Driessche identified all minimal critical sets of inertias for 2 × 2 zero-nonzero patterns. Identifying all minimal critical sets of inertias for n × n zero-nonzero patterns with n ≥ 3 is posed as an open question in [3]. In this paper, all minimal critical sets of inertias for 3 × 3 zero-nonzero patterns are identified. It is shown that the sets {(0, 0, 3), (3, 0, 0)}, {(0, 0, 3), (0, 3, 0)}, {(0, 0, 3), (0, 1, 2)}, {(0, 0, 3), (1, 0, 2)}, {(0, 0, 3), (2, 0, 1)} and {(0, 0, 3), (0, 2, 1)} are the only minimal critical sets of inertias for 3 × 3 irreducible zero-nonzero patterns.
Keywords: Permutation digraph, zero-nonzero pattern, irreducible pattern, critical set of inertias, inertially arbitrary.
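As a quick check of the count used in the abstract, the following snippet enumerates all possible inertias (n+, n-, n0) for n = 3 and confirms there are (n + 1)(n + 2)/2 = 10 of them.

```python
from itertools import product

# All inertias of a 3 x 3 matrix: triples of non-negative integers summing to 3.
n = 3
inertias = [(p, m, z) for p, m, z in product(range(n + 1), repeat=3)
            if p + m + z == n]
print(len(inertias))   # 10 = (n + 1)(n + 2) / 2
print(inertias)
```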
773 Some Equalities Connected with Fuzzy Soft Matrices
Authors: D. R. Jain
Abstract:
The aim of this paper is to use the matrix representation of fuzzy soft sets to prove some equalities connected with fuzzy soft sets based on set operations.
Keywords: Equality, Fuzzy soft matrix, Fuzzy soft sets, operations.
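A small sketch of the matrix representation mentioned in the abstract: a fuzzy soft set as a matrix of membership grades, with union, intersection and complement taken element-wise under the standard max/min/1-x conventions. The matrices are illustrative.

```python
import numpy as np

# Rows index objects of the universe, columns index parameters, entries are
# membership grades in [0, 1].
A = np.array([[0.2, 0.7],
              [0.9, 0.4]])
B = np.array([[0.5, 0.6],
              [0.3, 0.8]])

union        = np.maximum(A, B)   # element-wise max
intersection = np.minimum(A, B)   # element-wise min
complement_A = 1.0 - A            # standard fuzzy complement

print(union)
print(intersection)
print(complement_A)
```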
772 A New Similarity Measure on Intuitionistic Fuzzy Sets
Authors: Binyamin Yusoff, Imran Taib, Lazim Abdullah, Abd Fatah Wahab
Abstract:
Intuitionistic fuzzy sets, as proposed by Atanassov, have gained much attention from past and present researchers for applications in various fields. Similarity measures between intuitionistic fuzzy sets were developed afterwards. However, they do not cater for the conflicting behavior of each element evaluated. We therefore make some modifications to the similarity measure of IFS by considering the concept of conflict in the model. In this paper, we concentrate on Zhang and Fu's similarity measures for IFSs, and some examples are given to validate these similarity measures. A simple modification to Zhang and Fu's similarity measures of IFSs is proposed to find the best result according to the use of the degree of indeterminacy. Finally, we conclude with an application to real decision-making problems.
Keywords: Intuitionistic fuzzy sets, similarity measures, multicriteria decision making.
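For orientation, here is a simple distance-based similarity measure between two intuitionistic fuzzy sets given by membership and non-membership vectors. It is a generic textbook measure, not Zhang and Fu's formula or the modification proposed in the paper, and the vectors are made up.

```python
import numpy as np

def ifs_similarity(mu_a, nu_a, mu_b, nu_b):
    # 1 minus the mean absolute deviation of membership and non-membership
    mu_a, nu_a, mu_b, nu_b = map(np.asarray, (mu_a, nu_a, mu_b, nu_b))
    return 1.0 - np.mean(np.abs(mu_a - mu_b) + np.abs(nu_a - nu_b)) / 2.0

A_mu, A_nu = [0.6, 0.3, 0.8], [0.2, 0.5, 0.1]   # hesitation = 1 - mu - nu
B_mu, B_nu = [0.5, 0.4, 0.7], [0.3, 0.4, 0.2]
print(ifs_similarity(A_mu, A_nu, B_mu, B_nu))    # value in [0, 1]
```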
771 On Phase Based Stereo Matching and Its Related Issues
Authors: András Rövid, Takeshi Hashimoto
Abstract:
The paper focuses on the problem of point correspondence matching in stereo images. The proposed matching algorithm is based on the combination of simpler methods, such as the normalized sum of squared differences (NSSD), and a more complex phase correlation based approach, while also considering noise and other factors. The speed of NSSD and the preciseness of phase correlation together yield an efficient approach to finding the best candidate point with sub-pixel accuracy in stereo image pairs. The task of the NSSD in this case is to locate the candidate pixel roughly. Afterwards, the location of the candidate is refined by an enhanced phase correlation based method which, in contrast to the NSSD, has to run only once for each selected pixel.
Keywords: Stereo matching, Sub-pixel accuracy, phase correlation, SVD, NSSD.
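A sketch of the coarse NSSD stage described above: slide a reference patch along the corresponding scanline of the second image and keep the offset with the smallest zero-mean normalized SSD. The phase-correlation refinement and sub-pixel step are not shown, and the images, patch size and normalization variant are assumptions.

```python
import numpy as np

def nssd(patch_a, patch_b):
    # zero-mean, unit-variance normalization before the SSD
    a = (patch_a - patch_a.mean()) / (patch_a.std() + 1e-9)
    b = (patch_b - patch_b.mean()) / (patch_b.std() + 1e-9)
    return float(np.sum((a - b) ** 2))

def best_match(left, right, row, col, half=5):
    ref = left[row - half:row + half + 1, col - half:col + half + 1]
    scores = []
    for c in range(half, right.shape[1] - half):
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        scores.append((nssd(ref, cand), c))
    return min(scores)[1]          # column of the best (lowest NSSD) candidate

rng = np.random.default_rng(0)
left = rng.random((50, 80))
right = np.roll(left, 3, axis=1)   # synthetic 3-pixel horizontal shift
print(best_match(left, right, row=25, col=40))   # expected 43
```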
770 Cantor Interpolating Spline to Design Electronic Mail Boxes
Authors: Adil Al-Rammahi
Abstract:
Electronic mail is very important at the present time. Many researchers work on designing, improving, securing, speeding up and otherwise enhancing electronic mail. This paper introduces a new algorithm that uses Cantor sets and a cubic spline interpolating function in electronic mail design. Cantor sets are used as the area (or domain) of the mail, while the spline function is used as the design formula. The roots of the spline function over the Cantor sets are used as the administrative controller. The roots are calculated by the numerical Newton-Raphson method. The results of this algorithm are promising.
Keywords: Cantor sets, spline, electronic mail design, Newton – Raphson's method.
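A minimal Newton-Raphson iteration of the kind the abstract relies on, applied to an arbitrary cubic as a stand-in for a cubic spline piece; the mail-design spline and the Cantor-set domain are not reproduced.

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    # iterate x <- x - f(x)/f'(x) until the step is smaller than tol
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

f  = lambda x: x**3 - 2 * x - 5          # example cubic
df = lambda x: 3 * x**2 - 2
print(newton_raphson(f, df, x0=2.0))     # approx. 2.0945514815
```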
769 Meta Random Forests
Authors: Praveen Boinee, Alessandro De Angelis, Gian Luca Foresti
Abstract:
Leo Breiman's Random Forests (RF) is a recent development in tree-based classifiers and has quickly proven to be one of the most important algorithms in the machine learning literature. It has shown robust and improved classification results on standard data sets. Ensemble learning algorithms such as AdaBoost and Bagging have been in active research and have shown improvements in classification results for several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we experiment with applying these meta-learning techniques to random forests. We study the behaviour of ensembles of random forests on the standard data sets available in the UCI repository. We compare the original random forest algorithm with its ensemble counterparts and discuss the results.
Keywords: Random Forests (RF), ensembles, UCI.
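A sketch of the "ensemble of random forests" idea: several forests trained on bootstrap samples and combined by majority vote. scikit-learn and the iris data set are convenient stand-ins; the paper's AdaBoost variant and the specific UCI sets are not reproduced.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
forests = []
for seed in range(5):                              # five bagged forests
    idx = rng.integers(0, len(X_tr), len(X_tr))    # bootstrap sample
    forests.append(RandomForestClassifier(n_estimators=50, random_state=seed)
                   .fit(X_tr[idx], y_tr[idx]))

votes = np.stack([f.predict(X_te) for f in forests])          # (5, n_test)
majority = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, votes)
print((majority == y_te).mean())                   # accuracy of the meta ensemble
```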
768 Some Clopen Sets in the Uniform Topology on BCI-algebras
Authors: A. Hasankhani, H. Saadat, M. M. Zahedi
Abstract:
In this paper, some properties of the uniform topology on BCI-algebras are discussed.
Keywords: (Fuzzy) ideal, (Fuzzy) subalgebra, Uniformity, clopen sets.
767 Segmentation of Cardiac Images by the Force Field Driven Speed Term
Authors: Renato Dedic, Madjid Allili, Roger Lecomte, Abdelhamid Benchakroun
Abstract:
The class of geometric deformable models, so-called level sets, has brought tremendous impact to medical imagery. In this paper we present yet another application of level sets to medical imaging. The method given here modifies, in a way, the speed term in the standard level set equation of motion. To do so we build a potential based on the distance and the gradient of the image under study. In turn, the potential gives rise to the force field F(x, y) = Σ_{(p,q)∈I} ((x, y) − (p, q)) |∇I(p, q)| / |(x, y) − (p, q)|². The direction and intensity of the force field at each point determine the direction of the contour's evolution. The images we used to test our method were produced by the Université de Sherbrooke's PET scanners.
Keywords: PET, Cardiac, Heart, Mouse, Geodesic, Geometric, Level Sets, Deformable Models, Edge Detection, Segmentation.
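A direct, brute-force evaluation of the force field defined above, with a toy image and query point; a practical implementation would vectorize the sum or restrict it to a neighbourhood.

```python
import numpy as np

# F(x, y) = sum over pixels (p, q) of ((x, y) - (p, q)) * |grad I(p, q)|
#           / |(x, y) - (p, q)|^2
def force_field_at(image, x, y):
    gy, gx = np.gradient(image.astype(float))
    grad_mag = np.hypot(gx, gy)
    F = np.zeros(2)
    for p in range(image.shape[0]):
        for q in range(image.shape[1]):
            d = np.array([x - p, y - q], dtype=float)
            dist2 = d @ d
            if dist2 > 0:                      # skip the point itself
                F += d * grad_mag[p, q] / dist2
    return F

img = np.zeros((32, 32)); img[12:20, 12:20] = 1.0   # toy bright blob
print(force_field_at(img, 5, 5))
```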
766 Low Cost Chip Set Selection Algorithm for Multi-way Partitioning of Digital System
Authors: Jae Young Park, Soongyu Kwon, Kyu Han Kim, Hyeong Geon Lee, Jong Tae Kim
Abstract:
This paper considers the problem of finding a low cost chip set for a minimum cost partitioning of large logic circuits. Chip sets are selected from a given library. Each chip in the library has a different price, area, and I/O pin count. We propose a low cost chip set selection algorithm. Inputs to the algorithm are a netlist and the chip information in the library. The output is a list of chip sets that satisfy the area and maximum partitioning number constraints, sorted by cost. The algorithm finds the sorted list of chip sets from minimum cost to maximum cost. We used MCNC benchmark circuits for experiments. The experimental results show that all of the chip sets found satisfy the multi-way partitioning constraints.
Keywords: lowest cost chip set, MCNC benchmark, multi-way partitioning.
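A toy version of the selection step: enumerate multisets of chips from a small library, keep those whose total area covers the design within the maximum partition count, and report them sorted by cost. The library, the area figure and the feasibility test are simplified stand-ins for the netlist-driven check in the paper.

```python
from itertools import combinations_with_replacement

library = {                      # name: (price, area, io_pins)
    "A": (10, 40, 64),
    "B": (18, 90, 128),
    "C": (30, 160, 200),
}
design_area, max_parts = 150, 3

chip_sets = []
for k in range(1, max_parts + 1):
    for combo in combinations_with_replacement(library, k):
        cost = sum(library[c][0] for c in combo)
        area = sum(library[c][1] for c in combo)
        if area >= design_area:             # simplified feasibility check
            chip_sets.append((cost, combo))

for cost, combo in sorted(chip_sets):
    print(cost, combo)                      # cheapest feasible chip sets first
```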
765 Character Segmentation Method for a License Plate with Topological Transform
Authors: Jaedo Kim, Youngjoon Han, Hernsoo Hahn
Abstract:
This paper proposes a robust character segmentation method for license plates subject to topological transforms such as twist and rotation. The first step of the proposed method is to find candidate regions for characters and the license plate. A character or license plate must appear as a closed loop in the edge image. When detecting candidates for character regions, the detected regions are evaluated using the topological relationship between characters. When the method decides the license plate candidate region, character features in the binarized region are used. After binarization of the detected candidate region, each character region is decided again. In this step, each character region is fitted more tightly than in the previous step. In the next step, the method checks other character regions at different scales near the detected character regions, because most license plates have license numbers with some meaningful characters around them. The method uses perspective projection for geometrical normalization. If there is topological distortion in the character region, the method projects the region onto a template defined as a standard license plate using perspective projection. In this step, the method is able to separate each number region and the small meaningful characters. The method is evaluated on a number of test images.
Keywords: License Plate Detection, Character Segmentation, Perspective Projection, Topological Transform.
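A sketch of the geometric normalization step using perspective projection with OpenCV: the four detected corners of a twisted or rotated plate region are mapped onto a standard rectangular template. The corner coordinates, template size and image are placeholders for the detector's output.

```python
import cv2
import numpy as np

src = np.float32([[112, 60], [348, 74], [342, 158], [105, 141]])  # detected corners
w, h = 320, 96                                                    # template size
dst = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])

M = cv2.getPerspectiveTransform(src, dst)          # 3x3 homography
plate_img = np.zeros((480, 640), dtype=np.uint8)   # stand-in input image
normalized = cv2.warpPerspective(plate_img, M, (w, h))
print(normalized.shape)                            # (96, 320)
```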
764 A New Objective Weight on Interval Type-2 Fuzzy Sets
Authors: Nurnadiah Z., Lazim A.
Abstract:
The design of weights is one of the important parts of fuzzy decision making, as it has a deep effect on the evaluation results. Entropy is one of the weight measures based on objective evaluation. Non-probabilistic entropy measures for fuzzy sets and interval type-2 fuzzy sets (IT2FS) have been developed and applied to weight measurement. Since entropy for IT2FS in decision making is yet to be explored, this paper proposes a new objective weight method using the entropy weight method for multiple attribute decision making (MADM). This paper utilizes the nature of the IT2FS concept in the evaluation process to assess the attribute weights based on the credibility of the data. An example is presented to demonstrate the feasibility of the new method in decision making. The entropy measure of interval type-2 fuzzy sets yields flexible judgments and can be applied in decision making environments.
Keywords: Objective weight, entropy weight, multiple attribute decision making, type-2 fuzzy sets, interval type-2 fuzzy sets.
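For reference, the classical crisp entropy-weight calculation that the paper extends to interval type-2 fuzzy sets; the IT2FS entropy itself is not reproduced, and the decision matrix is made up.

```python
import numpy as np

# Rows are alternatives, columns are attributes.
X = np.array([[7.0, 0.6, 120.0],
              [5.0, 0.9,  90.0],
              [9.0, 0.4, 150.0]])

P = X / X.sum(axis=0)                               # column-wise normalization
m = X.shape[0]
entropy = -(P * np.log(P)).sum(axis=0) / np.log(m)  # entropy per attribute
weights = (1 - entropy) / (1 - entropy).sum()       # higher dispersion -> higher weight
print(weights, weights.sum())                       # objective weights, sum to 1
```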
763 Comparative Study of Intuitionistic and Generalized Neutrosophic Soft Sets
Authors: Debabrata Mandal
Abstract:
The aim of this paper is to define several operations, such as intersection, union, OR and AND, on intuitionistic (resp. generalized) neutrosophic soft sets in the sense of Maji, and to compare these with intuitionistic (resp. generalized) neutrosophic soft sets in the sense of Said et al. via examples. At the end of the paper, a new concept, extension, is introduced, which can be used to refine our choices in case of decision making.
Keywords: AND, OR, Union, Intersection, Extension, Decision making.
762 Oxygen-Interstitials and Group-V Element Doping for p-Type ZnO
Authors: A. M. Gsiea, J. P. Goss, P. R. Briddon, K. M. Etmimi
Abstract:
In realizing devices using ZnO, a key challenge is the production of p-type material. Substitution of oxygen by a group-V impurity is thought to result in deep acceptor levels, but a complex made up of a group-V impurity (P, As, Sb) on a Zn site coupled with two vacant Zn sites is widely viewed as a candidate. We show, using density-functional simulations, that in contrast to such a view, complexes involving oxygen interstitials are energetically more favorable, resulting in group-V impurities coordinated with four, five or six oxygen atoms.
Keywords: DFT, Oxygen, p-Type, ZnO.
761 Uncertainty Multiple Criteria Decision Making Analysis for Stealth Combat Aircraft Selection
Authors: C. Ardil
Abstract:
Fuzzy set theory and its extensions (intuitionistic fuzzy sets, picture fuzzy sets, and neutrosophic sets) have been widely used to address imprecision and uncertainty in complex decision making. However, they may struggle with inherent indeterminacy and inconsistency in real-world situations. This study introduces uncertainty sets as a promising alternative, offering a structured framework for incorporating both types of uncertainty into decision-making processes. This work explores the theoretical foundations and applications of uncertainty sets. A novel decision-making algorithm based on uncertainty set-based proximity measures is developed and demonstrated through a practical application: selecting the most suitable stealth combat aircraft. The results highlight the effectiveness of uncertainty sets in ranking alternatives under uncertainty. Uncertainty sets offer several advantages, including structured uncertainty representation, robust ranking mechanisms, and enhanced decision-making capabilities due to their ability to account for ambiguity. Future research directions are also outlined, including comparative analysis with existing MCDM methods under uncertainty, sensitivity analysis to assess the robustness of rankings, and broader application to various MCDM problems with diverse complexities. By exploring these avenues, uncertainty sets can be further established as a valuable tool for navigating uncertainty in complex decision-making scenarios.
Keywords: Uncertainty set, stealth combat aircraft selection, multiple criteria decision-making analysis, MCDM, uncertainty proximity analysis.
760 3D Brain Tumor Segmentation Using Level-Sets Method and Meshes Simplification from Volumetric MR Images
Authors: K. Aloui, M. S. Naceur
Abstract:
The main objective of this paper is to provide an efficient tool for delineating brain tumors in three-dimensional magnetic resonance images. To achieve this goal, we basically use a level-set approach to delineate three-dimensional brain tumors. We then introduce a compression plan for the 3D brain structures based on mesh simplification, adapted to the specific needs of telemedicine and to the restricted capacities of network communication. We present here the main stages of our system and preliminary results, which are very encouraging for clinical practice.
Keywords: Medical imaging, level-sets, compression, meshes simplification, telemedicine.
759 Imputation Technique for Feature Selection in Microarray Data Set
Authors: Younies Mahmoud, Mai Mabrouk, Elsayed Sallam
Abstract:
Analyzing DNA microarray data sets is a great challenge for bioinformaticians due to the complexity of applying statistical and machine learning techniques. The challenge is doubled if the microarray data sets contain missing data, which happens regularly, because these techniques cannot deal with missing data. One of the most important data analysis processes on a microarray data set is feature selection. This process finds the most important genes that affect a certain disease. In this paper, we introduce a technique for imputing the missing data in microarray data sets while performing feature selection.
Keywords: DNA microarray, feature selection, missing data, bioinformatics.
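A sketch of the pipeline shape implied by the abstract: impute missing expression values (here with the per-gene mean, the simplest choice) and then rank genes by a univariate score (here variance). The paper's combined imputation-and-selection technique is more involved; data and thresholds are placeholders.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.random((30, 200))                        # 30 samples x 200 genes
X[rng.random(X.shape) < 0.05] = np.nan           # 5% missing values

col_mean = np.nanmean(X, axis=0)                 # per-gene mean, ignoring NaNs
missing_rows, missing_cols = np.where(np.isnan(X))
X[missing_rows, missing_cols] = col_mean[missing_cols]

top_k = 10
scores = X.var(axis=0)                           # univariate gene score
selected_genes = np.argsort(scores)[::-1][:top_k]
print(selected_genes)                            # indices of the top-ranked genes
```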