Search results for: Algorithms decision tree
2537 A Survey of Various Algorithms for VLSI Physical Design
Authors: Rajine Swetha R, B. Shekar Babu, Sumithra Devi K.A
Abstract:
Electronic systems are at the core of everyday life. They form an integral part of financial networks, mass transit, telephone systems, power plants and personal computers. Electronic systems are increasingly based on complex VLSI (Very Large Scale Integration) integrated circuits. Electronic design automation is concerned with the design and production of VLSI systems. The next important step in creating a VLSI circuit is physical design. The input to physical design is a logical representation of the system under design. The output of this step is the layout of a physical package that optimally or near-optimally realizes the logical representation. Physical design problems are combinatorial in nature and of large problem size. Darwin observed that, as variations are introduced into a population with each new generation, the less-fit individuals tend to die out in the competition for basic necessities; this survival-of-the-fittest principle leads to the evolution of species. The objective of Genetic Algorithms (GAs) is to find an optimal solution to a problem. Since GAs are heuristic procedures that can function as optimizers, they are not guaranteed to find the optimum, but they are able to find acceptable solutions for a wide range of problems. This survey paper presents a study of efficient algorithms for VLSI physical design and observes the common traits of the superior contributions.
Keywords: Genetic Algorithms, Physical Design, VLSI.
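To make the GA framework referenced above concrete, the following is a minimal sketch of a generic GA loop; the binary encoding, tournament selection, and the stand-in fitness function are illustrative placeholders and do not correspond to any particular VLSI placement or routing formulation from the surveyed papers.

```python
import random

def genetic_algorithm(fitness, length=20, pop_size=50, generations=100, p_mut=0.02):
    """Minimal generic GA: binary encoding, tournament selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            return max(random.sample(pop, 3), key=fitness)   # survival of the fittest among 3
        children = []
        while len(children) < pop_size:
            a, b = tournament(), tournament()
            cut = random.randint(1, length - 1)               # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # bit-flip mutation
            children.append(child)
        pop = children
    return max(pop, key=fitness)

# Placeholder fitness: count of ones (a stand-in for a real placement/routing cost)
best = genetic_algorithm(fitness=sum)
print(best, sum(best))
```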
2536 Application of Intuitionistic Fuzzy Cross Entropy Measure in Decision Making for Medical Diagnosis
Authors: Shikha Maheshwari, Amit Srivastava
Abstract:
In medical investigations, uncertainty is a major challenge for doctors/experts in deciding among diseases that share a common set of symptoms, and it has been increasing extensively in medical diagnosis problems. The theory of cross entropy for intuitionistic fuzzy sets (IFS) is an effective approach to coping with uncertainty in decision making for medical diagnosis problems. The main focus of this paper is to propose a new intuitionistic fuzzy cross entropy measure (IFCEM), which aids in reducing uncertainty so that doctors/experts can more easily reach a decision about a patient's disease. It is shown that the proposed measure has some elegant properties, which demonstrate its potency. Further, the efficiency and utility of the proposed measure are exemplified in detail using a real-life case study of disease diagnosis in medical science.
Keywords: Intuitionistic fuzzy cross entropy measure (IFCEM), intuitionistic fuzzy set (IFS), medical diagnosis, uncertainty.
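The proposed IFCEM itself is not reproduced in the abstract; for orientation only, one widely cited intuitionistic fuzzy cross entropy from the earlier literature (due to Vlachos and Sergiadis) between IFSs A and B on X = {x_1, ..., x_n} has the form below, where mu and nu denote membership and non-membership degrees. The measure proposed in this paper may differ.

```latex
C_{IFS}(A,B) \;=\; \sum_{i=1}^{n}\left[
  \mu_A(x_i)\,\ln\frac{\mu_A(x_i)}{\tfrac{1}{2}\big(\mu_A(x_i)+\mu_B(x_i)\big)}
  \;+\; \nu_A(x_i)\,\ln\frac{\nu_A(x_i)}{\tfrac{1}{2}\big(\nu_A(x_i)+\nu_B(x_i)\big)}
\right]
```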
2535 Using the PARIS Method for Multiple Criteria Decision Making in Unmanned Combat Aircraft Evaluation and Selection
Authors: C. Ardil
Abstract:
Unmanned combat aircraft (UCA) are expanding significantly in several defense industries, along with artificial intelligence improvements in highly precise technology. UCA are crucial in military settings for targeting enemy elements and objects, and are also utilized for highly precise reconnaissance and surveillance tasks. To select the best alternative for critical missions, a methodical and effective strategy for UCA selection is required. Multiple criteria decision-making (MCDM) methodologies are ideally equipped to handle the complexity of alternative aircraft selection. To analyze UCA alternatives for the selection process, an integrated methodology built on objective criteria weights and preference analysis for reference ideal solution (PARIS) is applied. First, the weights of the essential elements are determined using the average weight (AW), standard deviation weight (SW) and entropy weight (EW) approaches. The weights of the evaluation criteria affect the decision-making process. The aircraft choices in the decision problem are then ranked using the objective criteria weights along with the PARIS technique. The validation and sensitivity analysis of the proposed MCDM approach are discussed.
Keywords: unmanned combat aircraft (UCA), multiple criteria decision making, MCDM, PARIS
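Of the three weighting schemes named in the abstract, the entropy weight (EW) step admits a compact sketch; the UCA decision matrix below is dummy data, and the AW/SW components and the PARIS ranking itself are not shown.

```python
import numpy as np

def entropy_weights(X):
    """Objective entropy weights for a decision matrix X (alternatives x criteria, positive benefit-type values)."""
    P = X / X.sum(axis=0)                       # column-wise proportions
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)        # entropy of each criterion (P is strictly positive here)
    d = 1.0 - E                                 # degree of divergence
    return d / d.sum()                          # normalized objective weights

# Dummy UCA decision matrix: 4 alternatives x 3 benefit criteria (range, payload, sensor score)
X = np.array([[1500.0, 2.7, 0.90],
              [1350.0, 2.3, 0.95],
              [1800.0, 3.1, 0.85],
              [1600.0, 2.9, 0.92]])
print(entropy_weights(X))
```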
2534 Prediction Modeling of Alzheimer’s Disease and Its Prodromal Stages from Multimodal Data with Missing Values
Authors: M. Aghili, S. Tabarestani, C. Freytes, M. Shojaie, M. Cabrerizo, A. Barreto, N. Rishe, R. E. Curiel, D. Loewenstein, R. Duara, M. Adjouadi
Abstract:
A major challenge in medical studies, especially those that are longitudinal, is the problem of missing measurements, which hinders the effective application of many machine learning algorithms. Furthermore, recent Alzheimer's disease studies have focused on the delineation of Early Mild Cognitive Impairment (EMCI) and Late Mild Cognitive Impairment (LMCI) from cognitively normal controls (CN), which is essential for developing effective and early treatment methods. To address the aforementioned challenges, this paper explores the potential of using the eXtreme Gradient Boosting (XGBoost) algorithm to handle missing values in multiclass classification. We seek a generalized classification scheme where all prodromal stages of the disease are considered simultaneously in the classification and decision-making processes. Given the large number of subjects (1631) included in this study and in the presence of almost 28% missing values, we investigated the performance of XGBoost on the classification of the four classes AD, CN, EMCI, and LMCI. Using a 10-fold cross-validation technique, XGBoost is shown to outperform other state-of-the-art classification algorithms by 3% in terms of accuracy and F-score. Our model achieved an accuracy of 80.52%, a precision of 80.62% and a recall of 80.51%, supporting the more natural and promising multiclass classification.
Keywords: eXtreme Gradient Boosting, missing data, Alzheimer disease, early mild cognitive impairment, late mild cognitive impairment, multiclass classification, ADNI, support vector machine, random forest.
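A minimal sketch of the described setup (XGBoost on a four-class problem with missing values and 10-fold cross-validation); the synthetic data below stands in for the ADNI features, and the hyperparameters are illustrative, not the authors' tuned values.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 20))
X[rng.random(X.shape) < 0.28] = np.nan          # ~28% missing values, handled natively by XGBoost
y = rng.integers(0, 4, size=400)                # four classes standing in for AD, CN, EMCI, LMCI

model = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1,
                      objective="multi:softprob", eval_metric="mlogloss")
scores = cross_val_score(model, X, y,
                         cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=0),
                         scoring="accuracy")
print("mean 10-fold accuracy:", scores.mean())
```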
2533 The Riemann Barycenter Computation and Means of Several Matrices
Authors: Miklos Palfia
Abstract:
An iterative definition of any n-variable mean function is given in this article, which iteratively uses the two-variable form of the corresponding mean function. This extension method avoids recursion, which is an important improvement compared with certain recursive formulas given earlier by Ando-Li-Mathias and Petz-Temesi. Furthermore, it is conjectured here that this iterative algorithm coincides with the solution of the Riemann centroid minimization problem. Simulations are given to compare the convergence rates of the different algorithms in the literature. These algorithms are the gradient and Newton methods for the Riemann centroid computation.
Keywords: Means, matrix means, operator means, geometric mean, Riemannian center of mass.
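For context, the two-variable geometric mean from which such iterative constructions start, and the Riemannian (Karcher) barycenter problem the article conjectures they solve, are standard and take the forms below; the article's specific n-variable iteration is not reproduced here.

```latex
A \,\#\, B \;=\; A^{1/2}\!\left(A^{-1/2} B A^{-1/2}\right)^{1/2}\! A^{1/2},
\qquad
\Lambda(A_1,\dots,A_n) \;=\; \arg\min_{X \succ 0} \sum_{i=1}^{n} \delta^{2}(X, A_i),
\quad
\delta(X,A) \;=\; \big\lVert \log\!\big(X^{-1/2} A X^{-1/2}\big) \big\rVert_{F}.
```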
2532 Investigation on Novel Based Naturally-Inspired Swarm Intelligence Algorithms for Optimization Problems in Mobile Ad Hoc Networks
Authors: C. Rajan, K. Geetha, C. Rasi Priya, S. Geetha
Abstract:
Nature is an immensely gifted source for solving complex problems; it always helps to find the optimal solution. A Mobile Ad Hoc NETwork (MANET) is a wide research area of networking consisting of sets of independent nodes. MANETs are characterized by dynamic topology, high mobility, and independence from any fixed infrastructure or centralized network. Bio-inspired algorithms mimic nature to solve optimization problems, opening a new era in MANETs. Typical Swarm Intelligence (SI) algorithms are Ant Colony Optimization (ACO), Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), the Modified Termite Algorithm, the Bat Algorithm (BA), the Wolf Search Algorithm (WSA) and so on. This work mainly concentrates on the nature of MANETs and the behavior of nodes, and it analyzes various performance metrics such as throughput, QoS and end-to-end delay.
Keywords: Ant Colony Algorithm, Artificial Bee Colony algorithm, Bio-Inspired algorithm, Modified Termite Algorithm.
2531 Improving University Operations with Data Mining: Predicting Student Performance
Authors: Mladen Dragičević, Mirjana Pejić Bach, Vanja Šimičević
Abstract:
The purpose of this paper is to develop models that enable the prediction of student success. These models could improve the allocation of students among colleges and optimize the newly introduced model of government subsidies for higher education. For the purpose of collecting data, an anonymous survey was carried out among the final-year undergraduate student population using a random sampling method. Decision trees were created, of which the two most successful in predicting student success were chosen, based on two criteria: Grade Point Average (GPA) and the time a student needs to finish the undergraduate program (time-to-degree). Decision trees have been shown to be a good method for classifying student success, and they could be improved further by increasing the survey sample and developing specialized decision trees for each type of college. These types of methods have great potential for use in decision support systems.
Keywords: Data mining, knowledge discovery in databases, prediction models, student success.
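A minimal sketch of the kind of decision-tree model described, using scikit-learn; the survey features below (entrance score, study hours, employment) and the binary GPA-class target are placeholders, not the authors' actual questionnaire variables.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
n = 300
X = np.column_stack([rng.uniform(40, 100, n),      # placeholder: entrance exam score
                     rng.uniform(0, 40, n),        # placeholder: weekly study hours
                     rng.integers(0, 2, n)])       # placeholder: employed during studies
# Placeholder target: 1 = high GPA, generated from the features plus noise
y = (0.04 * X[:, 0] + 0.05 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 1, n) > 3.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
tree = DecisionTreeClassifier(max_depth=3, random_state=1).fit(X_tr, y_tr)
print("test accuracy:", tree.score(X_te, y_te))
print(export_text(tree, feature_names=["entrance_score", "study_hours", "employed"]))
```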
2530 An Enhanced Distributed System to Improve the Time Complexity of Binary Indexed Trees
Authors: Ahmed M. Elhabashy, A. Baes Mohamed, Abou El Nasr Mohamad
Abstract:
Distributed computing systems are usually considered the most suitable model for practical solutions of many parallel algorithms. In this paper an enhanced distributed system is presented to improve the time complexity of Binary Indexed Trees (BIT). The proposed system uses multiple uniform processors with identical architectures and a specially designed distributed memory system. The analysis of this system has shown that it reduces the time complexity of the read query to O(Log(Log(N))) and the update query to constant complexity, while the naive solution has a time complexity of O(Log(N)) for both queries. The system was implemented and simulated using the VHDL and Verilog hardware description languages, with Xilinx ISE 10.1 as the development environment and ModelSim 6.1c as the simulation tool. The simulation has shown that the overhead resulting from the wiring and communication between the system fragments can be fairly neglected, which makes it practical to reach the maximum speedup offered by the proposed model.
Keywords: Binary Indexed Tree (BIT), Least Significant Bit (LSB), Parallel Adder (PA), Very High Speed Integrated Circuits Hardware Description Language (VHDL), Distributed Parallel Computing System (DPCS).
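For reference, the naive (non-distributed) Binary Indexed Tree baseline that the paper improves on, with O(log N) point update and prefix-sum query, looks like this; the distributed O(log log N) design itself is hardware-level and is not sketched here.

```python
class BinaryIndexedTree:
    """Classic Fenwick tree: O(log N) point update and prefix-sum query."""
    def __init__(self, n):
        self.n = n
        self.tree = [0] * (n + 1)

    def update(self, i, delta):          # add delta at 1-based index i
        while i <= self.n:
            self.tree[i] += delta
            i += i & (-i)                # jump to the next node covering index i

    def query(self, i):                  # sum of elements 1..i
        s = 0
        while i > 0:
            s += self.tree[i]
            i -= i & (-i)                # strip the least significant bit
        return s

bit = BinaryIndexedTree(8)
for idx, val in enumerate([3, 1, 4, 1, 5, 9, 2, 6], start=1):
    bit.update(idx, val)
print(bit.query(5))                      # 3 + 1 + 4 + 1 + 5 = 14
```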
2529 Evaluation of Robust Feature Descriptors for Texture Classification
Authors: Jia-Hong Lee, Mei-Yi Wu, Hsien-Tsung Kuo
Abstract:
Texture is an important characteristic in real and synthetic scenes. Texture analysis plays a critical role in inspecting surfaces and provides important techniques for a variety of applications. Although several descriptors have been presented to extract texture features, the development of object recognition is still a difficult task due to the complex aspects of texture. Recently, many robust and scaling-invariant image features such as SIFT, SURF and ORB have been successfully used in image retrieval and object recognition. In this paper, we compare the performance of these feature descriptors for texture classification using k-means clustering. Different classifiers including K-NN, Naive Bayes, Back Propagation Neural Network, Decision Tree and KStar were applied to three texture image sets - UIUCTex, KTH-TIPS and Brodatz, respectively. Experimental results reveal SIFT as the holder of the best average accuracy rate on UIUCTex and KTH-TIPS, while SURF has the advantage on the Brodatz texture set. The BP neural network works best in test-set classification among all the classifiers used.
Keywords: Texture classification, texture descriptor, SIFT, SURF, ORB.
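A rough sketch of one descriptor-plus-k-means pipeline of the kind evaluated, using ORB descriptors, a k-means visual vocabulary, and a K-NN classifier; the synthetic noise/checkerboard textures, vocabulary size, and classifier settings are placeholders for the UIUCTex, KTH-TIPS and Brodatz experiments.

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(2)
orb = cv2.ORB_create(nfeatures=300)

def synth_texture(cls):
    """Stand-in textures (noise vs. checkerboard); real experiments would load texture-set images instead."""
    if cls == 0:
        return rng.integers(0, 256, (128, 128), dtype=np.uint8)
    return (np.kron(rng.integers(0, 2, (16, 16)), np.ones((8, 8))) * 255).astype(np.uint8)

def descriptors(img):
    _, desc = orb.detectAndCompute(img, None)
    return desc if desc is not None else np.zeros((1, 32), np.uint8)

labels = [0, 1] * 10
all_desc = [descriptors(synth_texture(c)) for c in labels]

k = 32                                                    # visual vocabulary size (assumption)
kmeans = KMeans(n_clusters=k, n_init=10, random_state=2).fit(np.vstack(all_desc).astype(np.float64))

def bow(desc):
    """Quantize descriptors against the vocabulary and return a normalized bag-of-words histogram."""
    hist = np.bincount(kmeans.predict(desc.astype(np.float64)), minlength=k).astype(float)
    return hist / max(hist.sum(), 1.0)

X = np.array([bow(d) for d in all_desc])
clf = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
print("training-set accuracy:", clf.score(X, labels))
```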
2528 Decision Support for the Selection of Electric Power Plants Generated from Renewable Sources
Authors: Aumnad Phdungsilp, Teeradej Wuttipornpun
Abstract:
Decision support based upon risk analysis of the comparison of electricity generation from different renewable energy technologies can provide information about their effects on the environment and society. The aim of this paper is to develop an assessment framework regarding the risks to health and the environment, and the benefits to society, of electric power plant generation from different renewable sources. A multicriteria framework combining the multiattribute risk analysis technique and the decision analysis interview technique is applied in order to support the decision-making process for implementing renewable energy projects in the Bangkok case study. Having analyzed the local conditions and appropriate technologies, five renewable power plants are postulated as options. As this work demonstrates, the analysis can provide a tool to aid decision-makers in achieving targets related to promoting a sustainable energy system.
Keywords: Analytic Hierarchy Process, Bangkok, Multiattribute Risk Analysis, Renewable Energy Technology.
2527 Word Stemming Algorithms and Retrieval Effectiveness in Malay and Arabic Documents Retrieval Systems
Authors: Tengku Mohd T. Sembok
Abstract:
Document retrieval in Information Retrieval Systems (IRS) is generally about understanding the information in the documents concerned. The more the system is able to understand the contents of documents, the more effective the retrieval outcomes will be. But understanding the contents is a very complex task. Conventional IRS apply algorithms that can only approximate the meaning of document contents through a keyword approach using the vector space model. Keywords may be unstemmed or stemmed. When keywords are stemmed and conflated in the retrieval process, we take a step forward in applying semantic technology in IRS. Word stemming is a process in morphological analysis under natural language processing, preceding syntactic and semantic analysis. We have developed algorithms for Malay and Arabic and incorporated stemming in our experimental systems in order to measure retrieval effectiveness. The results have shown that retrieval effectiveness increases when stemming is used in the systems.
Keywords: Information Retrieval, Natural Language Processing, Artificial Intelligence.
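A minimal sketch of a vector-space retrieval pipeline with a stemming hook; the crude English suffix-stripper below is only a placeholder for the authors' Malay and Arabic stemmers, and the documents and query are dummy data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def toy_stem(token):
    """Crude suffix stripping; a real system would plug in the Malay/Arabic morphological stemmers here."""
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) > len(suffix) + 2:
            return token[: -len(suffix)]
    return token

def stem_analyzer(text):
    return [toy_stem(t) for t in text.lower().split()]

docs = ["retrieving documents effectively requires stemming keywords",
        "semantic analysis follows morphological and syntactic analysis",
        "keyword conflation improves retrieval effectiveness"]
query = ["stemmed keywords improve document retrieval"]

vec = TfidfVectorizer(analyzer=stem_analyzer)       # stemmed terms feed the vector space model
D = vec.fit_transform(docs)
Q = vec.transform(query)
scores = cosine_similarity(Q, D).ravel()
print(sorted(zip(scores, docs), reverse=True)[0])   # best-matching document and its score
```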
2526 Apoptosis Inspired Intrusion Detection System
Authors: R. Sridevi, G. Jagajothi
Abstract:
Artificial Immune Systems (AIS), inspired by the human immune system, are algorithms and mechanisms which are self-adaptive and self-learning classifiers capable of recognizing and classifying by learning, long-term memory and association. Unlike other human-system-inspired techniques like genetic algorithms and neural networks, AIS includes a range of algorithms modeled on different immune mechanisms of the body. In this paper, a mechanism of the human immune system based on apoptosis is adopted to build an Intrusion Detection System (IDS) to protect computer networks. Features are selected from network traffic using the Fisher Score. Based on the selected features, the record/connection is classified as either an attack or normal traffic by the proposed methodology. Simulation results demonstrate that the proposed apoptosis-based AIS performs better than existing AIS approaches for intrusion detection.
Keywords: Apoptosis, Artificial Immune System (AIS), Fisher Score, KDD dataset, Network intrusion detection.
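The Fisher Score ranking used for feature selection has a standard closed form; a small sketch on dummy traffic features follows (the KDD preprocessing and the apoptosis-based classifier itself are not shown).

```python
import numpy as np

def fisher_scores(X, y):
    """Fisher score per feature: between-class scatter of class means over within-class variance."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        den += len(Xc) * Xc.var(axis=0)
    return num / np.maximum(den, 1e-12)

rng = np.random.default_rng(3)
y = rng.integers(0, 2, 200)                               # dummy labels: 0 = normal, 1 = attack
X = rng.normal(size=(200, 5))
X[:, 0] += 2.0 * y                                        # make feature 0 informative
print(np.argsort(fisher_scores(X, y))[::-1])              # feature indices ranked best-first
```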
2525 On the Fast Convergence of DD-LMS DFE Using a Good Strategy Initialization
Authors: Y. Ben Jemaa, M. Jaidane
Abstract:
In wireless communication systems, a Decision Feedback Equalizer (DFE) is required to cancel intersymbol interference (ISI). In this paper, an exact convergence analysis of the DFE adapted by the Least Mean Square (LMS) algorithm during the training phase is derived by taking into account the finite-alphabet context of data transmission. This allows us to determine the shortest training sequence that reaches a given Mean Square Error (MSE). With the intention of avoiding the problem of ill-convergence, the paper proposes an initialization strategy for the blind decision-directed (DD) algorithm. This then yields a semi-blind DFE with high speed and good convergence.
Keywords: Adaptive Decision Feedback Equalizer, Performance Analysis, Finite Alphabet Case, Ill-Convergence, Convergence Speed.
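For reference, the LMS adaptation that drives the DFE during training takes the standard form below (real-valued form; the complex case conjugates u(n)), where u(n) stacks the received samples in the feedforward filter and the past decisions in the feedback filter; the paper's exact convergence analysis and DD initialization strategy are not reproduced.

```latex
y(n) = \mathbf{w}^{T}(n)\,\mathbf{u}(n), \qquad
e(n) = d(n) - y(n), \qquad
\mathbf{w}(n+1) = \mathbf{w}(n) + \mu\, e(n)\,\mathbf{u}(n).
```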
2524 An Auxiliary Technique for Coronary Heart Disease Prediction by Analyzing ECG Based on ResNet and Bi-LSTM
Authors: Yang Zhang, Jian He
Abstract:
Heart disease is one of the leading causes of death in the world, and coronary heart disease (CHD) is one of the major heart diseases. The electrocardiogram (ECG) is widely used in the detection of heart diseases, but the traditional manual method of CHD prediction by analyzing ECG requires a great deal of professional knowledge from doctors. This paper presents a sliding window and the continuous wavelet transform (CWT) to transform ECG signals into images, and then ResNet and Bi-LSTM are introduced to build the ECG feature extraction network (namely ECGNet). Finally, an auxiliary system for CHD prediction was developed based on a modified ResNet18 and Bi-LSTM, and the public ECG dataset of CHD from MIMIC-3 was used to train and test the system. The experimental results show that the accuracy of the method is 83%, and the F1-score is 83%. Compared with the available methods for CHD prediction based on ECG, such as kNN, decision tree, VGGNet, etc., this method not only improves the prediction accuracy but also avoids the degradation phenomenon of deep learning networks.
Keywords: Bi-LSTM, CHD, coronary heart disease, ECG, electrocardiogram, ResNet, sliding window.
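A rough structural sketch of a ResNet18-plus-Bi-LSTM network of the kind described, in PyTorch; the input shapes, hidden size, and two-class head are assumptions, and the CWT scalogram preprocessing, the authors' ResNet18 modifications, and the MIMIC-3 data handling are omitted.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class ECGNetSketch(nn.Module):
    """Sketch: ResNet18 backbone per CWT scalogram window, Bi-LSTM over the window sequence."""
    def __init__(self, num_classes=2, hidden=128):
        super().__init__()
        backbone = models.resnet18(weights=None)             # older torchvision: pretrained=False
        self.cnn = nn.Sequential(*list(backbone.children())[:-1])  # drop the final fc, keep the 512-d pooled features
        self.lstm = nn.LSTM(input_size=512, hidden_size=hidden,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, x):                                    # x: (batch, seq, 3, H, W) scalogram windows
        b, s, c, h, w = x.shape
        feats = self.cnn(x.reshape(b * s, c, h, w)).reshape(b, s, 512)
        out, _ = self.lstm(feats)                            # (batch, seq, 2 * hidden)
        return self.fc(out[:, -1])                           # classify from the last time step

# Shape check with dummy data: 2 recordings x 5 windows of 64x64 scalograms
logits = ECGNetSketch()(torch.randn(2, 5, 3, 64, 64))
print(logits.shape)                                          # torch.Size([2, 2])
```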
2523 A Developmental Survey of Local Stereo Matching Algorithms
Authors: André Smith, Amr Abdel-Dayem
Abstract:
This paper presents an overview of the history and development of stereo matching algorithms. Details from its inception, up to relatively recent techniques are described, noting challenges that have been surmounted across these past decades. Different components of these are explored, though focus is directed towards the local matching techniques. While global approaches have existed for some time, and demonstrated greater accuracy than their counterparts, they are generally quite slow. Many strides have been made more recently, allowing local methods to catch up in terms of accuracy, without sacrificing the overall performance.
Keywords: Developmental survey, local stereo matching, stereo correspondence.
2522 3D Dense Correspondence for 3D Dense Morphable Face Shape Model
Authors: Tae in Seol, Sun-Tae Chung, Seongwon Cho
Abstract:
A realistic 3D face model is desired in various applications such as face recognition, games, avatars, and animations. Construction of a 3D face model is composed of 1) building a face shape model and 2) rendering the face shape model. Thus, building a realistic 3D face shape model is an essential step toward a realistic 3D face model. Recently, the 3D morphable model has been successfully introduced to deal with the variety of human face shapes. The 3D dense correspondence problem must first be resolved in order to construct a realistic 3D dense morphable face shape model. Several approaches to the 3D dense correspondence problem in 3D face modeling have been proposed previously; among them, optical flow based algorithms and TPS (Thin Plate Spline) based algorithms are representative. Optical flow based algorithms require texture information of faces, which is sensitive to variation in illumination. In the TPS based algorithms proposed so far, the TPS process is performed on the 2D projection of the 3D face data in cylindrical coordinates, not directly on the 3D face data, and thus errors due to distortion during the 2D TPS process may be inevitable. In this paper, we propose a new 3D dense correspondence algorithm for 3D dense morphable face shape modeling. The proposed algorithm does not need texture information and applies TPS directly to the 3D face data. Through the construction procedures, it is observed that the proposed algorithm constructs a realistic 3D morphable face shape model reliably and quickly.
Keywords: 3D Dense Correspondence, 3D Morphable Face Shape Model, 3D Face Modeling.
2521 Composite Kernels for Public Emotion Recognition from Twitter
Authors: Chien-Hung Chen, Yan-Chun Hsing, Yung-Chun Chang
Abstract:
The Internet has grown into a powerful medium for information dispersion and social interaction, which has led to the rapid growth of social media that allows users to easily post their emotions and perspectives regarding certain topics online. Our research aims at using natural language processing and text mining techniques to explore the public emotions expressed on Twitter by analyzing the sentiment behind tweets. In this paper, we propose a composite kernel method that integrates a tree kernel with the linear kernel to simultaneously exploit both the tree representation and the distributed emotion keyword representation to analyze the syntactic and content information in tweets. The experiment results demonstrate that our method can effectively detect the public emotion of tweets while outperforming the other compared methods.
Keywords: Public emotion recognition, natural language processing, composite kernel, sentiment analysis, text mining.
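One common way to integrate the two representations, assumed here for illustration, is the convex combination below, where K_tree is a parse-tree kernel, v_x is the distributed emotion-keyword vector of tweet x, and alpha in [0, 1] is a mixing weight; the exact tree kernel and combination used by the authors are not reproduced in the abstract.

```latex
K_{comp}(x, y) \;=\; \alpha\, K_{tree}(T_x, T_y) \;+\; (1-\alpha)\, \langle \mathbf{v}_x, \mathbf{v}_y \rangle .
```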
2520 Spread Spectrum Code Estimation by Genetic Algorithm
Authors: V. R. Asghari, M. Ardebilipour
Abstract:
In the context of spectrum surveillance, a method to recover the code of a spread spectrum signal is presented, where the receiver has no knowledge of the transmitter's spreading sequence. The approach is based on a genetic algorithm (GA), which is forced to model the received signal. Genetic algorithms (GAs) are well known for their robustness in solving complex optimization problems. Experimental results show that the method provides a good estimation, even when the signal power is below the noise power.
Keywords: Code estimation, genetic algorithms, spread spectrum.
2519 A Comparative Analysis of Multiple Criteria Decision Making Analysis Methods for Strategic, Tactical, and Operational Decisions in Military Fighter Aircraft Selection
Authors: C. Ardil
Abstract:
This paper considers a comparative analysis of multiple criteria decision making analysis methods for strategic, tactical, and operational decisions in military fighter aircraft selection for air force fleet planning. The evaluation criteria governing the decision analysis process are determined from the literature for three existing military combat aircraft. The military fighter aircraft selection problem is structured using the “preference analysis for reference ideal solution (PARIS)” approach in multiple criteria decision analysis (MCDMA). Systematic comparisons were made with existing MCDMA methods (PARIS and TOPSIS) to verify the stability and accuracy of the results obtained. The proposed integrated MCDMA systematic approach is expected to address the issues encountered in the aircraft selection process. The comparative analysis results show that the proposed method is an effective and accurate tool that can help analysts make better strategic, tactical, and operational decisions.
Keywords: aircraft, military fighter aircraft selection, multiple criteria decision making, multiple criteria decision making analysis, mean weight, entropy weight, MCDMA, PARIS, TOPSIS, Saab Gripen, Dassault Rafale, Eurofighter Typhoon
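As one of the two MCDMA baselines named, TOPSIS admits a compact sketch; the three-aircraft decision matrix and equal weights below are dummy values, and the PARIS procedure is not shown.

```python
import numpy as np

def topsis(X, w, benefit):
    """TOPSIS: vector-normalize, weight, measure distances to the ideal/anti-ideal, return closeness scores."""
    R = X / np.sqrt((X ** 2).sum(axis=0))          # vector normalization
    V = R * w                                      # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)                 # higher score = closer to the ideal solution

# Dummy matrix: rows = 3 fighter aircraft, columns = (range km, payload t, unit cost $M)
X = np.array([[3700.0, 9.5, 85.0],
              [3400.0, 9.0, 115.0],
              [2900.0, 7.5, 70.0]])
w = np.array([1/3, 1/3, 1/3])                      # placeholder equal weights
benefit = np.array([True, True, False])            # cost is a non-benefit criterion
print(topsis(X, w, benefit))
```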
2518 Constant Factor Approximation Algorithm for p-Median Network Design Problem with Multiple Cable Types
Authors: Chaghoub Soraya, Zhang Xiaoyan
Abstract:
This research presents the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. This problem has previously been addressed with a single cable type, for which a bifactor approximation algorithm exists. To the best of our knowledge, the algorithm proposed in this paper is the first constant approximation algorithm for p-median network design with multiple cable types. The addressed problem is a combination of two well-studied problems: the p-median problem and the network design problem. The introduced algorithm is a constant-factor random sampling approximation algorithm, conceived by using some random sampling techniques from the literature. It is based on a redistribution lemma from the literature and the Steiner tree problem as a subproblem. The algorithm is simple, and it relies on the notions of random sampling and probability. The proposed approach gives an approximation solution with one constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.
Keywords: Approximation algorithms, buy-at-bulk, combinatorial optimization, network design, p-median.
2517 Non-Population Search Algorithms for Capacitated Material Requirement Planning in Multi-Stage Assembly Flow Shop with Alternative Machines
Authors: Watcharapan Sukkerd, Teeradej Wuttipornpun
Abstract:
This paper aims to present non-population search algorithms, namely tabu search (TS), simulated annealing (SA) and variable neighborhood search (VNS), to minimize the total cost of the capacitated MRP problem in a multi-stage assembly flow shop with two alternative machines. There are three main steps in the algorithm. Firstly, an initial sequence of orders is constructed by a simple due-date-based dispatching rule. Secondly, the sequence of orders is repeatedly improved to reduce the total cost by applying TS, SA and VNS separately. Finally, the total cost is further reduced by optimizing the start time of each operation using a linear programming (LP) model. Parameters of the algorithms are tuned using real data from automotive companies. The results show that VNS significantly outperforms TS, SA and the existing algorithm.
Keywords: Capacitated MRP, non-population search algorithms, linear programming, assembly flow shop.
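Of the three non-population searches, simulated annealing has the shortest sketch; the swap-move neighborhood, toy due dates, and the stand-in tardiness cost below replace the paper's capacitated MRP cost model and LP timing step.

```python
import math
import random

random.seed(4)
due = [8, 3, 12, 6, 10, 4]          # dummy due dates for six orders
proc = [2, 3, 1, 4, 2, 3]           # dummy processing times

def cost(seq):
    """Toy stand-in cost: total tardiness of the order sequence on a single resource."""
    t, total = 0, 0
    for j in seq:
        t += proc[j]
        total += max(0, t - due[j])
    return total

def simulated_annealing(seq, T=10.0, cooling=0.95, iters=2000):
    best = cur = list(seq)
    for _ in range(iters):
        i, j = random.sample(range(len(cur)), 2)
        cand = list(cur)
        cand[i], cand[j] = cand[j], cand[i]                     # swap-move neighborhood
        delta = cost(cand) - cost(cur)
        if delta <= 0 or random.random() < math.exp(-delta / T):
            cur = cand                                          # accept improving or some worsening moves
        if cost(cur) < cost(best):
            best = list(cur)
        T *= cooling                                            # geometric cooling schedule
    return best

start = sorted(range(len(due)), key=lambda j: due[j])           # earliest-due-date dispatching rule as initial sequence
best = simulated_annealing(start)
print(best, cost(best))
```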
2516 Block Sorting: A New Characterization and a New Heuristic
Authors: Swapnoneel Roy, Ashok Kumar Thakur, Minhazur Rahman
Abstract:
The Block Sorting problem is to sort a given permutation by moving blocks. A block is defined as a substring of the given permutation which is also a substring of the identity permutation. Block Sorting has been proved to be NP-hard. Until now, two different 2-approximation algorithms have been presented for block sorting; these are the best known algorithms for Block Sorting to date. In this work we present a different characterization of Block Sorting in terms of a transposition cycle graph. We then suggest a heuristic, which we show to exhibit a 2-approximation performance guarantee for most permutations.
Keywords: Block Sorting, Optical Character Recognition, Genome Rearrangements, Sorting Primitives, Approximation Algorithms.
2515 Development of Decision Support System for House Evaluation and Purchasing
Authors: Chia-Yu Hsu, Julaimin Goh, Pei-Chann Chang
Abstract:
Home is important for Chinese people. Because the information regarding house attributes and surrounding environments is incomplete at most real estate agencies, most house buyers find it difficult to consider the overall factors effectively and can only search candidates by a sorting-based approach. This study aims to develop a decision support system for house purchasing, in which the surrounding facilities of each house are quantified. Then, all considered house factors and customer preferences are incorporated into the Simple Multi-Attribute Ranking Technique (SMART) to support the housing evaluation. To evaluate the validity of the proposed approach, an empirical study was conducted with a real estate agency. Based on the customer requirements and preferences, the proposed approach can identify better candidate houses by considering the overall house attributes and surrounding facilities.
Keywords: decision support system, real estate, decision analysis, housing evaluation, SMART
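A minimal sketch of SMART-style scoring of the kind described: each attribute is rescaled to a 0-1 value and combined with the buyer's preference weights; the candidate houses, attributes, and weights below are dummy values, not the system's actual data.

```python
import numpy as np

def smart_scores(X, weights, benefit):
    """SMART-style evaluation: min-max rescale each attribute to [0, 1], then take the weighted sum."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    V = (X - lo) / np.where(hi > lo, hi - lo, 1.0)
    V = np.where(benefit, V, 1.0 - V)              # invert cost-type attributes such as price
    return V @ weights

# Dummy candidates: (price in $1000s, floor area m^2, distance to school km, nearby facilities count)
X = np.array([[320.0, 95, 1.2, 8],
              [290.0, 80, 0.6, 5],
              [350.0, 110, 2.0, 12]])
weights = np.array([0.4, 0.3, 0.2, 0.1])           # placeholder customer preferences
benefit = np.array([False, True, False, True])
scores = smart_scores(X, weights, benefit)
print(scores, scores.argsort()[::-1])              # scores and candidate houses ranked best-first
```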
2514 Aircraft Selection Using Multiple Criteria Decision Making Analysis Method with Different Data Normalization Techniques
Authors: C. Ardil
Abstract:
This paper presents an original application of multiple criteria decision making analysis theory to the aircraft selection problem. The selection of an optimal, efficient and reliable fleet, network and operations planning policy is one of the most important factors in the aircraft selection problem. Given that decision making in aircraft selection involves the consideration of a number of conflicting criteria and possible solutions, such a selection can be considered as a multiple criteria decision making analysis problem. This study presents a new integrated approach to decision making by considering the multiple criteria utility theory and maximal regret minimization theory methods as well as aircraft technical, economic, and environmental aspects. The multiple criteria decision making analysis method uses different normalization techniques to allow criteria to be aggregated with the qualitative and quantitative data of the decision problem. Therefore, selecting a suitable normalization technique for the model is also a challenge in providing data aggregation for the aircraft selection problem. To compare the impact of different normalization techniques on the decision problem, the vector, linear (sum), linear (max), and linear (max-min) data normalization techniques were used to evaluate the aircraft selection problem. As a logical implication of the proposed approach, it enhances the decision making process by enabling the decision maker to: (i) use higher-level knowledge regarding the selection of criteria weights and the proposed technique, and (ii) estimate the ranking of an alternative under different data normalization techniques and integrated criteria weights after a posteriori analysis of the final rankings of alternatives. A set of commercial passenger aircraft were considered in order to illustrate the proposed approach. The results of the proposed approach were compared using Spearman's rho tests. An analysis of final rank stability with respect to changes in criteria weights was also performed so as to assess the sensitivity of the alternative rankings obtained by the application of different data normalization techniques and the proposed approach.
Keywords: Normalization Techniques, Aircraft Selection, Multiple Criteria Decision Making, Multiple Criteria Decision Making Analysis, MCDMA
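The four data normalization techniques compared are standard and can be sketched directly; X below is a dummy decision matrix of benefit-type criteria (cost-type criteria would use the complementary forms).

```python
import numpy as np

def vector_norm(X):       return X / np.sqrt((X ** 2).sum(axis=0))
def linear_sum_norm(X):   return X / X.sum(axis=0)
def linear_max_norm(X):   return X / X.max(axis=0)
def linear_max_min_norm(X):
    span = X.max(axis=0) - X.min(axis=0)
    return (X - X.min(axis=0)) / np.where(span > 0, span, 1.0)

# Dummy matrix: rows = candidate passenger aircraft, columns = benefit-type criteria
X = np.array([[410.0, 180, 0.82],
              [396.0, 210, 0.79],
              [375.0, 160, 0.88]])
for f in (vector_norm, linear_sum_norm, linear_max_norm, linear_max_min_norm):
    print(f.__name__)
    print(np.round(f(X), 3))
```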
2513 Financial Information and Collective Bargaining: Conflicting or Complementing?
Authors: Humayun Murshed, Shibly Abdullah
Abstract:
The research conducted in the early seventies apparently assumed the existence of a universal decision model for union negotiators and, furthermore, tended to regard financial information as a ‘neutral’ input into a rational decision-making process. However, research in the eighties began to question the neutrality of financial information as an input in collective bargaining, viewing it rather as a potentially effective means of controlling the labour force. Furthermore, this later research also started challenging the simplistic assumptions, relating particularly to union objectives, which had underpinned the earlier search for universal union decision models. Despite the above developments, there seems to be a dearth of studies in developing countries concerning the use of financial information in collective bargaining. This paper seeks to begin to remedy this deficiency. Utilising a case study approach based on two enterprises, one in the public sector and the other a multinational, the universal decision model is rejected, and it is argued that the decision whether or not to use financial information is a contingent one, such a contingency being largely defined by the context and environment in which both union and management negotiators work. An attempt is also made to identify the factors constraining as well as promoting the use of financial information in collective bargaining, these being regarded as unique to the organisations within which the case studies were conducted.
Keywords: Collective Bargaining, Developing Countries, Disclosures, Financial Information.
2512 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering
Authors: Yunus Doğan, Ahmet Durap
Abstract:
Coastal regions are among the places most commonly used by both the natural balance and the growing population. In coastal engineering, the most valuable data concern wave behavior. The amount of this data becomes very big because of observations that take place over periods of hours, days and months. In this study, some statistical methods, such as wave spectrum analysis methods and standard statistical methods, have been used. The goal of this study is to discover profiles of the different coastal areas by using these statistical methods and thus to obtain an instance-based data set from the big data for analysis by data mining algorithms. In the experimental studies, six sample data sets about wave behavior, obtained from 20-minute observations in Mersin Bay in Turkey, were converted to an instance-based form, while different clustering techniques among data mining algorithms were used to discover similar coastal places. Moreover, this study discusses how this summarization approach can be used in other branches that collect big data, such as medicine.
Keywords: Clustering algorithms, coastal engineering, data mining, data summarization, statistical methods.
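A small sketch of the summarization idea: each observation window is reduced to a few standard statistics (one instance), and the instances are then clustered; the synthetic wave series below stands in for the Mersin Bay measurements, and no spectral analysis is shown.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)

def window_summary(series):
    """Summarize one observation window of wave heights into an instance of simple statistics."""
    return [series.mean(), series.std(), series.max(),
            np.percentile(series, 90)]                    # mean, std, max, 90th-percentile height

# Synthetic stand-in: 30 windows drawn from two regimes (calm vs. rough sea states)
windows = [rng.normal(0.5, 0.1, 1200) for _ in range(15)] + \
          [rng.normal(1.8, 0.5, 1200) for _ in range(15)]
instances = np.array([window_summary(np.abs(w)) for w in windows])

clusters = KMeans(n_clusters=2, n_init=10, random_state=5).fit_predict(instances)
print(clusters)                                           # similar coastal conditions grouped together
```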
2511 Genetic Algorithms for Feature Generation in the Context of Audio Classification
Authors: José A. Menezes, Giordano Cabral, Bruno T. Gomes
Abstract:
Choosing good features is an essential part of machine learning. Recent techniques aim to automate this process. For instance, feature learning intends to learn the transformation of raw data into a representation useful for machine learning tasks. In automatic audio classification tasks, this is interesting since the audio, usually complex information, needs to be transformed into a computationally convenient input to process. Another technique tries to generate features by searching a feature space. Genetic algorithms, for instance, have been used to generate audio features by combining or modifying them. We find this approach particularly interesting and, despite the undeniable advances of feature learning approaches, we wanted to take a step forward in the use of genetic algorithms to find audio features, combining them with more conventional methods, like PCA, and inserting search control mechanisms, such as constraints over a confusion matrix. This work presents the results obtained on particular audio classification problems.
Keywords: Feature generation, feature learning, genetic algorithm, music information retrieval.
2510 Robot Operating System-Based SLAM for a Gazebo-Simulated Turtlebot2 in 2D Indoor Environment with Cartographer Algorithm
Authors: Wilayat Ali, Li Sheng, Waleed Ahmed
Abstract:
The ability of a robot to simultaneously build a map of the environment and localize itself with respect to that environment is the most important element of mobile robots. Many algorithms can be utilized to build up the SLAM process, and SLAM is a developing area in robotics research. The Robot Operating System (ROS) is one of the frameworks which provide multiple algorithm nodes to work with and provide a transmission layer to robots. Many of the algorithms in extensive use are Hector SLAM, Gmapping and Cartographer SLAM. This paper describes a ROS-based simultaneous localization and mapping (SLAM) library, Google Cartographer, which is an open-source algorithm. The algorithm was applied to create a map using laser and pose data from a 2D LiDAR placed on a mobile robot. The robot model uses the Gazebo package and is visualized in Rviz. Our research work's primary goal is to obtain mapping through the Cartographer SLAM algorithm in a static indoor environment. From our research, it is shown that for indoor environments Cartographer is an applicable algorithm to generate 2D maps with a LiDAR placed on a mobile robot, because it uses both odometry and pose estimation. The algorithm has been evaluated, and the constructed maps are compared against the SLAM algorithms presented by the Turtlebot2 in the static indoor environment.
Keywords: SLAM, ROS, navigation, localization and mapping, Gazebo, Rviz, Turtlebot2, SLAM algorithms, 2D indoor environment, Cartographer.
2509 Region-Based Segmentation of Generic Video Scenes Indexing
Authors: Aree A. Mohammed
Abstract:
In this work we develop an object extraction method and propose efficient algorithms for object motion characterization. The set of proposed tools serves as a basis for the development of object-based functionalities for the manipulation of video content. The estimators produced by the different algorithms are compared in terms of quality and performance and tested on real video sequences. The proposed method will be useful for the latest standards for encoding and description of multimedia content – MPEG-4 and MPEG-7.
Keywords: Object extraction, Video indexing, Segmentation, Optical flow, Motion estimators.
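A small OpenCV sketch of the dense optical-flow motion estimation that object-motion characterization typically builds on; the two synthetic frames below (a shifted patch) stand in for real video frames, the flow-magnitude threshold is arbitrary, and the object-extraction step itself is not shown.

```python
import cv2
import numpy as np

# Two synthetic grayscale frames: a bright patch that moves 3 pixels to the right
prev_frame = np.zeros((120, 160), np.uint8)
next_frame = np.zeros((120, 160), np.uint8)
prev_frame[50:70, 60:80] = 255
next_frame[50:70, 63:83] = 255

# Dense optical flow (Farneback): returns a per-pixel (dx, dy) motion field
flow = cv2.calcOpticalFlowFarneback(prev_frame, next_frame, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

moving = np.linalg.norm(flow, axis=2) > 1.0        # crude moving-region mask from flow magnitude
print("estimated mean horizontal motion:",
      flow[moving][:, 0].mean() if moving.any() else 0.0)
```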
2508 Aircraft Supplier Selection Process with Fuzzy Proximity Measure Method Using Multiple Criteria Group Decision Making Analysis
Authors: C. Ardil
Abstract:
Being effective in every organizational activity has become necessary due to the escalating level of competition in all areas of corporate life. In the context of supply chain management, aircraft supplier selection is currently one of the most crucial activities. It is possible to choose the best aircraft supplier and deliver efficiency in terms of cost, quality, delivery time, economic status, and institutionalization if a systematic supplier selection approach is used. In this study, an effective multiple criteria decision-making methodology, the proximity measure method (PMM), is used within a fuzzy environment based on the vague structure of the real working environment. The most appropriate aircraft suppliers are identified and ranked after the proposed multiple criteria decision making technique is applied in a real-life scenario.
Keywords: Aircraft supplier selection, multiple criteria decision making, fuzzy sets, proximity measure method, Minkowski distance family function, Hausdorff distance function, PMM, MCDM
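The two distance families named in the keywords are standard; for crisp vectors a, b in R^m and sets A, B with underlying metric d, they read as below (their fuzzy extensions as used in the paper are not reproduced).

```latex
d_p(a,b) \;=\; \left(\sum_{j=1}^{m} \lvert a_j - b_j \rvert^{\,p}\right)^{1/p}, \quad p \ge 1,
\qquad
d_H(A,B) \;=\; \max\Big\{\, \sup_{a \in A}\, \inf_{b \in B} d(a,b),\;
                             \sup_{b \in B}\, \inf_{a \in A} d(a,b) \Big\}.
```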