Search results for: Genetic algorithm
1925 Intelligent Rescheduling Trains for Air Pollution Management
Authors: Kainat Affrin, P. Reshma, G. Narendra Kumar
Abstract:
Timetable optimization is needed for rescheduling and routing trains in real time. Trains are scheduled in parallel with road transport vehicles travelling to the same destination. As the number of trains is restricted by a single track, customers usually opt for road transport, and air pollution increases as the density of vehicles on the road grows. Using an alternative mode of transport such as rail helps reduce air pollution. This paper aims at attracting passengers to rail transport by proper rescheduling of trains using a hybrid of the stop-skip algorithm and an iterative convex programming algorithm. Bi-directional rescheduling of trains is achieved on a single track with dynamic dual time and varying stops. Introducing more trains attracts customers to use rail transport more frequently, thereby decreasing pollution. The results are simulated using Network Simulator (NS-2).
Keywords: Air pollution, routing protocol, network simulator, rescheduling.
1924 User Pattern Learning Algorithm based MDSS (Medical Decision Support System) Framework under Ubiquitous
Authors: Insung Jung, Gi-Nam Wang
Abstract:
In this paper, we present a user pattern learning algorithm based MDSS (Medical Decision Support System) for a ubiquitous environment. Most research focuses on hardware systems, hospital management, and the overall concept of the ubiquitous environment, even though these are hard to implement. The objective of this paper is to design an MDSS framework that supports medical treatment and prevention for high-risk patients (COPD, heart disease, diabetes). The framework consists of a database, a CAD (computer-aided diagnosis support) system, and a CAP (computer-aided user vital sign prediction) system. It can be applied to develop a user pattern learning algorithm based MDSS for home care and silver town (retirement community) services. In particular, the CAD component provides decision-making capability: it compares the current vital signs with the user's normal-condition pattern data. In addition, the CAP predicts the user's vital signs from the patient's past data. The novelty of the approach lies in combining a neural network method, wireless vital sign acquisition devices, and a personal computer database system. An intelligent agent based MDSS will help elderly people and high-risk patients avoid sudden death and disease, give physicians online access to patients' data, and support the prioritization of medication services (e.g., emergency cases).
Keywords: Neural network, U-healthcare, MDSS, CAP, DSS.
1923 Modified Energy and Link Failure Recovery Routing Algorithm for Wireless Sensor Network
Authors: M. Jayekumar, V. Nagarajan
Abstract:
Wireless sensor networks find roles in environmental monitoring, industrial applications, surveillance, health monitoring, and other supervisory applications. Sensing devices form the basic operational units of the network; they are battery powered with a limited lifetime. A sensor node spends its limited energy on transmission, reception, routing, and sensing. Frequent energy expenditure on these processes degrades the network lifetime. To enhance energy efficiency and network lifetime, we propose a modified energy optimization and post-failure node recovery method, the Energy-Link Failure Recovery Routing (E-LFRR) algorithm. In E-LFRR, two phases, a Monitored Transmission phase and a Replaced Transmission phase, are devised to combat worst-case link failure conditions. In the Monitored Transmission phase, the actuator node monitors and identifies suitable nodes for shortest-path transmission. The Replaced Transmission phase dispatches the energy-draining node from the active link at an early stage and replaces it with a new node that has sufficient energy. Simulation results illustrate that this combined methodology reduces overhead, energy consumption, and delay, and maintains a considerable number of alive nodes, thereby enhancing network performance.
Keywords: Actuator node, energy efficient routing, energy hole, link failure recovery, link utilization, wireless sensor network.
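A minimal sketch of the replacement idea in the Replaced Transmission phase, assuming a relay counts as "draining" below a fixed residual-energy threshold; the topology, energy values and threshold are hypothetical and this is not the authors' E-LFRR implementation:

```python
# Illustrative sketch (not the authors' E-LFRR code): swap an energy-draining
# relay on the active path for the candidate neighbour with the most residual energy.
residual_energy = {"A": 0.9, "B": 0.12, "C": 0.7, "D": 0.85, "SINK": 1.0}
neighbours = {"B": ["C", "D"]}            # hypothetical candidate replacements per relay
active_path = ["A", "B", "SINK"]
ENERGY_THRESHOLD = 0.2                    # assumed cut-off for "draining" nodes

def replace_draining_nodes(path):
    """Return a new path where low-energy relays are replaced early."""
    new_path = []
    for node in path:
        if node not in (path[0], path[-1]) and residual_energy[node] < ENERGY_THRESHOLD:
            candidates = [n for n in neighbours.get(node, [])
                          if residual_energy[n] >= ENERGY_THRESHOLD]
            if candidates:
                node = max(candidates, key=residual_energy.get)  # strongest substitute
        new_path.append(node)
    return new_path

print(replace_draining_nodes(active_path))   # ['A', 'D', 'SINK']
```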
1922 Identification of a PWA Model of a Batch Reactor for Model Predictive Control
Authors: Gorazd Karer, Igor Skrjanc, Borut Zupancic
Abstract:
The complex hybrid and nonlinear nature of many processes met in practice causes problems with both structure modelling and parameter identification; therefore, obtaining a model that is suitable for MPC is often a difficult task. The basic idea of this paper is to present an identification method for a piecewise affine (PWA) model based on a fuzzy clustering algorithm. First we introduce the PWA model. Next, we tackle the identification method: we treat the fuzzy clustering algorithm, deal with the projections of the fuzzy clusters into the input space of the PWA model, and explain the estimation of the PWA model parameters by means of a modified least-squares method. Furthermore, we verify the usability of the proposed identification approach on a hybrid nonlinear batch reactor example. The results suggest that the batch reactor can be efficiently identified and thus formulated as a PWA model, which can eventually be used for model predictive control purposes.
Keywords: Batch reactor, fuzzy clustering, hybrid systems, identification, nonlinear systems, PWA systems.
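To make the general cluster-then-fit idea concrete, here is a hedged sketch that replaces the paper's fuzzy clustering and modified least-squares method with ordinary k-means and plain least squares on synthetic data; it illustrates the PWA identification pattern, not the authors' method:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=(400, 1))
# Two affine pieces joined at x = 0, plus measurement noise.
y = np.where(x[:, 0] < 0, 1.5 * x[:, 0] + 0.2, -0.8 * x[:, 0] + 0.2) \
    + 0.05 * rng.standard_normal(400)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(x)
local_models = []
for c in range(km.n_clusters):
    mask = km.labels_ == c
    X = np.hstack([x[mask], np.ones((mask.sum(), 1))])    # affine regressor [x, 1]
    theta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)   # [slope, offset] of the local model
    local_models.append(theta)

print(np.round(np.vstack(local_models), 3))   # roughly the two affine pieces used above
```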
1921 Adaptive Hierarchical Key Structure Generation for Key Management in Wireless Sensor Networks using A*
Authors: Jin Myoung Kim, Tae Ho Cho
Abstract:
Wireless sensor networks have a wide spectrum of civil and military applications that call for secure communication, such as terrorist tracking and target surveillance in hostile environments. For secure communication in these application areas, we propose a method for generating a hierarchical key structure for efficient group key management. In this paper, we apply the A* algorithm to generate a hierarchical key structure by considering historical data on the ratio of addition and eviction of sensor nodes in the location where the nodes are deployed. The generated key tree structure provides an efficient way of managing the group key in terms of energy consumption when addition and eviction events occur. The A* algorithm tries to minimize the number of messages needed for group key management based on the history data. Experiments with the generated tree show the efficiency of the proposed method.
Keywords: Heuristic search, key management, security, sensor network.
1920 Solving Directional Overcurrent Relay Coordination Problem Using Artificial Bees Colony
Authors: M. H. Hussain, I. Musirin, A. F. Abidin, S. R. A. Rahim
Abstract:
This paper presents the implementation of the Artificial Bees Colony (ABC) algorithm for solving the Directional OverCurrent Relay (DOCR) coordination problem for near-end faults occurring in a fixed network topology. The coordination optimization of DOCRs is formulated as a linear programming (LP) problem. The objective function minimizes the operating time of the associated relays, which depends on the time multiplier setting. Existing techniques are taken for comparison purposes in order to highlight the superiority of the proposed approach. The proposed algorithm has been tested successfully on an 8-bus test system. The simulation results demonstrate that the ABC algorithm, which has been proved to have good search ability, is capable of dealing with constrained optimization problems.
Keywords: Artificial bees colony, directional overcurrent relay coordination problem, relay settings, time multiplier setting.
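Since the operating time with the IEC standard-inverse characteristic is linear in the time multiplier setting (TMS) for a fixed fault current, the LP formulation can be illustrated directly; the relay data, fault-current multiples and CTI below are assumptions, and this sketch solves the LP with SciPy rather than with the ABC algorithm:

```python
import numpy as np
from scipy.optimize import linprog

M_primary = np.array([8.0, 6.0])            # fault-current multiples seen by primary relays R1, R2
M_backup = 4.0                              # multiple seen by the backup relay R3 for both faults
k = lambda M: 0.14 / (M ** 0.02 - 1.0)      # IEC standard-inverse operating time per unit TMS
CTI = 0.3                                   # assumed coordination time interval (s)

# Variables: TMS of [R1, R2, R3]; minimize the sum of relay operating times.
c = np.array([k(M_primary[0]), k(M_primary[1]), k(M_backup)])

# Coordination: k(Mb)*TMS3 - k(Mp_i)*TMS_i >= CTI  ->  k(Mp_i)*TMS_i - k(Mb)*TMS3 <= -CTI
A_ub = np.array([[k(M_primary[0]), 0.0, -k(M_backup)],
                 [0.0, k(M_primary[1]), -k(M_backup)]])
b_ub = np.array([-CTI, -CTI])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.05, 1.1)] * 3, method="highs")
print(np.round(res.x, 3))                   # TMS settings satisfying the CTI constraints
```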
1919 A Comparative Study of Page Ranking Algorithms for Information Retrieval
Authors: Ashutosh Kumar Singh, Ravi Kumar P
Abstract:
This paper gives an introduction to Web mining, then describes Web structure mining in detail, and explores the data structures used by the Web. It also reviews different PageRank-based algorithms and compares those used for information retrieval. The basics of Web mining and the Web mining categories are explained. Different PageRank-based algorithms, such as PageRank (PR), Weighted PageRank (WPR), HITS (Hyperlink-Induced Topic Search), DistanceRank, and DirichletRank, are discussed and compared. PageRank values are calculated for the PageRank and Weighted PageRank algorithms for a given hyperlink structure. A simulation program is developed for the PageRank algorithm, because PageRank is the only one of these ranking algorithms implemented in the Google search engine. The outputs are shown in table and chart format.
Keywords: Web mining, web structure, web graph, link analysis, PageRank, Weighted PageRank, HITS, DistanceRank, DirichletRank.
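A minimal power-iteration sketch of the classic PageRank computation on a made-up four-page hyperlink graph (damping factor 0.85 assumed; this is not the paper's simulation program):

```python
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}   # page -> pages it links to (hypothetical)
n, d = 4, 0.85                                # number of pages, damping factor

# Column-stochastic transition matrix: M[j, i] = 1/outdeg(i) if page i links to page j.
M = np.zeros((n, n))
for i, outs in links.items():
    for j in outs:
        M[j, i] = 1.0 / len(outs)

rank = np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1.0 - d) / n + d * M @ rank       # classic PageRank iteration

print(np.round(rank / rank.sum(), 3))         # normalised PageRank scores
```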
1918 Development of a Methodology for Processing of Drilling Operations
Authors: Majid Tolouei-Rad, Ankit Shah
Abstract:
Drilling is the most common machining operation, and it accounts for the highest machining cost in many manufacturing activities, including automotive engine production. The outcome of this operation depends upon many factors, including the use of proper cutting tool geometry, cutting tool material, the type of coating used to improve hardness and wear resistance, and the cutting parameters. With the availability of a large array of tool geometries, materials, and coatings, it has become a challenging task to select the tool and cutting parameters that result in the lowest machining cost or highest profit rate. This paper describes an algorithm developed to help achieve good performance in drilling operations by automatically determining proper cutting tools and cutting parameters. It also helps determine machining sequences resulting in minimum tool changes, which eventually reduces machining time and cost where multiple tools are used.
Keywords: Cutting tool, drilling, machining, algorithm.
1917 Global Chaos Synchronization of Identical and Nonidentical Chaotic Systems Using Only Two Nonlinear Controllers
Authors: Azizan Bin Saaban, Adyda Binti Ibrahim, Mohammad Shehzad, Israr Ahmad
Abstract:
In chaos synchronization, the main goal is to design controllers that synchronize the states of the master and slave systems globally and asymptotically. This paper investigates the synchronization problem of two identical Chen systems, two identical Tigan chaotic systems, and non-identical Chen and Tigan chaotic systems using the nonlinear active control algorithm. Based on Lyapunov stability theory and the nonlinear active control algorithm, it is shown that the proposed schemes achieve excellent transient performance using only two nonlinear controllers, and it is demonstrated analytically as well as graphically that the synchronization is globally asymptotically stable.
Keywords: Nonlinear Active Control, Chen and Tigan Chaotic systems, Lyapunov Stability theory, Synchronization.
1916 Grouping-Based Job Scheduling Model in Grid Computing
Authors: Vishnu Kant Soni, Raksha Sharma, Manoj Kumar Mishra
Abstract:
Grid computing is a high performance computing environment for solving large-scale computational applications. Grid computing involves resource management, job scheduling, security, information management, and so on. Job scheduling is a fundamental and important issue in achieving high performance in grid computing systems; however, designing and implementing an efficient scheduler is a big challenge. In grid computing, there is a need to further improve job scheduling by grouping lightweight (small) jobs into coarse-grained jobs, which reduces communication time and processing time and enhances resource utilization. The grouping strategy considers the processing power, memory size, and bandwidth requirements of each job in order to reflect a real grid system. The experimental results demonstrate that the proposed scheduling algorithm efficiently reduces the processing time of jobs in comparison to others.
Keywords: Grid computing, job grouping, job scheduling.
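A hedged sketch of the grouping step, packing small jobs into a coarse-grained group until an assumed resource capacity would be exceeded; the job sizes, the MIPS-based capacity and the greedy packing rule are illustrative, not the paper's exact strategy (which also considers memory and bandwidth):

```python
job_lengths_mips = [120, 300, 80, 450, 60, 220, 150, 90]   # hypothetical job sizes
resource_capacity = 600                                     # assumed granularity target per group

def group_jobs(jobs, capacity):
    """Greedily pack jobs into groups whose total size stays within capacity."""
    groups, current, load = [], [], 0
    for job in jobs:
        if load + job > capacity and current:   # close the group before overflowing
            groups.append(current)
            current, load = [], 0
        current.append(job)
        load += job
    if current:
        groups.append(current)
    return groups

for g in group_jobs(job_lengths_mips, resource_capacity):
    print(g, "-> total", sum(g))
```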
1915 Computing the Loop Bound in Iterative Data Flow Graphs Using Natural Token Flow
Authors: Ali Shatnawi
Abstract:
Signal processing applications that are iterative in nature are best represented by data flow graphs (DFGs). In these applications, the maximum sampling frequency depends on the topology of the DFG, the cyclic dependencies in particular. Determining the iteration bound, which is the reciprocal of the maximum sampling frequency, is critical in the process of hardware implementation of signal processing applications. In this paper, a novel technique to compute the iteration bound is proposed. This technique differs from all previously proposed techniques in the sense that it is based on the natural flow of tokens into the DFG rather than on the topology of the graph. The proposed algorithm has lower run-time complexity than all known algorithms. The performance of the proposed algorithm is illustrated through an analytical study of the time complexity, as well as through simulation of some benchmark problems.
Keywords: Data flow graph, iteration period bound, rate-optimal scheduling, recursive DSP algorithms.
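For context, a small sketch of the classical cycle-based definition of the iteration bound (the maximum over all cycles of total node computation time divided by the number of delay elements in the cycle); the example DFG is hypothetical, and the cycles are enumerated explicitly with networkx rather than by the paper's token-flow technique:

```python
import networkx as nx

G = nx.DiGraph()
comp_time = {"A": 2, "B": 4, "C": 5}          # node computation times (assumed)
G.add_edge("A", "B", delays=1)
G.add_edge("B", "A", delays=1)
G.add_edge("B", "C", delays=0)
G.add_edge("C", "B", delays=1)

def iteration_bound(g):
    best = 0.0
    for cycle in nx.simple_cycles(g):
        t = sum(comp_time[v] for v in cycle)
        w = sum(g[u][v]["delays"] for u, v in zip(cycle, cycle[1:] + cycle[:1]))
        best = max(best, t / w)                # every feasible DFG cycle has w >= 1
    return best

print(iteration_bound(G))                      # max((2+4)/2, (4+5)/1) = 9.0
```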
1914 Automatic LV Segmentation with K-means Clustering and Graph Searching on Cardiac MRI
Authors: Hae-Yeoun Lee
Abstract:
In routine clinical practice, cardiac function is quantified by calculating blood volume and ejection fraction. However, this has typically been done by manual contouring, which is costly and varies between observers. In this paper, an automatic left ventricle segmentation algorithm for cardiac magnetic resonance images (MRI) is presented. Using knowledge of cardiac MRI, a K-means clustering technique is applied to segment the blood region in a coil-sensitivity corrected image. Then, a graph searching technique is used to correct segmentation errors caused by coil distortion and noise. Finally, blood volume and ejection fraction are calculated. Using cardiac MRI from 15 subjects, the presented algorithm is tested and compared with manual contouring by experts, showing outstanding performance.
Keywords: Cardiac MRI, Graph searching, Left ventricle segmentation, K-means clustering.
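A minimal k-means sketch on pixel intensities, standing in for the blood-pool clustering step; the synthetic intensity values replace real coil-sensitivity corrected MRI data, and the graph-searching correction is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
image = np.concatenate([rng.normal(40, 5, 600),      # darker myocardium-like intensities
                        rng.normal(180, 10, 400)])   # bright blood-pool intensities

def kmeans_1d(values, k=2, iters=20):
    """Plain k-means on scalar intensities, initialised at the intensity extremes."""
    centers = np.linspace(values.min(), values.max(), k)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        centers = np.array([values[labels == c].mean() for c in range(k)])
    return labels, centers

labels, centers = kmeans_1d(image)
blood_cluster = int(np.argmax(centers))              # brightest cluster ~ blood region
print(np.round(centers, 1), "blood pixels:", int((labels == blood_cluster).sum()))
```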
1913 Virtual Machines Cooperation for Impatient Jobs under Cloud Paradigm
Authors: Nawfal A. Mehdi, Ali Mamat, Hamidah Ibrahim, Shamala K. Syrmabn
Abstract:
The increasing demand for IT resources drives enterprises to the cloud as a cheap and scalable solution. Cloud computing delivers on its promises by using the virtual machine as the basic unit of computation. However, a virtual machine's predefined settings might not be enough to handle a job's QoS requirements. This paper addresses the problem of mapping jobs that have critical start deadlines to virtual machines with predefined specifications. These virtual machines are hosted by physical machines and share a fixed amount of bandwidth. The paper proposes an algorithm that uses the bandwidth of idle virtual machines to increase the quota of the virtual machines nominated as executors of urgent jobs. An algorithm and an empirical study are given to evaluate the impact of the proposed model on impatient jobs. The results show the importance of dynamic bandwidth allocation in a virtualized environment and its effect on the throughput metric.
Keywords: Insufficient bandwidth, virtual machine, cloud provider, impatient jobs.
1912 Forecasting Foreign Direct Investment with Modified Diffusion Model
Authors: Bi-Huei Tsai
Abstract:
Prior research has not effectively investigated how the profitability of Chinese branches affects FDI in China [1, 2], so this study for the first time incorporates realistic earnings information to systematically investigate the effects of innovation, imitation, and profit factors on FDI diffusion from Taiwan to China. Our nonlinear least squares (NLS) model, which incorporates earnings factors, forms a nonlinear ordinary differential equation (ODE) in numerical simulation programs. The model parameters are obtained through a genetic algorithm (GA) technique and then optimized with the collected data for the best accuracy. In particular, Taiwanese regulatory FDI restrictions are also considered in our modified model to meet realistic conditions. To validate the model's effectiveness, this investigation compares the prediction accuracy of the modified model with the conventional diffusion model, which does not take the profitability factors into account. The results clearly demonstrate that the internal influence is positive, as early FDI adopters' consistent praise of FDI attracts potential firms to make the same move: the former establish a behavioural model for the latter to imitate in their foreign investment decisions. In particular, the results of the modified diffusion models show that the earnings of Chinese branches are positively related to the internal influence. In general, the imitating tendency of potential firms is substantially hindered by losses in the Chinese branches, and these firms invest less in China. The extension of FDI inflows depends on the earnings of Chinese branches, and companies adjust their FDI strategies based on the returns. Since this research has shown that earnings are an influential factor in FDI dynamics, our revised model performs distinctly better in prediction ability than the conventional diffusion model.
Keywords: Diffusion model, genetic algorithms, nonlinear least squares (NLS) model, prediction error.
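As a point of reference, a hedged sketch of fitting a plain Bass-type diffusion curve by nonlinear least squares on synthetic data; the paper's modified model adds earnings terms and estimates parameters with a genetic algorithm, neither of which is reproduced here:

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, p, q, m):
    """Closed-form cumulative adopters of the standard Bass diffusion model."""
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

t = np.arange(1, 21, dtype=float)
true_curve = bass_cumulative(t, 0.03, 0.4, 100.0)              # assumed true parameters
observed = true_curve + np.random.default_rng(2).normal(0, 1.0, t.size)

params, _ = curve_fit(bass_cumulative, t, observed, p0=[0.01, 0.3, 80.0], maxfev=10000)
print(np.round(params, 3))     # recovered (p, q, m): innovation, imitation, market potential
```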
1911 Refitting Equations for Peak Ground Acceleration in Light of the PF-L Database
Authors: M. Breška, I. Peruš, V. Stankovski
Abstract:
The number of Ground Motion Prediction Equations (GMPEs) used for predicting peak ground acceleration (PGA), and the number of earthquake recordings used for fitting these equations, have increased in the past decades. The current PF-L database contains 3550 recordings. Since GMPEs frequently model the peak ground acceleration, the goal of the present study was to refit a selection of 44 of the existing equation models for PGA in light of the latest data. The Levenberg-Marquardt algorithm was used for fitting the coefficients of the equations, and the results are evaluated both quantitatively, by presenting the root mean squared error (RMSE), and qualitatively, by drawing graphs of the five best-fitted equations. The RMSE was found to be as low as 0.08 for the best equation models. The newly estimated coefficients vary from the values published in the original works.
Keywords: Ground Motion Prediction Equations, Levenberg-Marquardt algorithm, refitting, PF-L database.
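An illustrative refit of a simple GMPE-style functional form with SciPy's Levenberg-Marquardt solver, reporting the RMSE; the functional form, the synthetic magnitudes, distances and noise level are assumptions, and none of the 44 actual equation models or the PF-L recordings are used:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(3)
magnitude = rng.uniform(4.0, 7.5, 300)
distance = rng.uniform(5.0, 200.0, 300)
true_coeffs = np.array([-1.0, 0.6, -1.3])                 # assumed "published" coefficients
log_pga = true_coeffs[0] + true_coeffs[1] * magnitude \
          + true_coeffs[2] * np.log10(distance) + rng.normal(0, 0.08, 300)

def residuals(c):
    model = c[0] + c[1] * magnitude + c[2] * np.log10(distance)
    return model - log_pga

fit = least_squares(residuals, x0=[0.0, 0.5, -1.0], method="lm")   # Levenberg-Marquardt
rmse = np.sqrt(np.mean(fit.fun ** 2))
print(np.round(fit.x, 3), "RMSE:", round(rmse, 3))
```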
1910 Studies on Properties of Knowledge Dependency and Reduction Algorithm in Tolerance Rough Set Model
Authors: Chen Wu, Lijuan Wang
Abstract:
The relations between tolerance classes, indispensable attributes, and knowledge dependency in the rough set model with a tolerance relation are explored. After giving definitions and concepts of knowledge dependency and the knowledge dependency degree for incomplete information systems in the tolerance rough set model, distinguishing whether or not the decision attribute contains missing attribute values, it is proved that complete knowledge dependency preserves reflexivity, transitivity, augmentation, the decomposition law, and the merge law. Knowledge dependency degrees (as opposed to complete knowledge dependency degrees) satisfy only some of these laws under the transitivity, augmentation, and decomposition operations. An algorithm to solve attribute reduction in an incomplete decision table is designed, and its correctness is checked by an example.
Keywords: Incomplete information system, rough set, tolerance relation, knowledge dependence, attribute reduction.
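To anchor the notion of dependency, a small sketch of the classical dependency degree gamma(C, D) = |POS_C(D)| / |U| on a toy decision table using plain indiscernibility classes; the paper's tolerance relation and handling of missing values are not reproduced, and the table is hypothetical:

```python
table = [                      # rows: condition attributes (a, b) -> decision d
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 0, "b": 1, "d": "no"},
    {"a": 0, "b": 1, "d": "yes"},   # conflicting row: its class falls outside the positive region
    {"a": 1, "b": 1, "d": "no"},
]

def dependency_degree(rows, cond_attrs, dec_attr):
    """gamma(C, D): fraction of objects whose condition class decides d uniquely."""
    classes = {}
    for r in rows:
        classes.setdefault(tuple(r[a] for a in cond_attrs), []).append(r)
    positive = sum(len(rs) for rs in classes.values()
                   if len({r[dec_attr] for r in rs}) == 1)   # consistent classes only
    return positive / len(rows)

print(dependency_degree(table, ["a", "b"], "d"))   # 3/5 = 0.6
```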
1909 Traceable Watermarking System using SoC for Digital Cinema Delivery
Authors: Sadi Vural, Hiromi Tomii, Hironori Yamauchi
Abstract:
As digital technology develops, digital cinema is becoming more widespread. However, content copying and attacks against digital cinema have become a serious problem. To solve this security problem, we propose "additional watermarking" for a digital cinema delivery system. With the proposed additional watermarking method, we protect content copyrights at the encoder and user-side information at the decoder, which realizes traceability of the watermark embedded at the encoder. The watermark is embedded into randomly selected frames using a hash function; the embedding positions are distributed by the hash function so that third parties cannot break the watermarking algorithm. Finally, our experimental results show that the proposed method is much better than conventional watermarking techniques in terms of robustness, image quality, and its simple but unbreakable algorithm.
Keywords: Decoder, digital content, JPEG2000 frame, System-on-Chip, additional watermark.
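A minimal sketch of hash-driven frame selection for embedding, assuming a secret per-content key and a fixed frame count; the key, the counts and the selection rule are illustrative, and the SoC embedding itself is not shown:

```python
import hashlib

secret_key = b"content-id-0001"        # hypothetical per-content key
total_frames = 24 * 60 * 2             # e.g. two minutes of 24 fps footage
marks_to_embed = 8

def select_frames(key, n_frames, n_marks):
    """Derive distinct embedding positions from SHA-256 of key||counter."""
    chosen = []
    counter = 0
    while len(chosen) < n_marks:
        digest = hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        frame = int.from_bytes(digest[:4], "big") % n_frames
        if frame not in chosen:        # keep positions distinct
            chosen.append(frame)
        counter += 1
    return sorted(chosen)

print(select_frames(secret_key, total_frames, marks_to_embed))
```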
1908 Optimal and Critical Path Analysis of State Transportation Network Using Neo4J
Authors: Pallavi Bhogaram, Xiaolong Wu, Min He, Onyedikachi Okenwa
Abstract:
A transportation network is a realization of a spatial network, describing a structure which permits either vehicular movement or the flow of some commodity. Examples include road networks, railways, air routes, pipelines, and many more. The transportation network plays a vital role in maintaining the vigor of the nation's economy. Hence, ensuring that the network stays resilient at all times, especially in the face of challenges such as heavy traffic loads and large-scale natural disasters, is of utmost importance. In this paper, we used the Neo4j application to develop the graph. Neo4j is the world's leading open-source NoSQL native graph database, which implements an ACID-compliant transactional backend for applications. The Southern California network model is developed using the Neo4j application, and the most critical and optimal nodes and paths in the network are obtained using centrality algorithms. The edge betweenness centrality algorithm calculates the critical or optimal paths using Yen's k-shortest paths algorithm, and the node betweenness centrality algorithm calculates the amount of influence a node has over the network. The preliminary study results confirm that the Neo4j application can be a suitable tool for studying the important nodes and the critical paths of a heavily congested metropolitan area.
Keywords: Transportation network, critical path, connectivity reliability, network model, Neo4J application, optimal path, edge betweenness centrality index, node betweenness centrality index, Yen's k-shortest paths.
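A conceptual analogue of the analysis in networkx rather than Neo4j (the authors' Cypher/GDS queries are not reproduced): node and edge betweenness centrality plus Yen-style k-shortest paths on a toy road graph with made-up travel times:

```python
import itertools
import networkx as nx

G = nx.Graph()                                     # hypothetical road segments with travel times
G.add_weighted_edges_from([("LA", "Anaheim", 40), ("LA", "Pasadena", 25),
                           ("Pasadena", "Riverside", 55), ("Anaheim", "Riverside", 45),
                           ("Riverside", "SanDiego", 95), ("Anaheim", "SanDiego", 110)])

node_bc = nx.betweenness_centrality(G, weight="weight")
edge_bc = nx.edge_betweenness_centrality(G, weight="weight")
print(max(node_bc, key=node_bc.get), "is the most influential node")
print(max(edge_bc, key=edge_bc.get), "is the most used edge")

# shortest_simple_paths implements Yen's algorithm for loopless k-shortest paths.
k_paths = itertools.islice(nx.shortest_simple_paths(G, "LA", "SanDiego", weight="weight"), 3)
for path in k_paths:
    print(path, "cost", nx.path_weight(G, path, weight="weight"))
```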
1907 Two DEA Based Ant Algorithms for CMS Problems
Authors: Hossein Ali Akbarpour, Fatemeh Dadkhah
Abstract:
This paper considers a multi-criteria cell formation problem in a Cellular Manufacturing System (CMS). The two proposed objective functions are to simultaneously minimize the number of voids and the number of exceptional elements in the cells. According to the literature this problem is NP-hard, and therefore the optimal solution cannot be found by an exact method. In this paper we developed two ant algorithms, Ant Colony Optimization (ACO) and Max-Min Ant System (MMAS), based on Data Envelopment Analysis (DEA). Both try to find efficient solutions based on the efficiency concept in DEA. Each artificial ant is considered as a Decision Making Unit (DMU). For each DMU we considered two inputs, the values of the objective functions, and one output, with the value of one for all of them. In order to evaluate the performance of the proposed methods we provided an experimental design with empirical problems of three different sizes: small, medium, and large. We defined three different criteria that show which algorithm has the best performance.
Keywords: Ant algorithm, cellular manufacturing system, data envelopment analysis, efficiency.
1906 A Hybrid Ontology Based Approach for Ranking Documents
Authors: Sarah Motiee, Azadeh Nematzadeh, Mehrnoush Shamsfard
Abstract:
The growing volume of information on the internet creates an increasing need to develop new (semi-)automatic methods for retrieving documents and ranking them according to their relevance to the user query. In this paper, after a brief review of ranking models, a new ontology based approach for ranking HTML documents is proposed and evaluated in various circumstances. Our approach is a combination of conceptual, statistical, and linguistic methods; this combination preserves the precision of the ranking without losing speed. The approach exploits natural language processing techniques to extract phrases from the documents and the query and to stem words. Then an ontology based conceptual method is used to annotate documents and expand the query. To expand a query, the spread activation algorithm is improved so that the expansion can be done flexibly and in various respects. The annotated documents and the expanded query are processed to compute the relevance degree using statistical methods. The outstanding features of our approach are (1) combining conceptual, statistical, and linguistic features of documents, (2) expanding the query with its related concepts before comparing it to documents, (3) extracting and using both words and phrases to compute the relevance degree, (4) improving the spread activation algorithm to perform the expansion based on a weighted combination of different conceptual relationships, and (5) allowing variable document vector dimensions. A ranking system called ORank has been developed to implement and test the proposed model. The test results are included at the end of the paper.
Keywords: Document ranking, ontology, spread activation algorithm, annotation.
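A basic spread-activation pass over a small weighted concept graph, sketching how query expansion can propagate activation from query concepts to related ones; the ontology fragment, weights, decay and threshold are all assumptions, and the paper's improved weighted-relationship variant is not reproduced:

```python
concept_graph = {                        # concept -> [(related concept, relation weight), ...]
    "car": [("vehicle", 0.9), ("engine", 0.7)],
    "vehicle": [("transport", 0.8)],
    "engine": [("fuel", 0.6)],
    "transport": [], "fuel": [],
}

def spread_activation(seeds, decay=0.5, threshold=0.1):
    """Propagate activation from query concepts until it falls below the threshold."""
    activation = dict(seeds)             # initial activation from query concepts
    frontier = list(seeds.items())
    while frontier:
        concept, level = frontier.pop()
        for neighbour, weight in concept_graph[concept]:
            new_level = level * weight * decay
            if new_level > activation.get(neighbour, 0.0) and new_level > threshold:
                activation[neighbour] = new_level
                frontier.append((neighbour, new_level))
    return activation

print(spread_activation({"car": 1.0}))   # expanded query concepts with activation levels
```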
1905 A New Heuristic Statistical Methodology for Optimizing Queuing Networks Using Discrete Event Simulation
Authors: Mohamad Mahdavi
Abstract:
Most real queuing systems include special properties and constraints which cannot be analyzed directly using the results of solved classical queuing models. The lack of Markov-chain features, non-exponential patterns, and service constraints are among these conditions. This paper presents an applied general algorithm for analyzing and optimizing queuing systems. The stages of the algorithm are described through a real case study, which involves an almost completely non-Markov system with a limited number of customers and limited capacities, as well as many of the common exceptions of real queuing networks. Simulation is used to optimize this system. The stages introduced in the article include preliminary modeling, determining the kind of queuing system, defining indices, statistical analysis and goodness-of-fit testing, model validation, and optimization of the system via simulation.
Keywords: Estimation, queuing system, simulation model, probability distribution, non-Markov chain.
1904 Using Cooperation Approaches at Different Levels of Artificial Bee Colony Method
Authors: Vahid Zeighami, Mohsen Ghasemi, Reza Akbari
Abstract:
In this work, a Multi-Level Artificial Bee Colony algorithm (MLABC) for optimizing numerical test functions is presented. In MLABC, two species are used. The first species employs n colonies, each of which optimizes the complete solution vector. The cooperation between these colonies is carried out by exchanging information through a leader colony, which contains a set of elite bees. The second species uses a cooperative approach in which the complete solution vector is divided into k sub-vectors, and each of these sub-vectors is optimized by a colony. The cooperation between these colonies is carried out by compiling the sub-vectors into the complete solution vector. Finally, the cooperation between the two species is obtained by exchanging information. The proposed algorithm is tested on a set of well-known test functions. The results show that the MLABC algorithm provides efficiency and robustness for solving numerical functions.
Keywords: Artificial bee colony, cooperative artificial bee colony, multilevel cooperation.
1903 A Framework for Data Mining Based Multi-Agent: An Application to Spatial Data
Authors: H. Baazaoui Zghal, S. Faiz, H. Ben Ghezala
Abstract:
Data mining is an extraordinarily demanding field concerned with the extraction of implicit knowledge and relationships that are not explicitly stored in databases. A wide variety of data mining methods have been introduced (classification, characterization, generalization, ...), and each of these methods includes more than one algorithm. A data mining system implies different user categories, which means that the user's behavior must be a component of the system. The problem at this level is to know which algorithm of which method to employ for an exploratory end, which one for a decisional end, and how they can collaborate and communicate. The agent paradigm presents a new way of designing and realizing data mining systems. The purpose is to combine different data mining algorithms to prepare elements for decision-makers, benefiting from the possibilities offered by multi-agent systems. In this paper the agent framework for data mining is introduced, and its overall architecture and functionality are presented. Validation is performed on spatial data, and the principal results are presented.
Keywords: Databases, data mining, multi-agent, spatial datamart.
1902 Robot Vision Application based on Complex 3D Pose Computation
Authors: F. Rotaru, S. Bejinariu, C. D. Niţâ, R. Luca, I. Pâvâloi, C. Lazâr
Abstract:
The paper presents a technique suitable for robot vision applications where it is not possible to establish the object position from one view. Usually, one-view pose calculation methods are based on the correspondence between image features established at a training step and exactly the same image features extracted at the execution step for a different object pose. When such a correspondence is not feasible because of the lack of specific features, a new method is proposed. In the first step, the method computes the 3D positions of feature points from two views. Subsequently, using a registration algorithm, the set of 3D feature points extracted at the execution phase is aligned with the set of 3D feature points extracted at the training phase. The result is a Euclidean transform which has to be used by the robot head for reorientation at the execution step.
Keywords: Features correspondence, registration algorithm, robot vision, triangulation method.
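A sketch of one common way to do the rigid alignment step, the SVD-based Kabsch method on synthetic 3D point sets; the paper does not specify this particular registration algorithm, so treat it as an illustrative stand-in:

```python
import numpy as np

rng = np.random.default_rng(4)
train_pts = rng.uniform(-1, 1, (6, 3))                   # 3D feature points at training
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
t_true = np.array([0.2, -0.1, 0.5])
exec_pts = train_pts @ R_true.T + t_true                 # same points at execution time

def kabsch(P, Q):
    """Return rotation R and translation t such that Q ~= P @ R.T + t."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = Q.mean(axis=0) - P.mean(axis=0) @ R.T
    return R, t

R_est, t_est = kabsch(train_pts, exec_pts)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))   # True True
```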
1901 Performance Evaluation of Wavelet Based Coders on Brain MRI Volumetric Medical Datasets for Storage and Wireless Transmission
Authors: D. Dhouib, A. Naït-Ali, C. Olivier, M. S. Naceur
Abstract:
In this paper, we evaluate the performance of wavelet based coding algorithms, namely 3D QT-L, 3D SPIHT, and JPEG2K. In the first step we carry out an objective comparison between the three coders. For this purpose, eight MRI head scan test sets of 256 x 256 x 124 voxels have been used. The results show superior performance of the 3D SPIHT algorithm, whereas 3D QT-L outperforms JPEG2K. The second step consists of evaluating the robustness of the 3D SPIHT and JPEG2K coding algorithms over wireless transmission. Compressed dataset images are transmitted over an AWGN or a Rayleigh wireless channel. The results show the superiority of JPEG2K under these two channel models; in fact, JPEG2K is found to be more robust to coding errors. We therefore conclude that error-correcting codes are necessary in order to protect the transmitted medical information.
Keywords: Image coding, medical imaging, wavelet based coder, wireless transmission.
1900 Upgraded Rough Clustering and Outlier Detection Method on Yeast Dataset by Entropy Rough K-Means Method
Authors: P. Ashok, G. M. Kadhar Nawaz
Abstract:
Rough set theory is used to handle uncertainty and incomplete information through two exact sets, the lower approximation and the upper approximation. In this paper, rough clustering algorithms are improved by adopting similarity, dissimilarity-similarity, and entropy based initial centroid selection: three clustering algorithms, namely Entropy based Rough K-Means (ERKM), Similarity based Rough K-Means (SRKM), and Dissimilarity-Similarity based Rough K-Means (DSRKM), are developed and executed on the yeast dataset. The rough clustering algorithms are validated by cluster validity indices, namely the Rand and Adjusted Rand indices. Experimental results show that the ERKM clustering algorithm performs effectively and delivers better results than the other clustering methods. Outlier detection is an important task in data mining, as outliers are very different from the rest of the objects in the clusters. The Entropy based Rough Outlier Factor (EROF) method is found to detect outliers effectively for the yeast dataset. In the rough K-Means method, tuning the epsilon (ε) value from 0.8 to 1.08 detects outliers in the boundary region, and the RKM algorithm delivers better results when the value of epsilon (ε) is chosen in this range. Experimental results show that the EROF method performed very well with the clustering algorithms and is suitable for detecting outliers effectively for all datasets. Further, the experimental readings show that the ERKM clustering method outperformed the other methods.
Keywords: Clustering, Entropy, Outlier, Rough K-Means, validity index.
1899 Blind Impulse Response Identification of Frequency Radio Channels: Application to Bran A Channel
Authors: S. Safi, M. Frikel, M. M'Saad, A. Zeroual
Abstract:
This paper describes a blind algorithm for estimating a time-varying and frequency-selective fading channel. In order to identify the impulse response of these channels blindly, we have used Higher Order Statistics (HOS) to build our algorithm. We selected two theoretical frequency-selective channels, the Proakis 'B' channel and the Macchi channel, and one practical frequency-selective fading channel called Broadband Radio Access Network (BRAN A). The simulation results, in a noisy environment and for different channel input data, demonstrate that the proposed method can estimate the phase and magnitude of these channels blindly and without any information about the input, except that the input excitation is i.i.d. (independent and identically distributed) and non-Gaussian.
Keywords: Frequency response, system identification, higher order statistics, communication channels, phase estimation.
1898 A Flexible Flowshop Scheduling Problem with Machine Eligibility Constraint and Two Criteria Objective Function
Authors: Bita Tadayon, Nasser Salmasi
Abstract:
This research deals with a flexible flowshop scheduling problem in which jobs arrive and are delivered in groups but are processed individually. Due to the special characteristics of each job, only a subset of the machines in each stage is eligible to process that job. The objective function involves minimizing the sum of the completion times of the groups on one hand, and minimizing the sum of the differences between the completion time of each job and the delivery time of the group containing that job (the waiting period) on the other hand. The problem can be stated as FFc / rj, Mj / irreg, which has many applications in production and service industries. A mathematical model is proposed, the problem is proved to be NP-complete, and an effective heuristic method is presented to schedule the jobs efficiently. This algorithm can then be used within the body of any metaheuristic algorithm for solving the problem.
Keywords: Flexible flowshop scheduling, group processing, machine eligibility constraint, mathematical modeling.
1897 Performance Analysis of MUSIC, Root-MUSIC and ESPRIT DOA Estimation Algorithm
Authors: N. P. Waweru, D. B. O. Konditi, P. K. Langat
Abstract:
Direction of Arrival (DOA) estimation refers to defining a mathematical function, called a pseudospectrum, that gives an indication of the angle at which a signal impinges on the antenna array. DOA estimation is an efficient means of improving the quality of service in a communication system by focusing reception and transmission only in the estimated direction, thereby increasing fidelity, with a provision to suppress interferers. This improvement depends largely on the performance of the algorithm employed in the estimation. Many DOA algorithms exist, amongst which are MUSIC, Root-MUSIC, and ESPRIT. In this paper, the performance of these three algorithms is analyzed in terms of complexity, accuracy (as assessed and characterized by the CRLB), and memory requirements, in various environments and for various array sizes. It is found that the three algorithms provide high resolution and that their performance depends on the operating environment and the array size.
Keywords: Direction of Arrival, Autocorrelation matrix, Eigenvalue decomposition, MUSIC, ESPRIT, CRLB.
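A minimal MUSIC pseudospectrum sketch for a uniform linear array on synthetic snapshots; the array geometry, source angles, SNR, snapshot count and the crude peak picking are assumptions, and Root-MUSIC and ESPRIT are not shown:

```python
import numpy as np

rng = np.random.default_rng(5)
M, d, wavelength = 8, 0.5, 1.0              # 8 sensors, half-wavelength spacing (assumed)
true_angles = np.deg2rad([-20.0, 25.0])     # two impinging sources
snapshots = 200

def steering(theta):
    k = 2 * np.pi / wavelength
    return np.exp(1j * k * d * np.arange(M)[:, None] * np.sin(theta))

A = steering(true_angles)                                   # M x 2 steering matrix
S = rng.standard_normal((2, snapshots)) + 1j * rng.standard_normal((2, snapshots))
X = A @ S + 0.1 * (rng.standard_normal((M, snapshots)) + 1j * rng.standard_normal((M, snapshots)))

R = X @ X.conj().T / snapshots                              # sample autocorrelation matrix
eigvals, eigvecs = np.linalg.eigh(R)
En = eigvecs[:, :M - 2]                                     # noise subspace (smallest eigenvalues)

scan = np.deg2rad(np.arange(-90, 90.5, 0.5))
a = steering(scan)
proj = En.conj().T @ a                                      # projection onto the noise subspace
pseudo = 1.0 / np.sum(np.abs(proj) ** 2, axis=0)            # MUSIC pseudospectrum

peaks = [i for i in range(1, len(scan) - 1)
         if pseudo[i] > pseudo[i - 1] and pseudo[i] > pseudo[i + 1]]
peaks = sorted(peaks, key=lambda i: pseudo[i], reverse=True)[:2]
print(np.round(np.rad2deg(np.sort(scan[peaks])), 1))        # close to the true angles -20 and 25
```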
1896 Performance Evaluation of Complex Valued Neural Networks Using Various Error Functions
Authors: Anita S. Gangal, P. K. Kalra, D. S. Chauhan
Abstract:
The backpropagation algorithm generally employs the quadratic error function; in fact, most problems that involve minimization employ the quadratic error function. With alternative error functions, the performance of the optimization scheme can be improved: the new error functions help suppress the ill effects of outliers and have shown good robustness to noise. In this paper we evaluate and compare the relative performance of complex valued neural networks using different error functions. In the first simulation, for the complex XOR gate, it is observed that some error functions, such as the absolute error and the Cauchy error function, can replace the quadratic error function. In the second simulation it is observed that for some error functions the performance of the complex valued neural network depends on the architecture of the network, whereas for a few other error functions the convergence speed of the network is independent of the architecture of the neural network.
Keywords: Complex backpropagation algorithm, complex error functions, complex valued neural network, split activation function.
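A hedged sketch of three loss functions evaluated on a complex-valued error e = y - t: the quadratic loss and two of the outlier-suppressing alternatives named above; the exact forms used in the paper may differ, and the Cauchy scale parameter c is an assumption:

```python
import numpy as np

def quadratic_loss(e):
    return 0.5 * np.abs(e) ** 2

def absolute_loss(e):
    return np.abs(e)

def cauchy_loss(e, c=1.0):
    return 0.5 * c ** 2 * np.log(1.0 + (np.abs(e) / c) ** 2)

errors = np.array([0.1 + 0.2j, -0.5 + 0.0j, 3.0 - 4.0j])   # the last entry acts as an outlier
for name, fn in [("quadratic", quadratic_loss), ("absolute", absolute_loss), ("cauchy", cauchy_loss)]:
    print(name, np.round(fn(errors), 3))   # the Cauchy loss grows slowest on the outlier
```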