Search results for: Shortest Path Algorithm for Shopping.
2634 Feature Reduction of Nearest Neighbor Classifiers using Genetic Algorithm
Authors: M. Analoui, M. Fadavi Amiri
Abstract:
The design of a pattern classifier includes an attempt to select, among a set of possible features, a minimum subset of weakly correlated features that better discriminate the pattern classes. This is usually a difficult task in practice, normally requiring the application of heuristic knowledge about the specific problem domain. The selection and quality of the features representing each pattern have a considerable bearing on the success of subsequent pattern classification. Feature extraction is the process of deriving new features from the original features in order to reduce the cost of feature measurement, increase classifier efficiency, and allow higher classification accuracy. Many current feature extraction techniques involve linear transformations of the original pattern vectors to new vectors of lower dimensionality. While this is useful for data visualization and increasing classification efficiency, it does not necessarily reduce the number of features that must be measured, since each new feature may be a linear combination of all of the features in the original pattern vector. In this paper, a new approach to feature extraction is presented in which feature selection, feature extraction, and classifier training are performed simultaneously using a genetic algorithm. In this approach, each feature value is first normalized by a linear equation, then scaled by the associated weight prior to training, testing, and classification. A k-NN classifier is used to evaluate each set of feature weights. The genetic algorithm optimizes a vector of feature weights, which are used to scale the individual features in the original pattern vectors in either a linear or a nonlinear fashion. With this approach, the number of features used in classification can be greatly reduced.
Keywords: Feature reduction, genetic algorithm, pattern classification, nearest neighbor rule classifiers (k-NNR).
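As a rough illustrative sketch of this scheme (not the authors' implementation), the following Python evolves a feature-weight vector with a simple genetic algorithm and scores each candidate with a k-NN classifier; the population size, mutation scale, and the near-zero threshold for dropping features are assumptions.

```python
import numpy as np

def knn_accuracy(Xtr, ytr, Xte, yte, k=3):
    """Majority-vote k-NN accuracy; y must hold small non-negative int labels."""
    hits = 0
    for x, y in zip(Xte, yte):
        nearest = np.argsort(np.linalg.norm(Xtr - x, axis=1))[:k]
        hits += int(np.bincount(ytr[nearest]).argmax() == y)
    return hits / len(yte)

def evolve_feature_weights(X, y, pop=30, gens=40, sigma=0.1, seed=0):
    """GA over a weight vector; fitness = k-NN accuracy on a held-out half.
    Weights driven toward zero effectively remove their features."""
    rng = np.random.default_rng(seed)
    X = (X - X.min(0)) / (np.ptp(X, 0) + 1e-12)        # linear normalization first
    h = len(X) // 2
    P = rng.random((pop, X.shape[1]))                  # initial weight vectors
    for _ in range(gens):
        fit = [knn_accuracy(X[:h] * w, y[:h], X[h:] * w, y[h:]) for w in P]
        elite = P[np.argsort(fit)[::-1][:pop // 2]]    # truncation selection
        pairs = rng.integers(0, len(elite), (pop - len(elite), 2))
        kids = elite[pairs].mean(axis=1)               # arithmetic crossover
        kids += rng.normal(0, sigma, kids.shape)       # Gaussian mutation
        P = np.vstack([elite, kids.clip(0, 1)])
    fit = [knn_accuracy(X[:h] * w, y[:h], X[h:] * w, y[h:]) for w in P]
    best = P[int(np.argmax(fit))]
    return best, best > 0.05   # weight vector, mask of retained features (threshold assumed)
```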
2633 Split-Pipe Design of Water Distribution Networks Using a Combination of Tabu Search and Genetic Algorithm
Authors: J. Tospornsampan, I. Kita, M. Ishii, Y. Kitamura
Abstract:
In this paper, a combination of two heuristic-based algorithms, genetic algorithm and tabu search, is proposed. It has been developed to obtain the least cost based on the split-pipe design of looped water distribution networks. The proposed combination algorithm has been applied to three well-known water distribution networks taken from the literature. The development of the combination of these two heuristic-based algorithms aims at enhancing their strengths and compensating for their weaknesses. Tabu search is a rather systematic, deterministic method that uses adaptive memory in its search process, while the genetic algorithm is a probabilistic, stochastic optimization technique in which the solution space is explored by generating candidate solutions. Split-pipe design may not be realistic in practice, but for optimization purposes, optimal solutions are always achieved with split-pipe design. The solutions obtained in this study show that the least-cost solutions obtained from the split-pipe design are always better than those obtained from the single-pipe design. The results obtained from the combination approach show its ability and effectiveness in solving combinatorial optimization problems. The solutions obtained are of high quality; for two of the networks, they are the lowest-cost solutions yet presented in the literature. The combination approach proposed in this study is expected to prove useful for diverse problems.
Keywords: GAs, Heuristics, Looped network, Least-cost design, Pipe network, Optimization, TS
2632 Performance Analysis of Bluetooth Low Energy Mesh Routing Algorithm in Case of Disaster Prediction
Authors: Asmir Gogic, Aljo Mujcic, Sandra Ibric, Nermin Suljanovic
Abstract:
The ubiquity of natural disasters during the last few decades has raised serious questions about the prediction of such events and human safety. Every disaster, regardless of its magnitude, has a precursor manifested as a disruption of some environmental parameter such as temperature, humidity, pressure, or vibration. In order to anticipate and monitor those changes, in this paper we propose an overall system for disaster prediction and monitoring based on a wireless sensor network (WSN). Furthermore, we introduce a modified and simplified WSN routing protocol built on top of the trickle routing algorithm. The routing algorithm was deployed over the Bluetooth Low Energy protocol in order to achieve low power consumption. Performance of the WSN was analyzed using a real-life system implementation, and estimates of WSN parameters such as battery lifetime, network size, and packet delay were determined. Based on this performance, the proposed system can be utilized for disaster monitoring and prediction due to its low power profile and mesh routing feature.
Keywords: Bluetooth low energy, disaster prediction, mesh routing protocols, wireless sensor networks.
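The abstract does not spell out the protocol internals, but the trickle algorithm it builds on (RFC 6206) is compact enough to sketch. Below is a minimal, illustrative model of one node's trickle timer; the interval bounds and the redundancy constant k are assumed values, not the paper's configuration.

```python
import random

class TrickleTimer:
    """Minimal Trickle timer (RFC 6206 semantics): the interval doubles
    while the network looks consistent, resets on inconsistency, and
    transmission is suppressed when enough consistent messages were heard."""
    def __init__(self, i_min=1.0, i_max=64.0, k=2):
        self.i_min, self.i_max, self.k = i_min, i_max, k
        self.reset()

    def reset(self):
        self.interval = self.i_min          # shrink interval: react quickly
        self._new_interval()

    def _new_interval(self):
        self.counter = 0                    # consistent messages heard so far
        self.t = random.uniform(self.interval / 2, self.interval)

    def hear_consistent(self):
        self.counter += 1

    def hear_inconsistent(self):
        if self.interval > self.i_min:
            self.reset()

    def fire(self):
        """Called when time t within the interval is reached: transmit unless suppressed."""
        transmit = self.counter < self.k
        self.interval = min(self.interval * 2, self.i_max)  # back off while stable
        self._new_interval()
        return transmit
```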
2631 A Study on the Relation among Primary Care Professionals Serving the Disadvantaged Community, Socioeconomic Status, and Adverse Health Outcome
Authors: Chau-Kuang Chen, Juanita Buford, Colette Davis, Raisha Allen, John Hughes, Jr., James Tyus, Dexter Samuels
Abstract:
During the post-Civil War era, the city of Nashville, Tennessee, had the highest mortality rate in the United States. The elevated death and disease rates among former slaves were attributable to lack of quality healthcare. To address the paucity of healthcare services, Meharry Medical College, an institution with the mission of educating minority professionals and serving the underserved population, was established in 1876. Purpose: The social ecological framework and partial least squares (PLS) path modeling were used to quantify the impact of socioeconomic status and adverse health outcomes on primary care professionals serving the disadvantaged community. The study results could thus demonstrate the accomplishment of the College's mission of training primary care professionals to serve in underserved areas. Methods: Various statistical methods were used to analyze alumni data from 1975 to 2013. K-means cluster analysis was utilized to assign individual medical and dental graduates to practice-community clusters (disadvantaged or non-disadvantaged communities). Discriminant analysis was implemented to verify the classification accuracy of the cluster analysis. The independent t-test was performed to detect significant mean differences in the respective clustering and criterion variables. A chi-square test was used to test whether the proportions of primary care and non-primary care specialists are consistent with those of medical and dental graduates practicing in the designated community clusters. Finally, the PLS path model was constructed to explore the construct validity of the analytic model by providing the magnitude of the effects of socioeconomic status and adverse health outcomes on primary care professionals serving the disadvantaged community. Results: Approximately 83% (3,192/3,864) of Meharry Medical College's medical and dental graduates from 1975 to 2013 were practicing in disadvantaged communities. The independent t-test confirmed the content validity of the cluster analysis model. Also, the PLS path modeling demonstrated that alumni served as primary care professionals in communities with significantly lower socioeconomic status and higher adverse health outcomes (p < .001). The PLS path modeling exhibited a meaningful interrelation between the communities in which primary care professionals practice and the surrounding environment (socioeconomic status and adverse health outcomes), which yielded model reliability, validity, and applicability. Conclusion: This study applied social ecological theory and analytic modeling approaches to assess the attainment of Meharry Medical College's mission of training primary care professionals to serve in underserved areas, particularly in communities with low socioeconomic status and high rates of adverse health outcomes. In summary, the majority of medical and dental graduates from Meharry Medical College provided primary care services to disadvantaged communities with low socioeconomic status and high adverse health outcomes, which demonstrates that Meharry Medical College has fulfilled its mission. The high reliability, validity, and applicability of this model imply that it could be replicated for comparable universities and colleges elsewhere.
Keywords: Disadvantaged community, k-means cluster analysis, PLS path modeling, primary care.
2630 Wireless Transmission of Big Data Using Novel Secure Algorithm
Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha
Abstract:
This paper presents a novel algorithm for secure, reliable, and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay, and destination nodes. Big data has to be transmitted from source to relay and from relay to destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data transmission region, segmenting the selected region, determining a probability ratio for each node (capture node, non-capture node, and eavesdropper node) in every segment, and evaluating the probability using binary-based evaluation. If the transmission is secure, the two-hop transmission of big data resumes; otherwise, the attackers are countered by the cooperative jamming scheme and the data are transmitted in two-hop transmission.
Keywords: Big data, cooperative jamming, energy balance, physical layer, two-hop transmission, wireless security.
2629 Comparative Study on Recent Integer DCTs
Authors: Sakol Udomsiri, Masahiro Iwahashi
Abstract:
This paper presents a comparative study on recent integer DCTs and a new method to construct a low-sensitivity structure of the integer DCT for colored input signals. The method uses the sensitivity of multiplier coefficients to finite word length as an indicator of how word-length truncation affects the quality of the output signal. The sensitivity is also theoretically evaluated as a function of the auto-correlation and covariance matrix of the input signal. The structure of the integer DCT algorithm is optimized by combining the lower-sensitivity lifting structure types of IRT, evaluated through the sensitivity of multiplier coefficients to finite word-length expression as a function of the covariance matrix of the input signal. The effectiveness of the optimum combination of IRT in the integer DCT algorithm is confirmed by the quality improvement over the existing case. As a result, the optimum combination of IRT in each integer DCT algorithm evidently improves output signal quality while remaining compatible with the existing one.
Keywords: DCT, sensitivity, lossless, word length.
2628 Digital Forensics for Electronic Commerce on the Web
Authors: Ryuya Uda
Abstract:
In existing online shopping on the web, SSL and passwords are usually used to secure trades. SSL shields communication from third parties not involved in the trade and indicates that the trader's web site is authenticated by a certification authority. A password certifies that a customer is the same person who has visited the trader's web site before, and protects the customer's privacy, such as what the customer has bought on the site. However, there are no forensics for the trades in the cases above. With existing methods, no one can prove what was ordered by customers, how many products were ordered, or even whether customers ordered at all. The reason is that a third party has to guess what was traded from logs held by traders and by customers, and the logs can easily be created, deleted, and forged since they are electronically stored. To enhance security with digital forensics for electronic commerce on the web, I present a secure method using cellular phones.
Keywords: Cellular phone, digital forensics, electronic commerce, information security.
2627 Optimal Design of Composite Patch for a Cracked Pipe by Utilizing Genetic Algorithm and Finite Element Method
Authors: Mahdi Fakoor, Seyed Mohammad Navid Ghoreishi
Abstract:
Composite patching is a common way of reinforcing cracked pipes and cylinders. The effects of composite patch reinforcement on the fracture parameters of a cracked pipe depend on a variety of parameters such as the number of layers and the angle, thickness, and material of each layer. Therefore, stacking sequence optimization of the composite patch becomes crucial for applications to cracked pipes. In this study, in order to obtain the optimal stacking sequence for a composite patch that has minimum weight and maximum resistance to crack propagation, a coupled Multi-Objective Genetic Algorithm (MOGA) and Finite Element Method (FEM) process is proposed. This optimization has been carried out for longitudinal and transverse semi-elliptical cracks, and the optimal stacking sequences and Pareto fronts for each kind of crack are presented. The proposed algorithm is validated against results collected from the existing literature.
Keywords: Multi objective optimization, Pareto front, composite patch, cracked pipe.
2626 An Ant Colony Optimization for Dynamic Job Scheduling in Grid Environment
Authors: Siriluck Lorpunmanee, Mohd Noor Sap, Abdul Hanan Abdullah, Chai Chompoo-inwai
Abstract:
Grid computing is growing rapidly in distributed heterogeneous systems for utilizing and sharing large-scale resources to solve complex scientific problems. Scheduling is a central topic in achieving high performance in grid environments; it aims to find a suitable allocation of resources for each job. A typical problem arising in this task is the scheduling decision: the effective utilization of a processor to minimize the tardiness of a job being scheduled. This paper therefore addresses the problem by developing a general framework for grid scheduling that uses dynamic information and an ant colony optimization algorithm to improve scheduling decisions. The performance of various dispatching rules, such as First Come First Served (FCFS), Earliest Due Date (EDD), Earliest Release Date (ERD), and Ant Colony Optimization (ACO), is compared. Moreover, the benefit of using Ant Colony Optimization to improve the performance of grid scheduling is also discussed. It is found that a scheduling system using an Ant Colony Optimization algorithm can efficiently and effectively allocate jobs to proper resources.
Keywords: Grid computing, distributed heterogeneous system, ant colony optimization algorithm, grid scheduling, dispatching rules.
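As a hedged illustration of the dispatching rules compared above (not the grid framework itself), this snippet computes total tardiness on a single processor for FCFS and EDD; the job data are made up.

```python
def total_tardiness(jobs):
    """jobs: list of (processing_time, due_date), taken in dispatch order."""
    t, tardiness = 0, 0
    for p, d in jobs:
        t += p                         # completion time of this job
        tardiness += max(0, t - d)     # lateness beyond the due date
    return tardiness

jobs = [(3, 6), (2, 4), (4, 12), (1, 3)]                 # illustrative jobs
fcfs = total_tardiness(jobs)                             # First Come First Served
edd = total_tardiness(sorted(jobs, key=lambda j: j[1]))  # Earliest Due Date
print(f"FCFS tardiness = {fcfs}, EDD tardiness = {edd}") # 8 vs. 0 here
```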
2625 A Comparison of Exact and Heuristic Approaches to Capital Budgeting
Authors: Jindřiška Šedová, Miloš Šeda
Abstract:
This paper summarizes and compares approaches to solving the knapsack problem and its known application in capital budgeting. The first approach uses deterministic methods and can be applied to small tasks with a single constraint; commercial software systems such as the GAMS modelling system can also be applied. However, because of the NP-completeness of the problem, more complex problem instances must be solved by means of heuristic techniques to achieve an approximation of the exact solution in a reasonable amount of time. We show the problem representation and parameter settings for a genetic algorithm framework.
Keywords: Capital budgeting, knapsack problem, GAMS, heuristic method, genetic algorithm.
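To make the binary problem representation concrete, here is a minimal 0/1 knapsack GA sketch; the penalty handling, rates, and toy instance are illustrative assumptions, not the paper's settings.

```python
import random

def ga_knapsack(values, weights, capacity, pop=40, gens=200, pmut=0.02):
    """Binary-encoded GA for the 0/1 knapsack problem; infeasible
    chromosomes receive zero fitness (a simple death penalty)."""
    n = len(values)
    def fitness(ch):
        w = sum(wi for wi, g in zip(weights, ch) if g)
        return sum(vi for vi, g in zip(values, ch) if g) if w <= capacity else 0
    P = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness, reverse=True)
        elite = P[:pop // 2]                           # truncation selection
        children = []
        while len(children) < pop - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n)               # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < pmut else g for g in child]
            children.append(child)                     # bit-flip mutation
        P = elite + children
    return max(P, key=fitness)

# Toy instance: best feasible choice is the first two items (value 11, weight 7).
print(ga_knapsack([6, 5, 4], [4, 3, 5], capacity=7))
```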
2624 A Novel RLS Based Adaptive Filtering Method for Speech Enhancement
Authors: Pogula Rakesh, T. Kishore Kumar
Abstract:
Speech enhancement is a long-standing problem with numerous applications such as teleconferencing, VoIP, hearing aids, and speech recognition. The motivation behind this research work is to obtain a clean speech signal of higher quality by applying an optimal noise cancellation technique. Real-time adaptive filtering algorithms seem to be the best candidates among all categories of speech enhancement methods. In this paper, we propose a speech enhancement method based on Recursive Least Squares (RLS) adaptive filtering of speech signals. Experiments were performed on noisy data prepared by adding AWGN, babble, and pink noise to clean speech samples at -5 dB, 0 dB, 5 dB, and 10 dB SNR levels. We then compare the noise cancellation performance of the proposed RLS algorithm with the existing NLMS algorithm in terms of Mean Squared Error (MSE), Signal-to-Noise Ratio (SNR), and SNR loss. Based on the performance evaluation, the proposed RLS algorithm was found to be the better optimal noise cancellation technique for speech signals.
Keywords: Adaptive filter, Adaptive Noise Canceller, Mean Squared Error, Noise reduction, NLMS, RLS, SNR, SNR Loss.
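The paper's experiments are not reproduced here, but the RLS recursion it relies on is standard; below is a minimal numpy adaptive noise canceller sketch (filter order, forgetting factor lambda, and initialization constant delta are illustrative choices).

```python
import numpy as np

def rls_filter(x, d, order=8, lam=0.999, delta=0.01):
    """Standard RLS adaptive filter used as a noise canceller.
    x: reference noise input; d: noisy speech (desired signal).
    Returns the error signal e, which approximates the clean speech
    when x is correlated with the noise component of d."""
    w = np.zeros(order)                      # filter coefficients
    P = np.eye(order) / delta                # inverse correlation matrix estimate
    e = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]             # most recent samples first
        k = P @ u / (lam + u @ P @ u)        # gain vector
        e[n] = d[n] - w @ u                  # a priori error
        w = w + k * e[n]                     # coefficient update
        P = (P - np.outer(k, u @ P)) / lam   # Riccati update of P
    return e
```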
2623 Self-Organizing Maps in Evolutionary Approach Meant for Dimensioning Routes to the Demand
Authors: J.-C. Créput, A. Koukam, A. Hajjam
Abstract:
We present a non-standard Euclidean vehicle routing problem that adds a level of clustering, and we revisit the use of self-organizing maps as a tool that naturally handles such problems. We show how they can be used as a main operator in an evolutionary algorithm to address the two conflicting objectives of minimizing route length and customer-to-bus-stop distance, and to deal with capacity constraints. We apply the approach to a real-life case of combined clustering and vehicle routing for the transportation of the 780 employees of an enterprise. Based on a geographic information system, we discuss the influence of road infrastructure on the solutions generated.
Keywords: Evolutionary algorithm, self-organizing map, clustering and vehicle routing.
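A heavily simplified sketch of the self-organizing map operator in this setting: an elastic ring of neurons is pulled toward customer locations, and the ring ordering then yields a route. The clustering level, capacity constraints, and bus-stop distance objective are omitted, and the learning parameters are assumptions.

```python
import numpy as np

def som_route(customers, n_neurons=None, iters=4000, seed=0):
    """Elastic-ring SOM for a single route: neurons ordered on a ring
    adapt toward customer positions; the visiting order is the ring order.
    customers: (N, 2) array of coordinates."""
    rng = np.random.default_rng(seed)
    m = n_neurons or 3 * len(customers)
    theta = np.linspace(0, 2 * np.pi, m, endpoint=False)
    ring = customers.mean(0) + 0.3 * np.column_stack([np.cos(theta), np.sin(theta)])
    for t in range(iters):
        lr = 0.8 * (1 - t / iters)                     # decaying learning rate
        radius = max(1.0, m / 8 * (1 - t / iters))     # shrinking neighborhood
        c = customers[rng.integers(len(customers))]    # random training stimulus
        win = np.argmin(np.linalg.norm(ring - c, axis=1))
        d = np.abs(np.arange(m) - win)
        d = np.minimum(d, m - d)                       # circular index distance
        h = np.exp(-(d / radius) ** 2)                 # Gaussian neighborhood
        ring += lr * h[:, None] * (c - ring)           # pull ring toward stimulus
    winners = [np.argmin(np.linalg.norm(ring - c, axis=1)) for c in customers]
    return np.argsort(winners)                         # customer visiting order
```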
2622 Simulation Based VLSI Implementation of Fast Efficient Lossless Image Compression System Using Adjusted Binary Code & Golomb Rice Code
Authors: N. Muthukumaran, R. Ravi
Abstract:
A simulation-based VLSI implementation of the FELICS (Fast Efficient Lossless Image Compression System) algorithm is proposed to provide lossless image compression, implemented in simulation-oriented VLSI (Very Large Scale Integration). The performance of lossless image compression is analyzed so as to reduce the image size without losing image quality, and the VLSI-based FELICS algorithm is then implemented. The FELICS algorithm uses a simplified adjusted binary code for image compression; the compressed image is converted into pixels and then implemented in the VLSI domain. This approach achieves high processing speed and minimizes area and power. The simplified adjusted binary code reduces the number of arithmetic operations and achieves high processing speed. Color-difference preprocessing is also proposed to improve coding efficiency with simple arithmetic operations. The VLSI-based FELICS algorithm provides an effective hardware architecture with a regular, four-stage pipelined data flow. With two-level parallelism, consecutive pixels can be classified into even and odd samples, with an individual hardware engine dedicated to each. This method can be further enhanced by multilevel parallelism.
Keywords: Image compression, pixel, compression ratio, adjusted binary code, Golomb Rice code, high definition display, VLSI implementation.
2621 Induction of Expressive Rules using the Binary Coding Method
Authors: Seyed R Mousavi
Abstract:
In most rule-induction algorithms, the only operator used against nominal attributes is the equality operator =. In this paper, we first propose the use of the inequality operator, ≠, in addition to the equality operator, to increase the expressiveness of induced rules. Then, we present a new method, Binary Coding, which can be used along with an arbitrary rule-induction algorithm to make use of the inequality operator without any need to change the algorithm. Experimental results suggest that the Binary Coding method is promising enough for further investigation, especially in cases where the minimum number of rules is desirable.
Keywords: Data mining, Inequality operator, Number of rules, Rule-induction.
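One plausible reading of the encoding idea (the paper's exact construction may differ): replace a nominal attribute with n values by ceil(log2 n) derived binary attributes, so that a rule testing a single derived bit with the equality operator covers several original values at once, giving an equality-only rule inducer some of the expressiveness of the ≠ operator. A hypothetical sketch:

```python
import math

def binary_code(values):
    """Map each nominal value to a tuple of ceil(log2 n) bits.
    A rule on one derived bit (e.g. bit0 == 0) then covers a subset of
    the original values, which a plain '=' test on the original
    attribute cannot express."""
    n = len(values)
    width = max(1, math.ceil(math.log2(n)))
    return {v: tuple((i >> b) & 1 for b in range(width))
            for i, v in enumerate(values)}

print(binary_code(["red", "green", "blue"]))
# {'red': (0, 0), 'green': (1, 0), 'blue': (0, 1)}
```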
2620 Template-Based Object Detection through Partial Shape Matching and Boundary Verification
Authors: Feng Ge, Tiecheng Liu, Song Wang, Joachim Stahl
Abstract:
This paper presents a novel template-based method to detect objects of interest in real images by shape matching. To locate a target object that has a shape similar to a given template boundary, the proposed method integrates three components: contour grouping, partial shape matching, and boundary verification. In the first component, low-level image features, including edges and corners, are grouped into a set of perceptually salient closed contours using an extended ratio-contour algorithm. In the second component, we develop a partial shape matching algorithm to identify the fractions of detected contours that partly match given template boundaries. Specifically, we represent template boundaries and detected contours using landmarks, and apply a greedy algorithm to search the matched landmark subsequences. For each matched fraction between a template and a detected contour, we estimate an affine transform that transforms the whole template into a hypothetic boundary. In the third component, we provide an efficient algorithm based on oriented edge lists to determine the target boundary from the hypothetic boundaries by checking each of them against image edges. We evaluate the proposed method on recognizing and localizing 12 template leaves in a data set of real images with cluttered backgrounds, illumination variations, occlusions, and image noise. The experiments demonstrate the high performance of our proposed method.
Keywords: Object detection, shape matching, contour grouping.
2619 Comparison of Evolutionary Algorithms and their Hybrids Applied to MarioAI
Authors: Hidehiko Okada, Yuki Fujii
Abstract:
Researchers have been applying artificial/computational intelligence (AI/CI) methods to computer games. In this research field, further studies are required to compare AI/CI methods with respect to each game application. In this paper, we report our experimental results on the comparison of evolution strategy, genetic algorithm, and their hybrids, applied to evolving controller agents for MarioAI. GA revealed its advantage in our experiment, whereas the expected ability of ES in exploiting (fine-tuning) solutions was not clearly observed. The blend crossover operator and the mutation operator of GA might contribute well to exploring the vast search space.
Keywords: Evolutionary algorithm, autonomous game controller agent, neuroevolution, MarioAI.
2618 Higher Order Statistics for Identification of Minimum Phase Channels
Authors: Mohammed Zidane, Said Safi, Mohamed Sabri, Ahmed Boumezzough
Abstract:
This paper describes a blind algorithm, compared with two other algorithms proposed in the literature, for estimating minimum phase channel parameters. In order to identify the impulse response of these channels blindly, we have used Higher Order Statistics (HOS) to build our algorithm. The simulation results in a noisy environment demonstrate that the proposed method can blindly estimate the phase and magnitude of these channels with high accuracy, without any information about the input except that the input excitation is independent and identically distributed (i.i.d.) and non-Gaussian.
Keywords: System Identification, Higher Order Statistics, Communication Channels.
2617 Face Detection in Color Images using Color Features of Skin
Authors: Fattah Alizadeh, Saeed Nalousi, Chiman Savari
Abstract:
Because of increasing demands for security in today's society, and the growing attention paid to machine vision, biometric research, pattern recognition, and data retrieval in color images, face detection has found more applications. In this article, we present a scientific approach for modeling human skin color and offer an algorithm that detects faces within color images by combining skin features with thresholds determined from the model. The proposed model is based on statistical data in different color spaces. Using specified color thresholds, the algorithm first divides image pixels into two groups, skin pixels and non-skin pixels, and then decides, based on geometric features of the face, which areas belong to faces. Two main results emerged from this research: first, the proposed model can easily be applied to different databases and color spaces to establish proper thresholds; second, our algorithm can adapt itself to runtime conditions, and its results demonstrate desirable progress in comparison with similar approaches.
Keywords: Face detection, skin color modeling, color, colorful images, face recognition.
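The abstract does not give the fitted thresholds; as a generic illustration of the fixed-threshold style of skin classifier such systems often start from, here is a numpy sketch in YCbCr. The conversion formulas are the standard ITU-R BT.601 ones; the chrominance box is a commonly cited heuristic, not the authors' model.

```python
import numpy as np

def skin_mask(rgb):
    """rgb: H x W x 3 uint8 image. Returns a boolean mask of likely skin
    pixels using a fixed chrominance box in YCbCr (heuristic thresholds)."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b   # BT.601 blue-difference
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b   # BT.601 red-difference
    return (77 <= cb) & (cb <= 127) & (133 <= cr) & (cr <= 173)
```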
2616 Power System Security Constrained Economic Dispatch Using Real Coded Quantum Inspired Evolution Algorithm
Authors: A. K. Al-Othman, F. S. Al-Fares, K. M. EL-Nagger
Abstract:
This paper presents a new optimization technique based on quantum computing principles to solve the security constrained power system economic dispatch problem (SCED). The proposed technique is a population-based algorithm that uses quantum computing elements in coding and evolving groups of potential solutions, reaching the optimum by a partially directed random approach. The SCED problem is formulated as a constrained optimization problem in a way that ensures secure and economic system operation. The Real Coded Quantum-Inspired Evolution Algorithm (RQIEA) is then applied to solve the constrained optimization formulation. Simulation results of the proposed approach are compared with those reported in the literature. The outcome is very encouraging and shows that RQIEA is well suited for solving the security constrained power system economic dispatch problem (SCED).
Keywords: State Estimation, Fuzzy Linear Regression, Fuzzy Linear State Estimator (FLSE), Measurements Uncertainty.
2615 Considering the Effect of Semi-Rigid Connection in Steel Frame Structures for Progressive Collapse
Authors: Fooad Karimi Ghaleh Jough, Mohsen Soori
Abstract:
Today, the occurrence of progressive collapse in structures has become a challenging issue, requiring suitable solutions for structural resistance to this phenomenon. It is also necessary to evaluate the vulnerability of existing and under-construction buildings to progressive collapse. The kind of lateral load-resisting system the building has, and its connections, are among the most significant and influential variables in structural resistance to progressive collapse. Using the "Alternative Path" approach suggested by the GSA 2003 and UFC 2013 recommendations, different configurations of semi-rigid connections are examined in this study. To do this, the OpenSees program was used to model nine distinct semi-rigid connection configurations on a three-story SAC structure, accounting for the effect of connection stiffness. Then, using nonlinear dynamic analysis, the effects of column removal were explored in two scenarios: removal of a corner column and removal of a middle column on the first level. The nonlinear static analysis results showed that when a column is removed, structures with semi-rigid connections experience larger displacements, which result in the formation of plastic hinges. Furthermore, the nonlinear static analysis made clear that the possibility of progressive collapse increased with the number of semi-rigid connections in the structure.
Keywords: Semi-rigid, nonlinear static analysis, progressive collapse, alternative path.
2614 On Reversal and Transposition Medians
Authors: Martin Bader
Abstract:
In recent years, the genomes of more and more species have been sequenced, providing data for phylogenetic reconstruction based on genome rearrangement measures. A main task in all phylogenetic reconstruction algorithms is to solve the median of three problem. Although this problem is NP-hard even for the simplest distance measures, there are exact algorithms for the breakpoint median and the reversal median that are fast enough for practical use. In this paper, this approach is extended to the transposition median as well as to the weighted reversal and transposition median. Although no exact polynomial algorithm is known even for the pairwise distances, we will show that it is in most cases possible to solve these problems exactly within reasonable time by using a branch and bound algorithm.
Keywords: Comparative genomics, genome rearrangements, median, reversals, transpositions.
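For readers new to rearrangement measures: the median of three problem asks for a genome minimizing the sum of its pairwise distances to three given genomes. The breakpoint measure underlying the breakpoint median is simple enough to sketch (unsigned permutations, identity as reference; illustrative code, not the paper's branch and bound).

```python
def breakpoints(perm):
    """Breakpoints of an unsigned permutation of 1..n relative to the
    identity: adjacent pairs (with sentinels 0 and n+1) whose values
    are not consecutive."""
    ext = [0] + list(perm) + [len(perm) + 1]
    return sum(1 for a, b in zip(ext, ext[1:]) if abs(a - b) != 1)

print(breakpoints([3, 1, 2, 4]))  # pairs (0,3), (3,1), (2,4) break -> 3
```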
2613 Self-evolving Artificial Immune System via Developing T and B Cell for Permutation Flow-shop Scheduling Problems
Authors: Pei-Chann Chang, Wei-Hsiu Huang, Ching-Jung Ting, Hwei-Wen Luo, Yu-Peng Yu
Abstract:
The Artificial Immune System has been applied as a heuristic algorithm for decades. Nevertheless, many of these applications took advantage of the algorithm but seldom proposed approaches for enhancing its efficiency. In this paper, a Self-evolving Artificial Immune System is proposed, developed around the T and B cells of the immune system, with a self-evolving mechanism built for the complexities of different problems. This research focuses on enhancing the efficiency of clonal selection, which is responsible for producing antibodies to resist invading antigens. T and B cells are the main mechanisms by which clonal selection produces different combinations of antibodies; therefore, the development of the T and B cells will influence the efficiency of clonal selection in searching for better solutions. Furthermore, for better cooperation between the two cells, a co-evolutionary strategy is applied to coordinate more effective production of antibodies. This work finally adopts flow-shop scheduling instances from the OR-Library to validate the proposed algorithm.
Keywords: Artificial Immune System, clonal selection, flow-shop scheduling problems, co-evolutionary strategy.
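The abstract stays at the conceptual level; for orientation, here is a generic CLONALG-style clonal selection sketch for numeric minimization. The paper's T/B-cell co-evolution and flow-shop permutation encoding are not reproduced, and all parameters are illustrative.

```python
import numpy as np

def clonalg(f, dim, pop=20, gens=100, clones=5, seed=0):
    """Generic clonal selection: better antibodies receive more clones,
    and clones mutate with a rate that grows for worse antibodies."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(-5, 5, (pop, dim))                  # antibody repertoire
    for _ in range(gens):
        A = A[np.argsort([f(a) for a in A])]            # best (lowest f) first
        pool = [A]
        for i, a in enumerate(A):
            n_cl = max(1, clones * (pop - i) // pop)    # more clones for the best
            sigma = 0.1 + 2.0 * i / pop                 # heavier mutation for worse
            pool.append(a + sigma * rng.standard_normal((n_cl, dim)))
        C = np.vstack(pool)
        C = C[np.argsort([f(c) for c in C])][:pop]      # reselect best antibodies
        C[-2:] = rng.uniform(-5, 5, (2, dim))           # receptor editing (diversity)
        A = C
    return A[0]

print(clonalg(lambda x: float(np.sum(x ** 2)), dim=3))  # sphere test function
```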
2612 A Novel Prostate Segmentation Algorithm in TRUS Images
Authors: Ali Rafiee, Ahad Salimi, Ali Reza Roosta
Abstract:
Prostate cancer is one of the most frequent cancers in men and a major cause of mortality in most countries. In many diagnostic and treatment procedures for prostate disease, accurate detection of prostate boundaries in transrectal ultrasound (TRUS) images is required. This is a challenging and difficult task due to weak prostate boundaries, speckle noise, and the short range of gray levels. In this paper, a novel method for automatic prostate segmentation in TRUS images is presented. The method involves preprocessing (edge-preserving noise reduction and smoothing) followed by prostate segmentation. Speckle reduction is achieved using a stick filter, and a top-hat transform is implemented for smoothing. A feed-forward neural network together with local binary patterns is used to find a point inside the prostate object. Finally, the prostate boundary is extracted from the inside point by an active contour algorithm. A number of experiments were conducted to validate this method, and the results showed that the new algorithm extracts the prostate boundary with an MSE of less than 4.6% relative to the boundary provided manually by physicians.
Keywords: Prostate segmentation, stick filter, neural network, active contour.
2611 Battery Grading Algorithm in 2nd-Life Repurposing Li-ion Battery System
Authors: Ya Lv, Benjamin Ong Wei Lin, Wanli Niu, Benjamin Seah Chin Tat
Abstract:
This article presents a methodology that improves the reliability and cyclability of a 2nd-life Li-ion battery system repurposed as an energy storage system (ESS). Most of the 2nd-life retired battery systems on the market have a module/pack-level state of health (SOH) indicator, which is used to guide an appropriate depth of discharge (DOD) in the ESS application. Due to the lack of a cell-level SOH indication, the different degradation behaviors among cells cannot be identified upon reaching retired status; in the end, considering end-of-life (EOL) loss and pack-level DOD, the repurposed ESS has to be oversized by more than 1.5 times to meet the application requirements of reliability and cyclability. The proposed battery grading algorithm, using a non-invasive methodology, is able to detect outlier cells based on historical voltage data and to calculate cell-level historical maximum temperatures using a semi-analytic method. In this way, each individual battery cell in the 2nd-life battery system can be graded in terms of SOH on the basis of its historical voltage fluctuation and estimated historical maximum temperature variation. These grades are mapped to corresponding DOD grades in the application of the repurposed ESS to enhance system reliability and cyclability. In all, the introduced battery grading algorithm is non-invasive, compatible with all kinds of retired Li-ion battery systems that lack a cell-level SOH indication, and can potentially be embedded into battery management software for preventive maintenance and real-time cyclability optimization.
Keywords: Battery grading algorithm, 2nd-life repurposing battery system, semi-analytic methodology, reliability and cyclability.
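The abstract describes the pipeline without formulas; as a minimal sketch of the outlier-cell step, the following flags cells whose voltage deviates persistently from the pack median over the recorded history. The robust z-score statistic and the 3-sigma cut are assumptions, not the article's criteria.

```python
import numpy as np

def grade_cells(voltages, z_cut=3.0):
    """voltages: T x N array (T samples for N series-connected cells).
    Returns a boolean array marking outlier cells by how far each cell's
    mean deviation from the pack median sits in a robust z-score."""
    dev = voltages - np.median(voltages, axis=1, keepdims=True)
    score = dev.mean(axis=0)                        # persistent per-cell bias
    med = np.median(score)
    mad = np.median(np.abs(score - med)) + 1e-9     # median absolute deviation
    z = 0.6745 * (score - med) / mad                # robust z-score
    return np.abs(z) > z_cut                        # True = outlier cell
```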
2610 DWT-SATS Based Detection of Image Region Cloning
Authors: Michael Zimba
Abstract:
A duplicated image region may be subjected to a number of attacks, such as noise addition, compression, reflection, rotation, and scaling, with the intention of either merely blending it into its targeted neighborhood or preventing its detection. In this paper, we present an effective and robust method of detecting duplicated regions, including those affected by the various attacks. In order to reduce the dimension of the image, the proposed algorithm first performs a discrete wavelet transform (DWT) of a suspicious image. However, unlike most existing copy-move image forgery (CMIF) detection algorithms operating in the DWT domain, which extract only the low-frequency subband of the DWT of the suspicious image and thereby leave valuable information in the other three subbands, the proposed algorithm simultaneously extracts features from all four subbands. The extracted features are not only a more accurate representation of image regions but are also robust to additive noise, JPEG compression, and affine transformation. Furthermore, principal component analysis-eigenvalue decomposition (PCA-EVD) is applied to reduce the dimension of the features. The extracted features are then sorted using the computationally efficient radix sort algorithm. Finally, same affine transformation selection (SATS), a duplication verification method, is applied to detect duplicated regions. The proposed algorithm is not only fast but also more robust to attacks than related CMIF detection algorithms. The experimental results show high detection rates.
Keywords: Affine Transformation, Discrete Wavelet Transform, Radix Sort, SATS.
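A condensed sketch of the feature-extraction stage described above, using the PyWavelets package: a one-level DWT, then a feature row per overlapping block drawn from all four subbands, then lexicographic sorting so that duplicated blocks land next to each other. The PCA-EVD step, radix sort, and SATS verification are omitted, and the block size and feature choice are assumptions.

```python
import numpy as np
import pywt

def block_features(gray, block=8):
    """gray: 2-D float array. Returns sorted feature rows and their block
    positions; features come from all four DWT subbands, not just LL."""
    LL, (LH, HL, HH) = pywt.dwt2(gray, "haar")
    feats, pos = [], []
    h, w = LL.shape
    for i in range(h - block + 1):
        for j in range(w - block + 1):
            patches = [S[i:i + block, j:j + block] for S in (LL, LH, HL, HH)]
            feats.append([float(p.mean()) for p in patches] +
                         [float((p ** 2).mean()) for p in patches])
            pos.append((i, j))
    feats = np.array(feats)
    order = np.lexsort(feats.T[::-1])      # lexicographic sort of feature rows
    return feats[order], [pos[k] for k in order]
```

After sorting, candidate duplicates are neighboring rows with near-identical features; a verification step such as SATS then checks that the matched blocks agree on a single affine transformation.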
2609 Calculation of Wave Function at the Origin (WFO) for Heavy Mesons by Numerical Solving of the Schrodinger Equation
Authors: M. Momeni Feyli
Abstract:
Many recent high energy physics calculations involving charm and beauty invoke the wave function at the origin (WFO) for the meson bound state. Uncertainties in the charm and beauty quark masses and different models for the potentials governing these bound states require a simple numerical algorithm for evaluating the WFOs of these bound states. We present a simple algorithm for this purpose, which provides WFOs with high precision compared with similar ones already obtained in the literature.
Keywords: Mesons, bound states, Schrodinger equation, nonrelativistic quark model.
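A minimal finite-difference version of such an algorithm for the S-wave radial equation with a Cornell potential V(r) = -a/r + b r is sketched below, in natural units (hbar = c = 1); the reduced mass, couplings, and grid are illustrative assumptions. For l = 0, u(r) = r R(r) gives R(0) = u'(0), so |psi(0)|^2 = |u'(0)|^2 / (4 pi).

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

def wfo_cornell(mu=0.75, a=0.4, b=0.2, rmax=20.0, n=4000):
    """Ground-state energy and |psi(0)|^2 for the radial equation
    -u''/(2*mu) + V(r)*u = E*u with V(r) = -a/r + b*r and u(0) = u(rmax) = 0."""
    dr = rmax / (n + 1)
    r = dr * np.arange(1, n + 1)
    V = -a / r + b * r
    diag = 1.0 / (mu * dr**2) + V                       # kinetic + potential terms
    off = np.full(n - 1, -1.0 / (2 * mu * dr**2))       # nearest-neighbor coupling
    E, u = eigh_tridiagonal(diag, off, select="i", select_range=(0, 0))
    u = u[:, 0]
    u /= np.sqrt(np.sum(u**2) * dr)                     # normalize integral of u^2
    psi0_sq = (u[0] / dr) ** 2 / (4 * np.pi)            # |u'(0)|^2 / (4 pi)
    return E[0], psi0_sq
```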
2608 Fast Database Indexing for Large Protein Sequence Collections Using Parallel N-Gram Transformation Algorithm
Authors: Jehad A. H. Hammad, Nur'Aini binti Abdul Rashid
Abstract:
With the rapid development of the life sciences and the flood of genomic information, the need for faster and more scalable searching methods has become urgent. One approach that has been investigated is indexing. Indexing methods fall into three categories: length-based index algorithms, transformation-based algorithms, and mixed-technique algorithms. In this research, we focused on the transformation-based methods. We embedded the N-gram method into the transformation-based method to build an inverted index table, and then applied parallel methods to speed up index building and to reduce the overall retrieval time when querying the genomic database. Our experiments show that the N-gram transformation algorithm is an economical solution: it saves both time and space. The results show that the size of the index is smaller than the size of the dataset when the N-gram size is 5 or 6. The results for the parallel N-gram transformation algorithm indicate that the use of parallel programming with large datasets is promising and can be improved further.
Keywords: Biological sequence, database index, N-gram indexing, parallel computing, sequence retrieval.
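The serial core of an N-gram inverted index is compact; a sketch follows (a plain hash map rather than the paper's table layout, with the parallel partitioning omitted).

```python
from collections import defaultdict

def build_index(sequences, n=5):
    """Map every n-gram to a postings list of (sequence id, offset)."""
    index = defaultdict(list)
    for sid, seq in enumerate(sequences):
        for i in range(len(seq) - n + 1):
            index[seq[i:i + n]].append((sid, i))
    return index

def query(index, fragment, n=5):
    """Candidate sequence ids sharing every n-gram of the query fragment."""
    hits = None
    for i in range(len(fragment) - n + 1):
        ids = {sid for sid, _ in index.get(fragment[i:i + n], [])}
        hits = ids if hits is None else hits & ids
    return hits or set()

idx = build_index(["MKVLAAGICQW", "MKVLASTGICQ"], n=4)
print(query(idx, "GICQ", n=4))   # -> {0, 1}: both sequences contain GICQ
```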
2607 Fast Factored DCT-LMS Speech Enhancement for Performance Enhancement of Digital Hearing Aid
Authors: Sunitha. S.L., V. Udayashankara
Abstract:
Background noise is particularly damaging to speech intelligibility for people with hearing loss, especially for sensorineural loss patients. Several investigations of speech intelligibility have demonstrated that sensorineural loss patients need a 5-15 dB higher SNR than normal-hearing subjects. This paper describes a Discrete Cosine Transform Power-Normalized Least Mean Square algorithm to improve the SNR and the convergence of the LMS filter for sensorineural loss patients. Since it requires only real arithmetic, it establishes a faster convergence rate than time-domain LMS, and the transformation improves the eigenvalue distribution of the input autocorrelation matrix of the LMS filter. The DCT has good orthonormal, separable, and energy-compaction properties. Although the DCT does not separate frequencies, it is a powerful signal decorrelator. It is a real-valued function and thus can be used effectively in real-time operation. The advantages of DCT-LMS compared with the standard LMS algorithm are shown via SNR and eigenvalue-ratio computations. Exploiting the symmetry of the basis functions, the DCT transform matrix [AN] can be factored into a series of ±1 butterflies and rotation angles; this factorization results in one of the fastest DCT implementations. There are different ways to obtain factorizations; this work uses the fast factored DCT algorithm developed by Chen et al. The computer simulation results show the superior convergence characteristics of the proposed algorithm: the SNR improves by at least 10 dB for input SNRs at or below 0 dB, with faster convergence speed and better time and frequency characteristics.
Keywords: Hearing impairment, DCT adaptive filter, sensorineural loss patients, convergence rate.
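A compact sketch of the transform-domain LMS idea the paper builds on: the tap-input vector is rotated by an orthonormal DCT, and each bin's step size is normalized by a running estimate of that bin's power. The fast factored butterfly implementation and any hearing-aid-specific tuning are beyond this sketch; the step size and smoothing constants are assumptions.

```python
import numpy as np
from scipy.fft import dct

def dct_lms(x, d, order=16, mu=0.05, beta=0.9, eps=1e-6):
    """DCT-domain power-normalized LMS adaptive filter.
    x: reference input, d: desired signal; returns the error signal."""
    w = np.zeros(order)
    p = np.full(order, eps)                           # per-bin power estimates
    e = np.zeros(len(x))
    for n in range(order, len(x)):
        u = dct(x[n - order:n][::-1], norm="ortho")   # decorrelating rotation
        p = beta * p + (1 - beta) * u**2              # running bin powers
        e[n] = d[n] - w @ u                           # a priori error
        w += mu * e[n] * u / (p + eps)                # power-normalized update
    return e
```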
2606 Probability-Based Damage Detection of Structures Using Model Updating with Enhanced Ideal Gas Molecular Movement Algorithm
Authors: M. R. Ghasemi, R. Ghiasi, H. Varaee
Abstract:
Model updating methods have received increasing attention for the damage detection of structures based on measured modal parameters. This paper presents a probability-based damage detection (PBDD) procedure built on model updating, in which a one-stage, model-based damage identification technique driven by the dynamic features of a structure is investigated. The presented framework uses a finite element updating method with a Monte Carlo simulation that accounts for the uncertainty caused by measurement noise. Enhanced ideal gas molecular movement (EIGMM) is used as the main algorithm for model updating. Ideal gas molecular movement (IGMM) is a multiagent algorithm modeled on the movement of ideal gas molecules, which disperse rapidly in all directions and cover the entire available space, a behavior embodied in the high speed of the molecules and their collisions with each other and with the surrounding barriers. In the IGMM algorithm, an initial population of gas molecules is randomly generated, and the governing equations for molecular velocities and collisions are used to drive the search toward optimal solutions. In this paper, an enhanced version of IGMM, which removes unchanged variables after a specified number of iterations, is developed. The proposed method is applied to two numerical examples in the field of structural damage detection. The results show that the proposed method performs well and is competitive in the PBDD of structures.
Keywords: Enhanced ideal gas molecular movement, ideal gas molecular movement, model updating method, probability-based damage detection, uncertainty quantification.
2605 Codebook Generation for Vector Quantization on Orthogonal Polynomials based Transform Coding
Authors: R. Krishnamoorthi, N. Kannan
Abstract:
In this paper, a new algorithm for generating a codebook is proposed for vector quantization (VQ) in image coding. The significant features of the training image vectors are extracted using the proposed Orthogonal Polynomials based transformation. We propose to generate the codebook by partitioning these feature vectors into a binary tree. Each feature vector at a non-terminal node of the binary tree is directed to one of the two descendants by comparing a single feature associated with that node to a threshold. The binary tree codebook is used for encoding and decoding the feature vectors. In the decoding process, the feature vectors are subjected to inverse transformation with the help of the basis functions of the proposed Orthogonal Polynomials based transformation to recover the approximated input image training vectors. The results of the proposed coding are compared with those of VQ using the Discrete Cosine Transform (DCT) and the Pairwise Nearest Neighbor (PNN) algorithm. The new algorithm results in a considerable reduction in computation time and provides better reconstructed picture quality.
Keywords: Orthogonal Polynomials, Image Coding, Vector Quantization, TSVQ, Binary Tree Classifier
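A minimal sketch of the binary-tree partitioning described above: each internal node thresholds a single feature (here, the highest-variance one at its median, an assumed splitting rule), and leaves store centroid codewords. The orthogonal-polynomial transform itself is not reproduced.

```python
import numpy as np

def build_tree(vectors, leaf_size=16):
    """Binary tree codebook: internal nodes hold (feature, threshold);
    leaves hold the centroid of the training vectors routed to them."""
    if len(vectors) <= leaf_size:
        return {"codeword": vectors.mean(axis=0)}
    f = int(np.argmax(vectors.var(axis=0)))        # one feature per node
    t = float(np.median(vectors[:, f]))            # threshold at the median
    lo, hi = vectors[:, f] <= t, vectors[:, f] > t
    if lo.all() or hi.all():                       # degenerate split -> leaf
        return {"codeword": vectors.mean(axis=0)}
    return {"feature": f, "threshold": t,
            "lo": build_tree(vectors[lo], leaf_size),
            "hi": build_tree(vectors[hi], leaf_size)}

def encode(tree, v):
    """Walk the tree, comparing one feature per node; return the codeword."""
    while "codeword" not in tree:
        tree = tree["lo"] if v[tree["feature"]] <= tree["threshold"] else tree["hi"]
    return tree["codeword"]
```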