Search results for: genetic algorithms.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1989


489 Issues in Deploying Smart Antennas in Mobile Radio Networks

Authors: Rameshwar Kawitkar

Abstract:

With the exponentially increasing demand for wireless communications, the capacity of current cellular systems will soon become incapable of handling the growing traffic. Since radio spectrum is a diminishing natural resource, there seems to be a fundamental barrier to further capacity increase. The solution can be found in smart antenna systems. Smart or adaptive antenna arrays consist of an array of antenna elements with signal processing capability that dynamically optimizes the radiation and reception of a desired signal. Smart antennas can place nulls in the direction of interferers via adaptive updating of the weights linked to each antenna element. They thus cancel out most of the co-channel interference, resulting in better reception quality and fewer dropped calls. Smart antennas can also track the user within a cell via direction-of-arrival algorithms. This makes them more advantageous than conventional antenna systems. This paper focuses on a few issues concerning the deployment of smart antennas in mobile radio networks.

Keywords: Smart/Adaptive Antenna, Multipath fading, Beamforming, Radio propagation.
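The null-steering behaviour described above can be illustrated with a minimal adaptive-beamforming sketch. The snippet below is not from the paper: the array geometry, step size and training signals are illustrative assumptions, and a simple complex LMS update stands in for whatever adaptive algorithm a real deployment would use.

```python
import numpy as np

# Minimal LMS adaptive beamformer sketch (illustrative, not the paper's algorithm).
# A uniform linear array of N elements adapts its weights so that a desired
# signal arriving from theta_d is passed while an interferer from theta_i is nulled.

rng = np.random.default_rng(0)
N, d_over_lambda, mu, T = 8, 0.5, 0.01, 2000
theta_d, theta_i = np.deg2rad(10.0), np.deg2rad(-40.0)

def steering(theta):
    n = np.arange(N)
    return np.exp(-2j * np.pi * d_over_lambda * n * np.sin(theta))

a_d, a_i = steering(theta_d), steering(theta_i)
w = np.zeros(N, dtype=complex)

for _ in range(T):
    s = np.exp(2j * np.pi * rng.random())          # desired symbol (unit modulus)
    i = 2.0 * np.exp(2j * np.pi * rng.random())    # stronger interferer
    x = s * a_d + i * a_i + 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    y = np.vdot(w, x)                              # array output w^H x
    e = s - y                                      # reference-based error
    w += mu * x * np.conj(e)                       # complex LMS weight update

# After adaptation, the response toward the interferer should be strongly attenuated.
print("gain toward desired   :", abs(np.vdot(w, a_d)))
print("gain toward interferer:", abs(np.vdot(w, a_i)))
```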

488 Mean Shift-based Preprocessing Methodology for Improved 3D Buildings Reconstruction

Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour

Abstract:

In this work, we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data acquired by airborne scanners over densely built urban areas. On the one hand, high-resolution image data corrupted by noise from lossy compression techniques are appropriately smoothed while preserving the optical edges; on the other hand, low-resolution LiDAR data in the form of a normalized Digital Surface Map (nDSM) are upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and the upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology to the 3D reconstruction of buildings in a pilot region of Athens, Greece results in a significant visual improvement of the 3D building block model.

Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift.
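As a rough illustration of the edge-preserving smoothing the abstract refers to, the sketch below runs a simplified flat-kernel mean shift filter over a synthetic noisy step edge. The bandwidths, iteration count and test image are assumptions made for the example; the paper's joint RGB-z/nDSM processing is not reproduced.

```python
import numpy as np

# Simplified flat-kernel mean shift filtering sketch on a synthetic grayscale image
# (illustrative of edge-preserving smoothing; bandwidths and image are assumptions).

def mean_shift_filter(img, hs=3, hr=0.1, iters=5):
    out = img.astype(float).copy()
    H, W = img.shape
    for _ in range(iters):
        new = out.copy()
        for y in range(H):
            for x in range(W):
                y0, y1 = max(0, y - hs), min(H, y + hs + 1)
                x0, x1 = max(0, x - hs), min(W, x + hs + 1)
                window = out[y0:y1, x0:x1]
                mask = np.abs(window - out[y, x]) <= hr   # flat range kernel
                new[y, x] = window[mask].mean()           # shift toward the local mode
        out = new
    return out

# Noisy step edge: smoothing flattens each side while keeping the edge sharp.
img = np.concatenate([np.zeros((16, 16)), np.ones((16, 16))], axis=1)
noisy = img + 0.05 * np.random.default_rng(1).standard_normal(img.shape)
smooth = mean_shift_filter(noisy)
print("residual roughness:", np.abs(np.diff(smooth, axis=1)).mean())
```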

487 Sequence Relationships Similarity of Swine Influenza a (H1N1) Virus

Authors: Patsaraporn Somboonsak, Mud-Armeen Munlin

Abstract:

In April 2009, a new variant of the Influenza A virus subtype H1N1 emerged in Mexico and spread all over the world. Influenza A has three subtypes circulating in humans (H1N1, H1N2 and H3N2), while types B and C tend to be associated with local or regional epidemics. Preliminary genetic characterization identified the new viruses as swine influenza A (H1N1) viruses. Nucleotide sequence analysis shows that the haemagglutinin (HA) genes are similar to each other and to the majority of swine influenza virus genes, and that the two genes coding for the neuraminidase (NA) and matrix (M) proteins are likewise similar to the corresponding genes of swine influenza. Sequence similarity between the 2009 A (H1N1) virus and its nearest relatives indicates that its gene segments have been circulating undetected for an extended period. Nucleic acid sequence Maximum Composite Likelihood (MCL) analysis, DNA empirical base frequencies and the phylogenetic relationship amongst the HA genes of H1N1 viruses deposited in GenBank show high nucleotide sequence homology. In this paper, we used 16 HA nucleotide sequences from NCBI to compute sequence relationship similarity of the swine influenza A virus; the MCL result is 28%, with 36.64% for the optimal tree with the sum of branch lengths, 35.62% for interior branch phylogeny with a Neighbor-Joining tree, 1.85% for the overall transition/transversion ratio, and 8.28% for the overall mean distance.

Keywords: DNA sequence, sequence relationships, swine influenza, sequence similarity.
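A hedged sketch of the most basic quantity involved, pairwise identity between aligned nucleotide sequences, is shown below. The toy sequences are invented for illustration; a real analysis would use HA sequences retrieved from GenBank/NCBI and dedicated phylogenetics software.

```python
# Minimal sketch of pairwise nucleotide identity between aligned sequences
# (toy sequences; not the 16 HA sequences analyzed in the paper).

def pairwise_identity(seq_a: str, seq_b: str) -> float:
    """Fraction of identical positions between two equal-length aligned sequences."""
    assert len(seq_a) == len(seq_b)
    matches = sum(a == b for a, b in zip(seq_a, seq_b) if a != '-' and b != '-')
    compared = sum(1 for a, b in zip(seq_a, seq_b) if a != '-' and b != '-')
    return matches / compared if compared else 0.0

seqs = {
    "isolate_1": "ATGAAGGCAATACTAGTAGTTCTGCTATATACA",
    "isolate_2": "ATGAAGGCAATACTGGTAGTTCTGCTATATACA",
    "isolate_3": "ATGAAAGCAATACTAGTAGTCCTGTTATATACA",
}
names = list(seqs)
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        pid = pairwise_identity(seqs[names[i]], seqs[names[j]])
        print(f"{names[i]} vs {names[j]}: {pid:.2%} identity")
```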

486 Smart Grid Simulator

Authors: Andrei Ursachi, Dorin Bordeasu

Abstract:

The Smart Grid Simulator is a computer program based on advanced algorithms whose main purpose is to lower the energy bill, in the most cost-efficient way possible, for private households, companies or energy providers. It combines the energy provided by a number of solar modules and wind turbines with the consumption of one household or a cluster of nearby households, together with information on weather conditions and energy prices, in order to predict the amount of energy that can be produced by renewable energy sources and the amount of energy that will have to be bought from the distributor for the following day. The user of the system will not only be able to minimize expenditure on energy bills, but will also be informed about hourly consumption, electricity price fluctuations and money spent on purchased energy, as well as how much money has been saved each day and since the system was installed. The paper outlines the algorithm that supports the Smart Grid Simulator idea and presents preliminary test results that support the discussion and implementation of the system.

Keywords: Applied Science, Renewable energy sources, Smart Grid, Sustainable energy.

485 Computer Aided Detection on Mammography

Authors: Giovanni Luca Masala

Abstract:

A typical definition of Computer Aided Diagnosis (CAD), found in the literature, is: a diagnosis made by a radiologist using the output of a computerized scheme for automated image analysis as a diagnostic aid. Often one also finds the expression Computer Aided Detection (CAD or CADe): this definition emphasizes the intent of CAD to support rather than substitute the human observer in the analysis of radiographic images. In this article we illustrate the application of CAD systems and the aim of these definitions. Commercially available CAD systems use computerized algorithms for identifying suspicious regions of interest. This paper describes a general CAD system as an expert system consisting of the following components: segmentation/detection, feature extraction, and classification/decision making. As an example, this work shows the realization of a Computer-Aided Detection system that is able to assist the radiologist in identifying types of mammary tumor lesions. Furthermore, this prototype station uses a GRID configuration to work on a large distributed database of digitized mammographic images.

Keywords: Computer Aided Detection, Computer Aided Diagnosis, mammography, GRID.

484 JConqurr - A Multi-Core Programming Toolkit for Java

Authors: G.A.C.P. Ganegoda, D.M.A. Samaranayake, L.S. Bandara, K.A.D.N.K. Wimalawarne

Abstract:

With the popularity of multi-core and many-core architectures, there is a great need for software frameworks that can support parallel programming methodologies. In this paper we introduce JConqurr, an Eclipse toolkit that is easy to use and provides robust support for flexible parallel programming. JConqurr is a multi-core and many-core programming toolkit for Java capable of supporting common parallel programming patterns, including task, data, divide-and-conquer and pipeline parallelism. The toolkit uses an annotation and directive mechanism to convert sequential code into parallel code. In addition, we have proposed a novel mechanism to achieve parallelism using graphics processing units (GPUs). Experiments with common parallelizable algorithms have shown that our toolkit can be easily and efficiently used to convert sequential code to parallel code, and that significant performance gains can be achieved.

Keywords: Multi-core, parallel programming patterns, GPU, Java, Eclipse plugin, toolkit.

483 Analysis of Electrocardiograph (ECG) Signal for the Detection of Abnormalities Using MATLAB

Authors: Durgesh Kumar Ojha, Monica Subashini

Abstract:

The proposed method is to study and analyze the electrocardiograph (ECG) waveform to detect abnormalities with reference to the P, Q, R and S peaks. The first phase covers the acquisition of real-time ECG data; the next phase involves signal generation followed by pre-processing. Thirdly, the procured ECG signal is subjected to feature extraction. The extracted features detect abnormal peaks present in the waveform, so normal and abnormal ECG signals can be differentiated based on the features extracted. The work is implemented in the familiar multipurpose tool MATLAB. This software efficiently uses algorithms and techniques for the detection of any abnormalities present in the ECG signal. Proper utilization of MATLAB functions (both built-in and user defined) allows ECG signals to be processed and analyzed in real-time applications. The simulation would help in improving the accuracy, and the hardware could be built conveniently.

Keywords: ECG Waveform, Peak Detection, Arrhythmia, Matlab.
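The peak-detection step can be illustrated with a short Python sketch using SciPy (the paper itself works in MATLAB on real acquisitions). The synthetic trace, sampling rate and thresholds below are assumptions for the example only.

```python
import numpy as np
from scipy.signal import find_peaks

# Minimal R-peak detection sketch on a synthetic ECG-like trace
# (illustrative; real data and MATLAB processing are used in the paper).

fs = 250                                  # sampling rate in Hz (assumption)
t = np.arange(0, 10, 1 / fs)
heart_rate = 72 / 60                      # beats per second
# crude synthetic ECG: narrow Gaussian "R waves" on a noisy baseline
beat_times = np.arange(0.5, 10, 1 / heart_rate)
ecg = sum(np.exp(-((t - bt) ** 2) / (2 * 0.01 ** 2)) for bt in beat_times)
ecg += 0.05 * np.random.default_rng(0).standard_normal(t.size)

# detect R peaks: amplitude threshold plus a refractory distance of 0.3 s
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.3 * fs))

rr = np.diff(t[peaks])                    # R-R intervals in seconds
print("detected beats:", len(peaks))
print("mean heart rate: %.1f bpm" % (60 / rr.mean()))
```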

482 Modelling of Multi-Agent Systems for the Scheduling of Multi-EV Charging from Power Limited Sources

Authors: Manan’Iarivo Rasolonjanahary, Chris Bingham, Nigel Schofield, Masoud Bazargan

Abstract:

This paper presents the research and application of model predictive scheduled charging of electric vehicles (EVs) subject to a limited available power resource. To focus on algorithm and operational characteristics, the EV interface to the source is modelled as a battery state equation during the charging operation. The researched methods allow for the priority scheduling of EV charging in a multi-vehicle regime when subject to limited source power availability. Priority attribution for each connected EV is described. The validity of the developed methodology is shown through the simulation of different charging scenarios for multiple connected EVs, including non-scheduled and scheduled operation with various numbers of vehicles. Performance of the developed algorithms is also reported, with recommendations for the choice of suitable parameters.

Keywords: Model predictive control, non-scheduled, power limited sources, scheduled and stop-start battery charging.
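A minimal sketch of priority-based charging under a limited source is given below. The priority rule (energy still needed divided by hours to departure), the fleet and the power limit are illustrative assumptions, not the paper's model-predictive formulation.

```python
from dataclasses import dataclass

# Minimal sketch of priority-based EV charging under a power-limited source
# (priority rule and parameters are illustrative assumptions).

@dataclass
class EV:
    name: str
    soc: float          # state of charge, 0..1
    capacity_kwh: float
    max_rate_kw: float
    departure_h: int    # hours until departure

def priority(ev: EV, hour: int) -> float:
    # Less energy on board and less time left -> higher priority.
    energy_needed = (1.0 - ev.soc) * ev.capacity_kwh
    hours_left = max(ev.departure_h - hour, 1)
    return energy_needed / hours_left

def schedule_hour(evs, available_kw, hour):
    """Allocate the limited power to EVs in priority order for one hour."""
    for ev in sorted(evs, key=lambda e: priority(e, hour), reverse=True):
        if available_kw <= 0 or ev.soc >= 1.0:
            continue
        rate = min(ev.max_rate_kw, available_kw,
                   (1.0 - ev.soc) * ev.capacity_kwh)   # kW over one hour = kWh
        ev.soc += rate / ev.capacity_kwh
        available_kw -= rate

fleet = [EV("EV1", 0.2, 40, 7, departure_h=4),
         EV("EV2", 0.6, 40, 7, departure_h=8),
         EV("EV3", 0.1, 60, 11, departure_h=6)]
for hour in range(6):
    schedule_hour(fleet, available_kw=10.0, hour=hour)
print([(ev.name, round(ev.soc, 2)) for ev in fleet])
```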

481 Performance Analysis of a Series of Adaptive Filters in Non-Stationary Environment for Noise Cancelling Setup

Authors: Anam Rafique, Syed Sohail Ahmed

Abstract:

Noise cancellation is an essential component of many DSP applications. Changes in real-time signals are quite rapid and swift. In noise cancellation, a reference signal, which is an approximation of the noise signal (that corrupts the original information signal), is obtained and then subtracted from the noise-bearing signal to obtain a noise-free signal. This approximation of the noise signal is obtained through adaptive filters, which are self-adjusting. As the changes in real-time signals are abrupt, this requires an adaptive algorithm that converges fast and is stable. Least mean square (LMS) and normalized LMS (NLMS) are two widely used algorithms because of their simplicity of calculation and implementation, but their convergence rates are low. Adaptive averaging filters (AFA) are also used because they converge quickly, but they are less stable. This paper provides a comparative study of LMS, normalized LMS (NLMS), AFA and the new enhanced average adaptive (Average NLMS, ANLMS) filters for a noise cancelling application using speech signals.

Keywords: AFA, ANLMS, LMS, NLMS.
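The LMS and NLMS updates compared in the paper can be sketched in a few lines. The signals, filter length and step sizes below are assumptions; the example only shows the standard weight-update recursions in a noise-cancelling arrangement.

```python
import numpy as np

# Minimal LMS vs. NLMS noise-cancelling sketch (illustrative signals and step sizes).
# d = information signal + noise filtered through an unknown path; x = noise reference.

rng = np.random.default_rng(0)
n = 5000
speech = np.sin(2 * np.pi * 0.01 * np.arange(n))            # stand-in for speech
x = rng.standard_normal(n)                                  # noise reference
path = np.array([0.6, -0.3, 0.1])                           # unknown acoustic path
noise = np.convolve(x, path, mode="full")[:n]
d = speech + noise                                          # primary input

def cancel(d, x, taps=8, mu=0.05, normalized=False):
    w = np.zeros(taps)
    out = np.zeros_like(d)
    for i in range(taps, len(d)):
        u = x[i - taps:i][::-1]                 # most recent reference samples
        y = w @ u                               # noise estimate
        e = d[i] - y                            # error = cleaned signal sample
        step = mu / (u @ u + 1e-8) if normalized else mu
        w += step * e * u                       # (N)LMS weight update
        out[i] = e
    return out

for name, norm in (("LMS", False), ("NLMS", True)):
    e = cancel(d, x, normalized=norm)
    resid = np.mean((e[1000:] - speech[1000:]) ** 2)
    print(f"{name}: residual noise power {resid:.4f}")
```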

480 Application of Transportation Linear Programming Algorithms to Cost Reduction in Nigeria Soft Drinks Industry

Authors: A. O. Salami

Abstract:

Transportation problems are primarily concerned with the optimal way in which products produced at different plants (supply origins) are transported to a number of warehouses or customers (demand destinations). The objective in a transportation problem is to fully satisfy the destination requirements within the operating production capacity constraints at the minimum possible cost. The objective of this study is to determine ways of minimizing transportation cost in order to maximize profit. Data were sourced from the records of the Distribution Department of 7-Up Bottling Company Plc., Ilorin, Kwara State, Nigeria. The data were computed and analyzed using three methods of solving the transportation problem. The results show that the three methods produced the same total transportation cost of N1,358,019, implying that any of the methods can be adopted by the company in transporting its final products to the wholesale dealers in order to minimize total transportation cost.

Keywords: Allocation problem, Cost Minimization, Distribution system, Resources utilization.
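For readers who want to reproduce this kind of calculation, the sketch below solves a small, invented transportation problem as a linear program with SciPy. The cost, supply and demand figures are placeholders, not the 7-Up Bottling Company data, and the classical hand methods used in the study are replaced here by a direct LP solve.

```python
import numpy as np
from scipy.optimize import linprog

# Minimal transportation-problem sketch solved as a linear program
# (toy cost/supply/demand figures, not the company data from the paper).

cost = np.array([[4, 8, 8],        # unit shipping costs: plants x warehouses
                 [16, 24, 16]])
supply = np.array([76, 82])        # plant capacities
demand = np.array([72, 41, 45])    # warehouse requirements (balanced: sums to 158)

m, n = cost.shape
c = cost.ravel()
# supply constraints: total shipped from plant i must not exceed supply[i]
A_ub = np.zeros((m, m * n))
for i in range(m):
    A_ub[i, i * n:(i + 1) * n] = 1
# demand constraints: total received at warehouse j must equal demand[j]
A_eq = np.zeros((n, m * n))
for j in range(n):
    A_eq[j, j::n] = 1

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print("minimum total cost:", res.fun)
print("shipment plan:\n", res.x.reshape(m, n).round(1))
```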

479 Finite Element Modeling of Heat and Moisture Transfer in Porous Material

Authors: V. D. Thi, M. Li, M. Khelifa, M. El Ganaoui, Y. Rogaume

Abstract:

This paper presents a two-dimensional model to study heat and moisture transfer through porous building materials. Dynamic and static coupled models of heat and moisture transfer in porous material under low temperature are presented, and the coupled models, together with variable initial and boundary conditions, have been considered both analytically and using the finite element method. The resulting coupled model is converted to two nonlinear partial differential equations, which are then numerically solved by an implicit iterative scheme. The numerical results of temperature and moisture potential changes are compared with experimental measurements available in the literature. The predicted results validate the theoretical model and demonstrate the effectiveness of the developed numerical algorithms. The model is expected to provide useful information for porous building material design based on heat and moisture transfer modelling.

Keywords: Finite element method, heat transfer, moisture transfer, porous materials, wood.
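The kind of implicit scheme mentioned above can be illustrated, in heavily simplified form, by a one-dimensional backward-Euler diffusion solve. This finite-difference sketch is only a stand-in for the paper's two-dimensional coupled heat/moisture finite element model; the material constants and boundary values are assumptions.

```python
import numpy as np

# Minimal implicit (backward Euler) diffusion sketch in 1D with finite differences.
# It only illustrates the type of implicit scheme mentioned in the abstract;
# the paper's model is a 2D coupled heat/moisture finite element formulation.

L, nx, nt = 0.1, 51, 200          # slab thickness [m], grid points, time steps
alpha, dt = 1e-7, 60.0            # thermal diffusivity [m^2/s], time step [s]
dx = L / (nx - 1)
r = alpha * dt / dx ** 2

T = np.full(nx, 20.0)             # initial temperature [degC]
T_left, T_right = 5.0, 20.0       # boundary temperatures

# Assemble the constant tridiagonal system (I + r*A) T^{n+1} = T^n
A = np.zeros((nx, nx))
for i in range(1, nx - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1 + 2 * r, -r
A[0, 0] = A[-1, -1] = 1.0         # Dirichlet boundary rows

for _ in range(nt):
    b = T.copy()
    b[0], b[-1] = T_left, T_right
    T = np.linalg.solve(A, b)     # implicit step: unconditionally stable

print("temperature profile after %.1f h:" % (nt * dt / 3600))
print(T[::10].round(2))
```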

478 A Meta-Heuristic Algorithm for Set Covering Problem Based on Gravity

Authors: S. Raja Balachandar, K. Kannan

Abstract:

A new meta-heuristic approach called the Randomized Gravitational Emulation Search algorithm (RGES) has been designed for solving large set covering problems. This algorithm is founded on introducing a randomization concept along with two of the four primary physical parameters, velocity and gravity. A new heuristic operator is introduced in the domain of RGES to maintain feasibility specifically for the set covering problem and to yield the best solutions. The performance of this algorithm has been evaluated on a large set of benchmark problems from the OR-Library. Computational results show that the randomized gravitational emulation search-based heuristic is capable of producing high-quality solutions. The performance of this heuristic, when compared with other existing heuristic algorithms, is found to be excellent in terms of solution quality.

Keywords: Set covering problem, velocity, gravitational force, Newton's law, meta heuristic, combinatorial optimization.
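For context, a plain greedy set-cover heuristic on a toy instance is sketched below. This is explicitly not the randomized gravitational emulation search described in the paper; it is the standard cost-effectiveness greedy rule, shown only to make the problem being solved concrete.

```python
# Greedy set-cover baseline (plainly a greedy heuristic, not the paper's RGES).

def greedy_set_cover(universe, subsets, costs):
    """Pick subsets until every element of `universe` is covered."""
    uncovered = set(universe)
    chosen, total = [], 0.0
    while uncovered:
        # best cost-effectiveness: cost per newly covered element
        idx = min((i for i in range(len(subsets)) if subsets[i] & uncovered),
                  key=lambda i: costs[i] / len(subsets[i] & uncovered))
        uncovered -= subsets[idx]
        chosen.append(idx)
        total += costs[idx]
    return chosen, total

universe = range(1, 9)
subsets = [{1, 2, 3}, {2, 4}, {3, 5, 6}, {4, 5, 7, 8}, {6, 7}, {1, 8}]
costs = [3.0, 1.0, 2.5, 4.0, 1.5, 1.0]
picked, cost = greedy_set_cover(universe, subsets, costs)
print("chosen subsets:", picked, "total cost:", cost)
```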

477 Analysis of a Population of Diabetic Patients Databases with Classifiers

Authors: Murat Koklu, Yavuz Unal

Abstract:

Data mining can be described as a technique for extracting information from data. It is the process of obtaining hidden information and then turning it into qualified knowledge by statistical and artificial intelligence techniques. One of its application areas is medicine, where it is used to build decision support systems for diagnosis by deriving meaningful information from medical data. This study presents a decision support system for the diagnosis of illness that makes use of data mining and three different artificial intelligence classifier algorithms, namely Multilayer Perceptron, Naive Bayes and J48. The Pima Indians dataset from the UCI Machine Learning Repository was used. This dataset includes urinary and blood test results of 768 patients, comprising 8 different features. The obtained classification results were compared with previous studies, and suggestions for future work are presented.

Keywords: Artificial Intelligence, Classifiers, Data Mining, Diabetic Patients.
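A minimal sketch of comparing the three named classifier families on the Pima Indians data is given below, using scikit-learn. Fetching the "diabetes" dataset from OpenML is an assumption about data access, and J48 is approximated by a CART decision tree; results will not match the paper's setup exactly.

```python
from sklearn.datasets import fetch_openml
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Compare the three classifier families named in the abstract on the Pima data.
# OpenML's "diabetes" dataset is assumed to be the Pima Indians Diabetes data;
# scikit-learn's CART tree stands in for Weka's J48.

X, y = fetch_openml("diabetes", version=1, as_frame=False, return_X_y=True)

models = {
    "Multilayer Perceptron": make_pipeline(StandardScaler(),
                                           MLPClassifier(max_iter=1000, random_state=0)),
    "Naive Bayes": GaussianNB(),
    "Decision tree (J48-like)": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=10).mean()
    print(f"{name}: {acc:.3f} 10-fold CV accuracy")
```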

476 Cloud Computing Cryptography "State-of-the-Art"

Authors: Omer K. Jasim, Safia Abbas, El-Sayed M. El-Horbaty, Abdel-Badeeh M. Salem

Abstract:

Cloud computing technology is very useful in present day-to-day life; it uses the internet and central remote servers to provide and maintain data as well as applications. Such applications can in turn be used by end users via cloud communications without any installation. Moreover, the end users' data files can be accessed and manipulated from any other computer using internet services. Despite the flexibility of data and application access and usage that cloud computing environments provide, many questions remain about how to obtain a trusted environment that protects data and applications in the cloud from hackers and intruders. This paper surveys the key generation and management mechanisms and the encryption/decryption algorithms used in cloud computing environments, and proposes a new security architecture for the cloud computing environment that addresses the various security gaps as much as possible. A new cryptographic environment that applies quantum mechanics in order to obtain more trusted cloud communications with less computation is also given.

Keywords: Cloud Computing, Cloud Encryption Model, Quantum Key Distribution.

475 Efficient Callus Induction and Plant Regeneration from Mature Embryo Culture of Barley (Hordeum vulgare L.) Genotypes

Authors: Münüre Tanur Erkoyuncu, Mustafa Yorgancılar

Abstract:

Crop improvement through genetic engineering depends on effective and reproducible plant regeneration systems. Immature embryos are the most widely used explant source for in vitro regeneration in barley (Hordeum vulgare L.). However, immature embryos require the continuous growth of donor plants, and the suitable stage for their culture is narrowly limited. Mature embryos, on the other hand, can be procured and stored easily and can be studied throughout the year. In this study, the aim was to develop effective callus induction and plant regeneration from mature embryos of different barley genotypes. The effect of medium (MS1 and MS2), auxin type (2,4-D, dicamba, picloram and 2,4,5-T) and concentration (2, 4, 6 mg/l) on callus formation, and the effect of cytokinin type (TDZ, BAP) and concentration (0.2, 0.5, 1.0 mg/l) on green plant regeneration, were evaluated in mature embryo culture of barley. Callus and shoot formation was successful for all genotypes. Depending on genotype, MS1 was determined to be the best medium and 4 mg/l dicamba the best growth regulator for callus induction, while MS1 was the best medium and 1 mg/l BAP the best growth regulator for shoot formation.

Keywords: Barley, callus, embryo culture, mature embryo.

474 A Hybrid Feature Selection by Resampling, Chi squared and Consistency Evaluation Techniques

Authors: Amir-Massoud Bidgoli, Mehdi Naseri Parsa

Abstract:

In this paper a combined feature selection method is proposed which takes advantage of sample domain filtering, resampling and feature subset evaluation methods to reduce the dimensions of huge datasets and select reliable features. This method utilizes both the feature space and the sample domain to improve the process of feature selection, and uses a combination of Chi-squared and Consistency attribute evaluation methods to seek reliable features. The method consists of two phases. The first phase filters and resamples the sample domain, and the second phase adopts a hybrid procedure to find the optimal feature space by applying Chi-squared, Consistency subset evaluation and genetic search. Experiments on various sized datasets from the UCI Repository of Machine Learning databases show that the performance of five classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best First Decision Tree and JRIP) improves simultaneously and that the classification error for these classifiers decreases considerably. The experiments also show that this method outperforms other feature selection methods.

Keywords: feature selection, resampling, reliable features, Consistency Subset Evaluation.
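The resampling and Chi-squared stages of such a hybrid pipeline can be sketched as follows; the consistency-subset evaluation and genetic search stages are omitted, and the dataset and classifier are stand-ins rather than the paper's experimental setup.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.utils import resample

# Sketch of the resampling + Chi-squared stages only (consistency evaluation and
# genetic search are not reproduced; dataset and classifier are placeholders).

X, y = load_breast_cancer(return_X_y=True)

# sample-domain resampling: draw a stratified bootstrap of the instances
Xr, yr = resample(X, y, n_samples=len(y), stratify=y, random_state=0)

# Chi-squared ranking needs non-negative features; keep the 10 highest-scoring ones
selector = SelectKBest(chi2, k=10).fit(np.abs(Xr), yr)
X_sel = selector.transform(np.abs(X))

base = cross_val_score(GaussianNB(), X, y, cv=5).mean()
sel = cross_val_score(GaussianNB(), X_sel, y, cv=5).mean()
print(f"all {X.shape[1]} features: {base:.3f}   selected 10 features: {sel:.3f}")
```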

473 Incremental Mining of Shocking Association Patterns

Authors: Eiad Yafi, Ahmed Sultan Al-Hegami, M. A. Alam, Ranjit Biswas

Abstract:

Association rule mining is an important problem in data mining. The massively increasing volume of data in real-life databases has motivated researchers to design novel and incremental algorithms for association rule mining. In this paper, we propose an incremental association rule mining algorithm that integrates a shocking interestingness criterion during the process of building the model. A new interestingness measure, called the shocking measure, is introduced. One of the main features of the proposed approach is to capture the user's background knowledge, which is monotonically augmented. The incremental model, which reflects the changing data and the user's beliefs, is attractive in order to make the overall KDD process more effective and efficient. We implemented the proposed approach, experimented with it on some public datasets, and found the results quite promising.

Keywords: Knowledge discovery in databases (KDD), Data mining, Incremental Association rules, Domain knowledge, Interestingness, Shocking rules (SHR).

472 Optimal Route Policy in Air Traffic Control with Competing Airlines

Authors: Siliang Wang, Minghui Wang

Abstract:

This work proposes a novel market-based air traffic flow control model considering competitive airlines in an air traffic network. In the flow model, an agent-based framework for resource (link/time pair) pricing is described. Resource agents and auctioneers for groups of resources are also introduced to simulate flow management in Air Traffic Control (ATC). Secondly, a distributed group pricing algorithm is introduced, which efficiently reflects the competitive nature of the airline industry. Resources in the system are grouped according to their degree of interaction, and each auctioneer adjusts the price of its group of resources until the excess demand for those resources becomes zero as the demand and supply of resources in the system change. Numerical simulation results show the feasibility of solving the air traffic flow control problem using a market mechanism and pricing algorithms on the air traffic network.

Keywords: Air traffic control, Nonlinear programming, Market mechanism, Route policy.
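The price-adjustment loop described above (raise prices while excess demand is positive) can be sketched very simply. The demand model, capacity and airline valuations below are illustrative assumptions, not the paper's agent-based formulation.

```python
# Minimal price-adjustment (tatonnement) sketch for one group of resources:
# the auctioneer raises the price while demand exceeds capacity, stopping once
# excess demand vanishes. All numbers are illustrative assumptions.

def demand(price, willingness):
    """Each airline requests the slot only while the price is below its valuation."""
    return sum(1 for w in willingness if w > price)

def auction(capacity, willingness, step=0.5, max_iter=1000):
    price = 0.0
    for _ in range(max_iter):
        excess = demand(price, willingness) - capacity
        if excess <= 0:
            return price
        price += step * excess        # raise price in proportion to excess demand
    return price

# one link/time slot with capacity 3, requested by 6 airlines with differing valuations
clearing = auction(capacity=3, willingness=[10, 14, 7, 22, 16, 9])
print("clearing price:", clearing)
```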

471 Simultaneous Clustering and Feature Selection Method for Gene Expression Data

Authors: T. Chandrasekhar, K. Thangavel, E. N. Sathishkumar

Abstract:

Microarrays have made it possible to simultaneously monitor the expression profiles of thousands of genes under various experimental conditions. They are used to identify the co-expressed genes in specific cells or tissues that are actively used to make proteins, and to analyze gene expression, an important task in bioinformatics research. Cluster analysis of gene expression data has proved to be a useful tool for identifying co-expressed genes and biologically relevant groupings of genes and samples. In this work, the K-Means algorithm has been applied for clustering gene expression data. Further, the rough set based Quick Reduct algorithm has been applied to each cluster in order to select the most similar genes having high correlation. The ACV measure is then used to evaluate the refined clusters, and classification is used to evaluate the proposed method. Compact clusters could be identified, with the feature selection method used to select the relevant genes.

Keywords: Clustering, Feature selection, Gene expression data, Quick reduct.
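A minimal K-Means sketch on a synthetic genes-by-conditions matrix is shown below; the rough-set Quick Reduct step that the paper applies within each cluster is not reproduced, and the data are simulated.

```python
import numpy as np
from sklearn.cluster import KMeans

# Minimal K-Means clustering sketch on a synthetic expression matrix
# (genes x conditions); the Quick Reduct gene selection step is not reproduced.

rng = np.random.default_rng(0)
# 300 "genes" drawn from three expression patterns over 10 conditions
patterns = rng.standard_normal((3, 10))
labels_true = rng.integers(0, 3, size=300)
X = patterns[labels_true] + 0.3 * rng.standard_normal((300, 10))

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
for k in range(3):
    members = np.where(km.labels_ == k)[0]
    print(f"cluster {k}: {len(members)} genes, "
          f"centroid norm {np.linalg.norm(km.cluster_centers_[k]):.2f}")
```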

470 Tracking Objects in Color Image Sequences: Application to Football Images

Authors: Mourad Moussa, Ali Douik, Hassani Messaoud

Abstract:

In this paper, we present a comparative study between two computer vision systems for object recognition and tracking. These algorithms describe two different approaches based on regions constituted by sets of pixels which parameterize objects in shot sequences. For image segmentation and object detection, the FCM technique is used, and the overlap between cluster distributions is minimized by the use of a suitable color space (other than RGB). The first technique takes into account a priori probabilities governing the computation of the various clusters to track objects. A Parzen kernel method is described that allows identifying the players in each frame; we also show the importance of finding the standard deviation of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.

Keywords: Image segmentation, objects tracking, Parzen window, singular value decomposition, target recognition.

469 Construct Pairwise Test Suites Based on the Bak-Sneppen Model of Biological Evolution

Authors: Jianjun Yuan, Changjun Jiang

Abstract:

Pairwise testing, which requires that every combination of valid values of each pair of system factors be covered by at least one test case, plays an important role in software testing, since many faults are caused by unexpected 2-way interactions among system factors. Although meta-heuristic strategies like simulated annealing can generally discover smaller pairwise test suites, they may require more search time than greedy algorithms. We propose a new method, improved Extremal Optimization (EO) based on the Bak-Sneppen (BS) model of biological evolution, for constructing pairwise test suites, and define a fitness function according to the requirements of improved EO. Experimental results show that improved EO produces pairwise test suites of similar size while yielding an 85% reduction in solution time over SA.

Keywords: Covering Arrays, Extremal Optimization, Simulated Annealing, Software Testing.
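The pairwise-coverage requirement that the fitness function must encode can be sketched as a simple counter of uncovered value pairs; the factors, levels and candidate suite below are toy data, not the paper's benchmarks.

```python
from itertools import combinations, product

# Minimal sketch of a pairwise-coverage count of the kind such fitness functions use:
# count the value pairs of every factor pair left uncovered by a candidate suite.
# Factors, levels and the candidate suite are toy data.

factors = [2, 3, 3, 2]          # number of valid values per system factor

def uncovered_pairs(suite):
    uncovered = 0
    for (i, vi), (j, vj) in combinations(enumerate(factors), 2):
        needed = set(product(range(vi), range(vj)))
        seen = {(t[i], t[j]) for t in suite}
        uncovered += len(needed - seen)
    return uncovered

suite = [(0, 0, 0, 0), (1, 1, 1, 1), (0, 2, 2, 1), (1, 0, 2, 0), (0, 1, 0, 1)]
print("uncovered pairs:", uncovered_pairs(suite))   # 0 means the suite is pairwise-complete
```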

468 Medical Image Edge Detection Based on Neuro-Fuzzy Approach

Authors: J. Mehena, M. C. Adhikary

Abstract:

Edge detection is one of the most important tasks in image processing. Medical image edge detection plays an important role in the segmentation and object recognition of human organs; it refers to the process of identifying and locating sharp discontinuities in medical images. In this paper, a neuro-fuzzy based approach is introduced to detect the edges of noisy medical images. This approach uses a desired number of neuro-fuzzy subdetectors with a postprocessor for detecting the edges of medical images. The internal parameters of the approach are optimized by training patterns built from artificial images. The performance of the approach is evaluated on different medical images and compared with popular edge detection algorithms. From the experimental results, it is clear that this approach performs better than other competing edge detection algorithms on noisy medical images.

Keywords: Edge detection, neuro-fuzzy, image segmentation, artificial image, object recognition.

467 Applying Spanning Tree Graph Theory for Automatic Database Normalization

Authors: Chetneti Srisa-an

Abstract:

In the field of knowledge and data engineering, the relational database is the primary repository for storing real-world data, and it has been used around the world for decades. Normalization is the most important process for the analysis and design of relational databases. It aims at creating a set of relational tables with minimum data redundancy that preserve consistency and facilitate correct insertion, deletion, and modification. Normalization is a major task in the design of relational databases. Despite its importance, very few algorithms have been developed for use in commercial automatic normalization tools, and normalization is rarely done automatically rather than manually. Moreover, for today's large and complex databases, manual normalization is even harder. This paper presents a new, fully automated relational database normalization method. It first produces a directed graph and a spanning tree, and then proceeds to generate the 2NF, 3NF and BCNF normal forms. The benefit of this new algorithm is that it can cope with a large set of complex functional dependencies.

Keywords: Relational Database, Functional Dependency, Automatic Normalization, Primary Key, Spanning tree.

466 A Context-Aware Supplier Selection Model

Authors: Mohammadreza Razzazi, Maryam Bayat

Abstract:

Selection of the best possible set of suppliers has a significant impact on the overall profitability and success of any business. For this reason, it is usually necessary to optimize all business processes and to make use of cost-effective alternatives for additional savings. This paper proposes a new, efficient context-aware supplier selection model that takes into account possible changes of the environment while significantly reducing selection costs. The proposed model is based on data clustering techniques while drawing on certain principles of online algorithms for an optimal selection of suppliers. Unlike common selection models, which re-run the selection algorithm from scratch over the whole environment for every decision-making sub-period, our model considers only the changes and superimposes them on the previously defined best set of suppliers to obtain a new best set of suppliers. Therefore, any recomputation of unchanged elements of the environment is avoided, and selection costs are consequently reduced significantly. A numerical evaluation confirms the applicability of this model and shows that it is a better solution than common static selection models in this field.

Keywords: Supplier Selection, Context-Awareness, Online Algorithms, Data Clustering.

465 Supervisory Fuzzy Learning Control for Underwater Target Tracking

Authors: C.Kia, M.R.Arshad, A.H.Adom, P.A.Wilson

Abstract:

This paper presents recent work on the improvement of a robotic vision-based control strategy for an underwater pipeline tracking system. The study focuses on developing image processing algorithms and a fuzzy inference system for the analysis of the terrain. The main goal is to implement the supervisory fuzzy learning control technique to reduce errors in navigation decisions due to the pipeline occlusion problem. The system developed is capable of interpreting underwater images containing occluded pipeline, seabed and other unwanted noise. The algorithm proposed in previous work does not exploit the cooperation between fuzzy controllers, knowledge and learnt data to improve the outputs for underwater pipeline tracking. Computer simulations and prototype simulations demonstrate the effectiveness of this approach. The accuracy level of the system is also discussed.

Keywords: Fuzzy logic, Underwater target tracking, Autonomous underwater vehicles, Artificial intelligence, Simulations, Robot navigation, Vision system.

464 Segmentation of Gray Scale Images of Dropwise Condensation on Textured Surfaces

Authors: Helene Martin, Solmaz Boroomandi Barati, Jean-Charles Pinoli, Stephane Valette, Yann Gavet

Abstract:

In the present work, we developed an image processing algorithm to measure water droplet characteristics during dropwise condensation on pillared surfaces. The main problem in this process is the similarity in shape and size between the water droplets and the pillars. The developed method divides droplets into four main groups based on their size and applies the corresponding algorithm to segment each group. These algorithms generate binary images of droplets based on both their geometrical and intensity properties. Information on droplet evolution over time, including mean radius and number of drops per unit area, is then extracted from the binary images. The developed image processing algorithm is verified using manual detection and applied to two different sets of images corresponding to two kinds of pillared surfaces.

Keywords: Dropwise condensation, textured surface, image processing, watershed.

463 Spike Sorting Method Using Exponential Autoregressive Modeling of Action Potentials

Authors: Sajjad Farashi

Abstract:

Neurons in the nervous system communicate with each other by producing electrical signals called spikes. To investigate the physiological function of the nervous system, it is essential to study the activity of neurons by detecting and sorting the spikes in the recorded signal. In this paper, a method is proposed for the spike sorting problem based on nonlinear modeling of spikes using an exponential autoregressive model. A genetic algorithm is utilized for model parameter estimation, and selected model coefficients are used as features for sorting purposes. For the optimal selection of model coefficients, a self-organizing feature map is used. The results show that modeling spikes with a nonlinear autoregressive model outperforms its linear counterpart. The features extracted from the coefficients of the exponential autoregressive model are also better than wavelet-based features and give more compact and well-separated clusters. In the case of spikes that differ only in small-scale structures, where principal component analysis fails to produce separated clouds in the feature space, the proposed method can obtain well-separated clusters, which removes the necessity of applying complex classifiers.

Keywords: Exponential autoregressive model, Neural data, spike sorting, time series modeling.

462 Study of Sickle Cell Syndromes in the Population of the Region of Batna

Authors: K .Belhadi, H. Bousselsela, M. Yahia, A. Zidani, S. Benbia

Abstract:

Sickle cell anemia is a recessive genetic disease caused by the presence, in the red blood cell, of an abnormal hemoglobin called hemoglobin S. It results from the replacement of glutamic acid by valine at position 6 of the beta chain. Subjects may be homozygous (SS) or heterozygous (AS), the latter most often asymptomatic. Other mutations result in compound heterozygotes: synthesis of hemoglobin C, a mutation in the sixth codon (heterozygous SC); and ß-thalassemia (heterozygous S-ß thalassemia). Homozygous SS, heterozygous SC and S-ß-thalassemia are grouped under the major sickle cell syndromes. To establish a laboratory diagnosis of hemoglobinopathies in a portion of the population of the region of Batna, our study was conducted on 115 patients with suspected sickle cell anemia. All cases benefited from hematological tests, such as a blood count (RBC count, calculated erythrocyte indices MCV and MCHC, and measurement of the hemoglobin concentration), and a biochemical test, in this case CAPILLARYS hemoglobin electrophoresis. The results showed 27 cases of sickle cell anemia among the 115 suspected cases: 73.03% homozygous sickle cell disease and 59.25% sickle cell trait, while the double heterozygous S/C represents an incidence rate of 3.70%.

Keywords: Hemoglobin, sickle cell syndromes, laboratory diagnosis.

461 Optimum Design of Trusses by Cuckoo Search

Authors: M. Saravanan, J. Raja Murugadoss, V. Jayanthi

Abstract:

The optimal design of structures plays a major role in reducing material usage, which leads to a reduction in the final cost of construction projects. Evolutionary approaches have proved to be successful techniques for solving size and shape structural optimization problems, since they use a stochastic random search instead of a gradient search. A review of the recent literature identified weight optimization as the central problem. A new meta-heuristic algorithm called Cuckoo Search (CS) has been used for the optimization of the total weight of truss structures. This paper uses 10-bar and 25-bar trusses for testing purposes. The main objective of this work is to reduce the number of iterations, the weight and the total time consumption. In order to demonstrate the effectiveness of the present method, minimum weight design of truss structures is performed and the results of CS are compared with other algorithms.

Keywords: Cuckoo search algorithm, Lévy flight, meta-heuristic, optimal weight.
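A minimal cuckoo search sketch with Lévy flights is given below on a simple test function; the sphere function stands in for the truss-weight objective, since the structural analysis model is not reproduced here, and the algorithm constants are assumptions.

```python
import numpy as np

# Minimal cuckoo search sketch with Levy flights on a simple test function
# (the sphere function stands in for the truss-weight objective; all constants
# are illustrative assumptions).

rng = np.random.default_rng(0)

def objective(x):
    return np.sum(x ** 2)

def levy_step(dim, beta=1.5):
    # Mantegna's algorithm for Levy-distributed step lengths
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(n_nests=15, dim=5, pa=0.25, iters=500, alpha=0.01):
    nests = rng.uniform(-5, 5, (n_nests, dim))
    fitness = np.array([objective(x) for x in nests])
    best = nests[fitness.argmin()].copy()
    for _ in range(iters):
        for i in range(n_nests):
            # generate a new solution by a Levy flight around the current nest
            new = nests[i] + alpha * levy_step(dim) * (nests[i] - best)
            if objective(new) < fitness[i]:
                nests[i], fitness[i] = new, objective(new)
        # abandon a fraction pa of the worst nests and rebuild them randomly
        worst = fitness.argsort()[-int(pa * n_nests):]
        nests[worst] = rng.uniform(-5, 5, (len(worst), dim))
        fitness[worst] = [objective(x) for x in nests[worst]]
        best = nests[fitness.argmin()].copy()
    return best, fitness.min()

best, val = cuckoo_search()
print("best objective value:", round(val, 6))
```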

460 Localisation of Anatomical Soft Tissue Landmarks of the Head in CT Images

Authors: M. Ovinis, D. Kerr, K. Bouazza-Marouf, M. Vloeberghs

Abstract:

In this paper, algorithms are described for the automatic localisation, in CT images, of two anatomical soft tissue landmarks of the head: the medial canthus (inner corner of the eye) and the tragus (a small, pointed, cartilaginous flap of the ear). These landmarks are to be used as a basis for an automated image-to-patient registration system we are developing. The landmarks are localised on a surface model extracted from CT images, based on surface curvature and a rule-based system that incorporates prior knowledge of the landmark characteristics. The approach was tested on a dataset of near-isotropic CT images of 95 patients. The position of the automatically localised landmarks was compared to the position of the manually localised landmarks. The average difference was 1.5 mm and 0.8 mm for the medial canthus and tragus, with maximum differences of 4.5 mm and 2.6 mm respectively. The medial canthus and tragus can thus be automatically localised in CT images with performance comparable to manual localisation.

Keywords: Anatomical soft tissue landmarks, automatic localisation, Computed Tomography (CT)
