Search results for: classification problem
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4499


2939 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows

Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid

Abstract:

Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Therefore, developing robust and accurate techniques for reconstructing topography in this class of problems would reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure the bed topography. In practice, experimental work in hydraulics can be very demanding in both time and cost, and computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem is used to identify the bed parameters from given experimental data. In this case, the shallow water equations used for modeling the hydraulics need to be rearranged so that the model parameters can be evaluated from measured data. However, this approach is not always possible and suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme. In the first stage, the forward problem is solved to determine the measurable parameters from known data. In the second stage, an adaptive-control Ensemble Kalman Filter is implemented to assimilate the observation data and obtain an accurate estimate of the topography. The main features of this method are, on one hand, the ability to handle different complex geometries with no need to rearrange the original model into an explicit form and, on the other hand, its strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated using different numbers of loops, sensitivity parameters, initial samples and locations of observations. The obtained results demonstrate the high reliability and accuracy of the proposed technique.
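As a concrete illustration of the second stage, the following is a minimal sketch of a generic stochastic Ensemble Kalman Filter analysis step in Python/NumPy. It assumes a linear observation operator H mapping the topography state to free-surface observations; the adaptive-control loop, the shallow water solver and all dimensions are placeholders, not the paper's implementation.

import numpy as np

def enkf_analysis(ensemble, observations, H, obs_noise_std, rng):
    """One stochastic EnKF analysis step.

    ensemble      : (n_members, n_state) prior samples of the bed topography
    observations  : (n_obs,)             measured free-surface data
    H             : (n_obs, n_state)     linear observation operator (assumed)
    obs_noise_std : observation error standard deviation
    """
    n_members = ensemble.shape[0]
    n_obs = observations.shape[0]

    mean = ensemble.mean(axis=0)
    A = ensemble - mean                       # state anomalies
    HA = A @ H.T                              # anomalies in observation space

    P_hh = HA.T @ HA / (n_members - 1)        # H P H^T
    P_xh = A.T @ HA / (n_members - 1)         # P H^T
    R = (obs_noise_std ** 2) * np.eye(n_obs)
    K = P_xh @ np.linalg.inv(P_hh + R)        # Kalman gain

    # Perturbed observations and ensemble update
    perturbed = observations + rng.normal(0.0, obs_noise_std, size=(n_members, n_obs))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T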

Keywords: Optimal control, ensemble Kalman Filter, topography reconstruction, data assimilation, shallow water equations.

2938 Improved Skin Detection Using Colour Space and Texture

Authors: Medjram Sofiane, Babahenini Mohamed Chaouki, Mohamed Benali Yamina

Abstract:

Skin detection is an important task for computer vision systems; a good skin detection method leads to a good and successful result for the overall system. Colour is a good descriptor for image segmentation and classification, as it allows skin colour to be detected in images. Lighting changes and objects with a colour similar to skin colour make skin detection difficult. In this paper, we propose a method that uses the YCbCr colour space for skin detection and for eliminating lighting effects, and then uses texture information to eliminate the false regions detected by the YCbCr skin model.
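A minimal sketch of the first stage (fixed-range YCbCr thresholding) using OpenCV; the Cb/Cr bounds below are commonly cited default values, not necessarily those of the paper, and the subsequent GLCM texture filtering is not shown.

import cv2
import numpy as np

def skin_mask_ycbcr(bgr_image):
    """Return a binary mask of candidate skin pixels using fixed Cb/Cr ranges.

    The threshold values are commonly cited defaults, not the paper's exact
    ranges; texture filtering would be applied afterwards to prune false regions.
    """
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)  # OpenCV orders Y, Cr, Cb
    lower = np.array([0, 133, 77], dtype=np.uint8)        # Y, Cr, Cb lower bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)     # Y, Cr, Cb upper bounds
    return cv2.inRange(ycrcb, lower, upper)

# Example: mask = skin_mask_ycbcr(cv2.imread("frame.jpg"))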

Keywords: Skin detection, YCbCr, GLCM, Texture, Human skin.

2937 SC-LSH: An Efficient Indexing Method for Approximate Similarity Search in High Dimensional Space

Authors: Sanaa Chafik, Imane Daoudi, Mounim A. El Yacoubi, Hamid El Ouardi

Abstract:

Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high dimensional space. Euclidean LSH is the most popular variation of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH has limitations that affect structure and query performance; its main limitation is large memory consumption, since a large number of hash tables is required to achieve good accuracy. In this paper, we propose a new hashing algorithm to overcome the storage space problem and improve query time, while keeping accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than Euclidean LSH.
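For reference, a minimal sketch of one hash table of the standard Euclidean (p-stable) LSH scheme that this work builds on; the parameters k and w are illustrative, and the paper's memory-saving modifications are not reproduced.

import numpy as np

class EuclideanLSHTable:
    """One hash table of the standard Euclidean (p-stable) LSH scheme.

    Each of the k hash functions is h(v) = floor((a.v + b) / w), with a drawn
    from a Gaussian and b uniform in [0, w); concatenating the k values gives
    the bucket key.
    """

    def __init__(self, dim, k=8, w=4.0, seed=0):
        rng = np.random.default_rng(seed)
        self.a = rng.normal(size=(k, dim))
        self.b = rng.uniform(0.0, w, size=k)
        self.w = w
        self.buckets = {}

    def key(self, v):
        return tuple(np.floor((self.a @ v + self.b) / self.w).astype(int))

    def insert(self, idx, v):
        self.buckets.setdefault(self.key(v), []).append(idx)

    def query(self, v):
        # Candidate neighbours share the same bucket key.
        return self.buckets.get(self.key(v), [])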

Keywords: Approximate Nearest Neighbor Search, Content based image retrieval (CBIR), Curse of dimensionality, Locality sensitive hashing, Multidimensional indexing, Scalability.

2936 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Intelligent transportation systems are essential for building smarter cities. Machine-learning-based transportation prediction is a promising approach because it makes otherwise invisible aspects of the system visible. In this context, this research aims to build a prototype model that predicts the transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic traffic conditions, so bus delays occur frequently. To overcome this problem, a prediction model is presented to find patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built from real-time bus data gathered through public data portals and real-time Application Program Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations together with traffic environment factors (road speed, station conditions, weather, and real-time bus operation information). The prototype model is designed with the machine learning tool RapidMiner Studio, and tests were conducted for bus delay prediction. This research presents experiments to increase prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on the analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
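The paper builds its model in RapidMiner Studio; as a rough, hedged equivalent, the sketch below trains a regression model on hypothetical traffic, weather and bus-status columns with scikit-learn. All column names and the file bus_records.csv are placeholders, not the study's data.

import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical columns standing in for the traffic, weather and bus-status feeds.
df = pd.read_csv("bus_records.csv")
features = ["road_speed_kmh", "station_dwell_s", "rain_mm", "temperature_c",
            "hour_of_day", "scheduled_headway_s"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["delay_s"], test_size=0.2, random_state=42)

# Train a regressor to predict the delay (in seconds) of each bus run.
model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("MAE (s):", mean_absolute_error(y_test, model.predict(X_test)))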

Keywords: Big data, bus headway prediction, machine learning, public transportation.

2935 Reduction of Multiple User Interference for Optical CDMA Systems Using Successive Interference Cancellation Scheme

Authors: Tawfig Eltaif, Hesham A. Bakarman, N. Alsowaidi, M. R. Mokhtar, Malek Harbawi

Abstract:

Multiple User Interference (MUI) is considered the primary problem in Optical Code-Division Multiple Access (OCDMA); it results from the overlapping among users. In this article, we aim to mitigate this problem by studying a successive interference cancellation (SIC) scheme. This scheme is tested on two different detection schemes, spectral amplitude coding (SAC) and direct detection (DS), using partial modified prime (PMP) codes as the signature codes. It was found that the SIC scheme based on both the SAC and DS methods has the potential to suppress the intensity noise, that is, to mitigate MUI noise. Furthermore, the SIC/DS scheme showed a much lower bit error rate (BER) than the SIC/SAC scheme for different magnitudes of effective power. Hence, many more users can be supported by the SIC/DS receiver system.

Keywords: Multiple User Interference (MUI), Optical Code-Division Multiple Access (OCDMA), Partial Modified Prime Code (PMP), Spectral Amplitude Coding (SAC), Successive Interference Cancellation (SIC).

2934 Developing New Processes and Optimizing Performance Using Response Surface Methodology

Authors: S. Raissi

Abstract:

Response surface methodology (RSM) is a very efficient tool that provides practical insight into developing new processes and optimizing them. This methodology helps engineers build a mathematical model that represents the behavior of a system as a function of the process parameters. In this paper, the sequential nature of RSM is surveyed for process engineers, and its relationship to design of experiments (DOE), regression analysis and robust design is reviewed. The proposed four-step procedure, carried out in two phases, can help the system analyst resolve the parameter design problem involving responses. To check the accuracy of the designed model, residual analysis and the prediction error sum of squares (PRESS) are described. It is believed that the proposed procedure can resolve a complex parameter design problem with one or more responses. It can be applied in areas where there are large data sets and a number of responses are to be optimized simultaneously. In addition, the proposed procedure is relatively simple and can be implemented easily using ready-made standard statistical packages.
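A minimal sketch of the model-fitting step: a full second-order response surface fitted by least squares on coded factor settings, with PRESS computed by leave-one-out. The two-factor central composite layout and the synthetic response are illustrative, not the paper's case study.

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Coded factor settings of a two-factor central composite design (illustrative),
# with a synthetic response standing in for measured data.
x = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0],
              [1.414, 0], [-1.414, 0], [0, 1.414], [0, -1.414]])
rng = np.random.default_rng(0)
y = (60 + 3 * x[:, 0] + 1.5 * x[:, 1] - 2 * x[:, 0] ** 2 + x[:, 0] * x[:, 1]
     + rng.normal(0, 0.5, len(x)))

# Full second-order (quadratic) response surface: linear, interaction, squared terms.
quad = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(quad.fit_transform(x), y)

# PRESS: leave-one-out prediction error sum of squares as a model adequacy check.
press = 0.0
for i in range(len(y)):
    keep = np.arange(len(y)) != i
    fold = LinearRegression().fit(quad.fit_transform(x[keep]), y[keep])
    press += (y[i] - fold.predict(quad.transform(x[i:i + 1]))[0]) ** 2
print("R^2:", model.score(quad.transform(x), y), "PRESS:", press)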

Keywords: Response Surface Methodology (RSM), Design of Experiments (DOE), Process modeling, Process setting, Process optimization.

2933 Satellite Rainfall Prediction Techniques - A State of the Art Review

Authors: S. Sarumathi, N. Shanthi, S. Vidhya

Abstract:

Predicting rainfall is an essential and also a challenging task. Climate and rainfall are non-linear and intricate phenomena, and accurate rainfall prediction requires advanced computer modeling and simulation. An enhanced understanding of the spatial and temporal distribution of precipitation enriches applications such as hydrologic, climatic and ecological studies. Conversely, the absence of consistent precipitation observations in remote and emerging regions remains a challenge for the community. This survey paper provides a multifarious collection of methodologies proposed by various researchers for predicting rainfall. It also gives information about rainfall forecasting techniques across numerical, traditional and statistical methods.

Keywords: Satellite Image, Segmentation, Feature Extraction, Classification, Clustering, Precipitation Estimation.

2932 A Dynamic Decision Model for Vertical Handoffs across Heterogeneous Wireless Networks

Authors: Pramod Goyal, S. K. Saxena

Abstract:

The convergence of heterogeneous wireless access technologies characterizes 4G wireless networks. In such converged systems, seamless and efficient handoff between different access technologies (vertical handoff) is essential and remains a challenging problem. The heterogeneous co-existence of access technologies with largely different characteristics creates the decision problem of determining the "best" available network at the "best" time in order to reduce unnecessary handoffs. This paper proposes a dynamic decision model for deciding the "best" network at the "best" moment to hand off. The proposed model makes the right vertical handoff decisions by determining the "best" network among the available networks based on dynamic factors, such as the Received Signal Strength (RSS) of the network and the velocity of the mobile station, together with static factors such as usage expense, link capacity (offered bandwidth) and power consumption. This model not only meets individual user needs but also improves overall system performance by reducing unnecessary handoffs.
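A minimal sketch of a weighted scoring rule of this kind; the attribute names, weights and hysteresis margin are illustrative assumptions, not the paper's exact decision model (which also factors in mobile-station velocity).

def handoff_score(network, weights):
    """Composite score for one candidate network; larger is better.

    'network' is a dict of normalized attribute values in [0, 1]; cost and
    power are subtracted because lower values are preferred.
    """
    return (weights["rss"] * network["rss"]
            + weights["bandwidth"] * network["bandwidth"]
            - weights["cost"] * network["cost"]
            - weights["power"] * network["power"])

def choose_network(candidates, weights, current, hysteresis=0.05):
    """Pick the best-scoring network, switching only if it beats the current
    one by a hysteresis margin, to avoid unnecessary (ping-pong) handoffs."""
    best = max(candidates, key=lambda name: handoff_score(candidates[name], weights))
    if best != current and (handoff_score(candidates[best], weights)
                            > handoff_score(candidates[current], weights) + hysteresis):
        return best
    return current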

Keywords: Dynamic decision model, Seamless handoff, Vertical handoff, Wireless networks

2931 Comprehensive Analysis of Data Mining Tools

Authors: S. Sarumathi, N. Shanthi

Abstract:

Due to fast technological innovation, a tremendous amount of data is being generated all over the world in every domain, such as pattern recognition, machine learning, spatial data mining, image analysis, fraud analysis, the World Wide Web, etc. This makes it essential to develop tools for data mining functionalities. The major aim of this paper is to analyze various tools that are used to build resourceful analytical or descriptive models for handling large amounts of information more efficiently and in a user-friendly way. In this survey, the diverse tools are illustrated along with their technical paradigms, graphical interfaces and built-in algorithms, which make them very useful for handling significant amounts of data.

Keywords: Classification, Clustering, Data Mining, Machine learning, Visualization.

2930 DACS3: Embedding Individual Ant Behavior in Ant Colony System

Authors: Zulaiha Ali Othman, Helmi Md Rais, Abdul Razak Hamdan

Abstract:

Ants are fascinating creatures that demonstrate the ability to find food and bring it back to their nest. Their ability, as a colony, to find paths to food sources has inspired the development of algorithms known as Ant Colony Systems (ACS). The principle of cooperation forms the backbone of such algorithms, commonly used to find solutions to problems such as the Traveling Salesman Problem (TSP). Ants communicate with each other through chemical substances called pheromones. Modeling individual ants' ability to manipulate this substance can help an ACS find the best solution. This paper introduces a Dynamic Ant Colony System with three-level updates (DACS3) that enhances an existing ACS. Experiments were conducted to observe single-ant behavior in a colony of Malaysian house red ants, and this behavior was incorporated into the DACS3 algorithm. We benchmark the performance of DACS3 against DACS on TSP instances ranging from 14 to 100 cities. The results show that the DACS3 algorithm achieves shorter distances in most cases and also performs considerably faster than DACS.
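For context, a compact baseline Ant Colony System for the TSP (pseudo-random-proportional tour construction plus a global pheromone update on the best tour); the three-level update of DACS3 is not reproduced, and all parameter values are illustrative.

import numpy as np

def ant_colony_system_tsp(dist, n_ants=20, n_iters=200, alpha=1.0, beta=3.0,
                          rho=0.1, q0=0.9, seed=0):
    """Baseline Ant Colony System for the TSP (not the DACS3 variant)."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    eta = 1.0 / (dist + np.eye(n))            # heuristic visibility, avoid /0
    tau = np.full((n, n), 1.0 / (n * dist.mean()))
    best_tour, best_len = None, np.inf

    for _ in range(n_iters):
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i = tour[-1]
                cand = np.array(sorted(unvisited))
                attract = tau[i, cand] ** alpha * eta[i, cand] ** beta
                if rng.random() < q0:          # exploit the most attractive edge
                    j = cand[np.argmax(attract)]
                else:                          # explore proportionally
                    j = rng.choice(cand, p=attract / attract.sum())
                tour.append(int(j))
                unvisited.remove(int(j))
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            if length < best_len:
                best_tour, best_len = tour, length
        # Global update: evaporate everywhere, deposit on the best tour so far.
        tau *= (1.0 - rho)
        for k in range(n):
            a, b = best_tour[k], best_tour[(k + 1) % n]
            tau[a, b] += rho / best_len
            tau[b, a] = tau[a, b]
    return best_tour, best_len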

Keywords: Dynamic Ant Colony System (DACS), Traveling Salesman Problem (TSP), Optimization, Swarm Intelligence.

2929 A Genetic Algorithm Based Permutation and Non-Permutation Scheduling Heuristics for Finite Capacity Material Requirement Planning Problem

Authors: Watchara Songserm, Teeradej Wuttipornpun

Abstract:

This paper presents genetic algorithm based permutation and non-permutation scheduling heuristics (GAPNP) to solve a multi-stage finite capacity material requirement planning (FCMRP) problem in an automotive assembly flow shop with unrelated parallel machines. In the algorithm, the sequences of orders are iteratively improved by the GA, whereas the required operations are scheduled based on the presented permutation and non-permutation heuristics. Finally, linear programming is applied to minimize the total cost. The presented GAPNP algorithm is evaluated using real datasets from automotive companies. The required parameters for GAPNP are carefully tuned to obtain a common parameter setting for all case studies. The results show that GAPNP significantly outperforms the benchmark algorithm by about 30% on average.

Keywords: Finite capacity MRP, genetic algorithm, linear programming, flow shop, unrelated parallel machines, application in industries.

2928 Fault Zone Detection on Advanced Series Compensated Transmission Line using Discrete Wavelet Transform and SVM

Authors: Renju Gangadharan, G. N. Pillai, Indra Gupta

Abstract:

In this paper, a novel method for finding the fault zone on a Thyristor Controlled Series Capacitor (TCSC) compensated transmission line is presented. The method makes use of a Support Vector Machine (SVM), used in classification mode to distinguish between the zones before and after the TCSC. The Discrete Wavelet Transform is used to prepare the features that are given as input to the SVM. The method was tested on a 400 kV, 50 Hz, 300 km transmission line and the results were highly accurate.
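A minimal sketch of the feature-extraction-plus-classification pipeline using PyWavelets and scikit-learn; the wavelet family, decomposition level and SVM settings are illustrative assumptions, and the simulated fault records are not provided here.

import numpy as np
import pywt
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def dwt_features(signal, wavelet="db4", level=4):
    """Energy of each wavelet sub-band of a measured window, used as the SVM
    input feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

def train_zone_classifier(X_raw, y):
    """X_raw: (n_samples, n_points) fault-signal windows; y: 0 = fault before
    the TCSC, 1 = fault after it (labels assumed to come from simulations)."""
    X = np.array([dwt_features(row) for row in X_raw])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    return clf.fit(X, y)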

Keywords: Flexible AC transmission system (FACTS), thyristor-controlled series capacitor (TCSC), discrete wavelet transform (DWT), support vector machine (SVM).

2927 Application of Load Transfer Technique for Distribution Power Flow Analysis

Authors: Udomsak Thongkrajay, Padej Pao-La-Or, Thanatchai Kulworawanichpong

Abstract:

Installation of power compensation equipment in some cases places additional buses into the system. The total number of power flow equations and voltage unknowns therefore increases with the additional locations of installed devices. In this circumstance, the power flow calculation is more complicated and may suffer from computational convergence problems. This paper presents a power flow calculation using the Newton-Raphson iterative method together with the proposed load transfer technique. The concept is to eliminate the additional buses by transferring the loads installed at the new buses to the two existing adjacent buses, so that the total number of power flow equations is unchanged. The overall computation time is expected to be shorter than that of solving the problem without the load transfer technique. A 15-bus test system is employed to evaluate the effectiveness of the proposed load transfer technique. As a result, the total number of iterations required and the execution time are significantly reduced.
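A minimal sketch of one plausible reading of the load transfer idea: the load at a new bus is split between its two existing neighbouring buses in inverse proportion to line distance, so the extra bus drops out of the Newton-Raphson formulation. The exact allocation rule used in the paper may differ.

def transfer_load(p_new, q_new, distance_to_i, distance_to_j):
    """Split the (P, Q) load installed at a new bus between the two existing
    adjacent buses i and j, in inverse proportion to line distance."""
    total = distance_to_i + distance_to_j
    share_i = distance_to_j / total      # the closer bus takes the larger share
    share_j = distance_to_i / total
    return (p_new * share_i, q_new * share_i), (p_new * share_j, q_new * share_j)

# Example: (p_i, q_i), (p_j, q_j) = transfer_load(2.5, 1.2, 0.3, 0.7)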

Keywords: Load transfer technique, Newton-Raphson power flow, ill-condition

2926 Optimization Approaches for a Complex Dairy Farm Simulation Model

Authors: Jagannath Aryal, Don Kulasiri, Dishi Liu

Abstract:

This paper describes the optimization of a complex dairy farm simulation model using two quite different optimization methods, the genetic algorithm (GA) and the Lipschitz Branch-and-Bound (LBB) algorithm. These techniques have been used to improve an agricultural system model developed by Dexcel Limited, New Zealand, which provides a detailed representation of pastoral dairying scenarios and contains an 8-dimensional parameter space. The model incorporates sub-models of pasture growth and animal metabolism, which are themselves complex in many cases. Each evaluation of the objective function, a composite 'Farm Performance Index (FPI)', requires simulation of at least a one-year period of farm operation with a daily time-step, and is therefore computationally expensive. The problem of visualizing the objective function (response surface) in high-dimensional spaces is also considered in the context of the farm optimization problem. Adaptations of the Sammon mapping and parallel coordinates visualization are described, which help visualize some important properties of the model's output topography. From this study, it is found that the GA requires fewer function evaluations in optimization than the LBB algorithm.

Keywords: Genetic Algorithm, Linux Cluster, Lipschitz Branch-and-Bound, Optimization

2925 An Advanced Method for Speech Recognition

Authors: Meysam Mohamad pour, Fardad Farokhi

Abstract:

In this paper, considering the deficiencies of the available speech recognition techniques, an advanced method is presented that is able to classify speech signals with high accuracy (98%) in minimal time. In the presented method, the recorded signal is first preprocessed; this stage includes denoising with Mel-frequency cepstral analysis and feature extraction using discrete wavelet transform (DWT) coefficients. These features are then fed to a Multilayer Perceptron (MLP) network for classification. Finally, after training of the neural network, the effective features are selected with the UTA algorithm.

Keywords: Multilayer perceptron (MLP) neural network, Discrete Wavelet Transform (DWT), Mel-scale frequency filter, UTA algorithm.

2924 Use of Hierarchical Temporal Memory Algorithm in Heart Attack Detection

Authors: Tesnim Charrad, Kaouther Nouira, Ahmed Ferchichi

Abstract:

In order to reduce the number of deaths due to heart problems, we propose the use of the Hierarchical Temporal Memory (HTM) algorithm, a real-time anomaly detection algorithm. HTM is a cortical learning algorithm modeled on the neocortex and used for anomaly detection; in other words, it is based on a conceptual theory of how the human brain may work. It is powerful in predicting unusual patterns, anomaly detection and classification. In this paper, HTM has been implemented and tested on ECG datasets in order to detect cardiac anomalies. Experiments showed good performance in terms of specificity, sensitivity and execution time.

Keywords: HTM, Real time anomaly detection, ECG, Cardiac Anomalies.

2923 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information

Authors: A. Preetha Priyadharshini, S. B. M. Priya

Abstract:

In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is to keep each user's rate outage probability below a given threshold level; such rate outage constraints present significant analytical challenges. Many probabilistic methods are used to solve the transmit optimization problem under imperfect CSI. Here, a decomposition-based large deviation inequality and a Bernstein-type inequality convex restriction method are used to solve the optimization problem under imperfect CSI. These methods are used to achieve improved output quality and lower complexity, and they provide a safe, tractable approximation of the original rate outage constraints. Based on these implementations, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.

Keywords: Imperfect channel state information, outage probability, multiuser multi-input single-output.

2922 Trajectory Estimation and Control of Vehicle using Neuro-Fuzzy Technique

Authors: B. Selma, S. Chouraqui

Abstract:

Nonlinear system identification is becoming an important tool that can be used to improve control performance. This paper describes the application of an adaptive neuro-fuzzy inference system (ANFIS) model to controlling a car. The vehicle must follow a predefined path by supervised learning. The backpropagation gradient descent method was used to train the ANFIS. The performance of the ANFIS model was evaluated in terms of training performance and classification accuracy, and the results confirmed that the proposed ANFIS model has potential in controlling the nonlinear system.

Keywords: Adaptive neuro-fuzzy inference system (ANFIS), Fuzzy logic, neural network, nonlinear system, control

2921 Advanced Jet Trainer and Light Attack Aircraft Selection Using Composite Programming in Multiple Criteria Decision Making Analysis Method

Authors: C. Ardil

Abstract:

In this paper, composite programming is discussed for the aircraft evaluation and selection problem using a multiple criteria decision analysis method. The decision criteria and aircraft alternatives were identified from a literature review, and the criteria weights were determined by the standard deviation method. The proposed model is applied to a practical decision problem of evaluating and selecting an advanced jet trainer and light attack aircraft. The proposed technique gives robust and efficient results in modeling multiple criteria decisions. As a result of the composite programming analysis, Hürjet, an advanced jet trainer and light attack aircraft alternative (a3), was chosen as the most suitable aircraft candidate.
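A minimal numerical sketch of standard-deviation weighting combined with a distance-to-ideal (Lp) composite score; the aggregation form, criteria and the small decision matrix are illustrative assumptions, not the study's actual data or exact composite programming model.

import numpy as np

def std_weights(matrix):
    """Standard deviation weighting: criteria with more spread get more weight."""
    s = matrix.std(axis=0, ddof=1)
    return s / s.sum()

def composite_scores(matrix, benefit, p=2.0):
    """Distance-to-ideal composite score per alternative (lower is better).

    matrix  : (n_alternatives, n_criteria) raw decision matrix
    benefit : boolean array, True where larger values are preferred
    p       : balancing exponent of the composite (Lp) distance
    """
    lo, hi = matrix.min(axis=0), matrix.max(axis=0)
    norm = np.where(benefit, (matrix - lo) / (hi - lo), (hi - matrix) / (hi - lo))
    w = std_weights(norm)
    return np.sum((w * (1.0 - norm)) ** p, axis=1) ** (1.0 / p)

# Hypothetical 3 aircraft x 4 criteria example (the last criterion is a cost).
m = np.array([[0.80, 7.5, 420.0, 30.0],
              [0.90, 8.0, 450.0, 35.0],
              [0.95, 8.3, 480.0, 33.0]])
print(composite_scores(m, benefit=np.array([True, True, True, False])))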

Keywords: composite programming, additive weighted model, multiplicative weighted model, multiple criteria decision making analysis, MCDMA, aircraft selection, advanced jet trainer and light attack aircraft, M-346, FA-50, Hürjet

2920 Solving Weighted Number of Operation Plus Processing Time Due-Date Assignment, Weighted Scheduling and Process Planning Integration Problem Using Genetic and Simulated Annealing Search Methods

Authors: Halil Ibrahim Demir, Caner Erden, Mumtaz Ipek, Ozer Uygun

Abstract:

Traditionally, the three important manufacturing functions of process planning, scheduling and due-date assignment are performed separately and sequentially. For a couple of decades, hundreds of studies have been carried out on integrated process planning and scheduling problems, and numerous studies have addressed scheduling with due-date assignment, but unfortunately the integration of these three important functions has not been adequately addressed. Here, the integration of these three functions is studied using genetic, random-genetic hybrid, simulated annealing, random-simulated annealing hybrid and random search techniques. In addition, the importance of integrating these three functions and the power of meta-heuristics and hybrid heuristics are studied.

Keywords: Process planning, weighted scheduling, weighted due-date assignment, genetic search, simulated annealing, hybrid meta-heuristics.

2919 Classification of Fuzzy Petri Nets and Their Applications

Authors: M. H. Aziz, Erik L. J. Bohez, Manukid Parnichkun, Chanchal Saha

Abstract:

The Petri Net (PN) has proven to be an effective graphical, mathematical, simulation and control tool for Discrete Event Systems (DES). However, with the growth in the complexity of modern industrial and communication systems, PNs have been found inadequate for addressing the problems of uncertainty and imprecision in data. This gave rise to the amalgamation of fuzzy logic with Petri nets, and a new tool emerged under the name of Fuzzy Petri Nets (FPN). Although a lot of research has been done on FPNs and a number of their applications have been proposed, their basic types and structure are still ambiguous. Therefore, in this research, an effort is made to categorize FPNs according to their structure and algorithms. Furthermore, a literature review of the applications of FPNs in light of this classification has been carried out.

Keywords: Discrete event systems, Fuzzy logic, Fuzzy Petri nets, and Petri nets.

2918 Application of Granular Computing Paradigm in Knowledge Induction

Authors: Iftikhar U. Sikder

Abstract:

This paper illustrates an application of a granular computing approach, namely rough set theory, in data mining. The paper outlines the formalism of granular computing and elucidates the mathematical underpinning of rough set theory, which has been widely used by the data mining and machine learning communities. A real-world application is illustrated, and the classification performance is compared with other contending machine learning algorithms. The predictive performance of the rough set rule induction model shows comparative success with respect to the other contending algorithms.
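A minimal sketch of the core rough set construction, lower and upper approximations of decision classes under an indiscernibility relation, in plain Python; the attribute names and toy records are hypothetical.

from collections import defaultdict

def approximations(objects, condition_attrs, decision_attr):
    """Lower and upper approximations of each decision class under the
    indiscernibility relation induced by the chosen condition attributes."""
    # Elementary granules: objects indiscernible on the condition attributes.
    blocks = defaultdict(set)
    for i, obj in enumerate(objects):
        blocks[tuple(obj[a] for a in condition_attrs)].add(i)

    classes = defaultdict(set)
    for i, obj in enumerate(objects):
        classes[obj[decision_attr]].add(i)

    lower, upper = {}, {}
    for label, members in classes.items():
        inside = [b for b in blocks.values() if b <= members]
        lower[label] = set().union(*inside) if inside else set()
        upper[label] = set().union(*[b for b in blocks.values() if b & members])
    return lower, upper

# Toy decision table with hypothetical attributes.
table = [{"slope": "steep", "soil": "clay", "flood": "yes"},
         {"slope": "steep", "soil": "clay", "flood": "no"},
         {"slope": "flat",  "soil": "sand", "flood": "no"}]
print(approximations(table, ["slope", "soil"], "flood"))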

Keywords: Concept approximation, granular computing, reducts, rough set theory, rule induction.

2917 Packet Losses Interpretation in Mobile Internet

Authors: Hossam el-ddin Mostafa, Pavel Čičak

Abstract:

Mobile users with laptops need efficient access to their home personal data or to the Internet from any place in the world, regardless of their location or point of attachment, especially while roaming outside the home subnet. An efficient interpretation of the packet loss problem encountered during such roaming is the central focus of this work. The main previous works related to this problem, such as BER systems, Amigos and ns-2 implementations, are reviewed and discussed. Their drawbacks and limitations, namely stopping at monitoring and not providing an actual solution for eliminating or even restricting these losses, are pointed out. In addition, we present the framework around which we built a Triple-R sequence as a cost-effective solution to eliminate the packet losses and bridge the gap between subnets, an area that until now has been largely neglected. The results show that, in addition to the high bit error rate of wireless mobile networks, the low efficiency of the Mobile IP registration procedure is a main direct cause of these packet losses. Furthermore, the interpretation of the packet losses resulted in an illustrated triangle of the registration process. This triangle should be further researched and analyzed in our future work.

Keywords: Amigos, BER-systems, ns-2 implementation, packet losses, registration process, roaming.

2916 Weighted k-Nearest-Neighbor Techniques for High Throughput Screening Data

Authors: K. Kozak, M. Kozak, K. Stapor

Abstract:

The k-nearest neighbors (kNN) method is a simple but effective method of classification. In this paper, we present an extended version of this technique for chemical compounds used in High Throughput Screening, in which the distances of the nearest neighbors are taken into account. Our algorithm uses kernel weight functions as guidance for the process of defining activity in screening data. The proposed kernel weight function aims to combine properties of the graphical structure and the molecular descriptors of screening compounds. We apply the modified kNN method to several experimental data sets from biological screens. The experimental results confirm the effectiveness of the proposed method.
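A minimal sketch of a kernel-weighted kNN vote with a Gaussian kernel; the descriptor space, bandwidth and k are illustrative, and the paper's specific kernel that blends graph-structure and molecular-descriptor information is not reproduced.

import numpy as np

def gaussian_weighted_knn_predict(X_train, y_train, x_query, k=5, bandwidth=1.0):
    """Classify one query compound by a kernel-weighted vote of its k nearest
    neighbours: closer neighbours receive exponentially larger weight."""
    d = np.linalg.norm(X_train - x_query, axis=1)        # distances to all compounds
    nearest = np.argsort(d)[:k]                          # indices of the k nearest
    weights = np.exp(-(d[nearest] ** 2) / (2.0 * bandwidth ** 2))
    votes = {}
    for idx, w in zip(nearest, weights):
        votes[y_train[idx]] = votes.get(y_train[idx], 0.0) + w
    return max(votes, key=votes.get)                     # label with the largest weighted vote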

Keywords: biological screening, kernel methods, KNN, QSAR

2915 On-Road Text Detection Platform for Driver Assistance Systems

Authors: Guezouli Larbi, Belkacem Soundes

Abstract:

Automating the text detection process can assist the human driving task. It can be very useful in helping drivers obtain more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first stage, we use pseudo-Zernike moments to pinpoint areas of the image that may contain text; the architecture of this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network, ORTDN) that serves as the classification phase. The results show that the proposed framework achieves ≈ 35 fps and an mAP of ≈ 90%, i.e., a low computational time with competitive accuracy.

Keywords: Text detection, CNN, PZM, deep learning.

2914 An Ant Colony Optimization for Dynamic Job Scheduling in Grid Environment

Authors: Siriluck Lorpunmanee, Mohd Noor Sap, Abdul Hanan Abdullah, Chai Chompoo-inwai

Abstract:

Grid computing is growing rapidly in distributed heterogeneous systems for utilizing and sharing large-scale resources to solve complex scientific problems. Scheduling is a central topic for achieving high performance in grid environments; it aims to find a suitable allocation of resources for each job. A typical problem that arises during this task is the scheduling decision itself: the effective utilization of processors to minimize the tardiness of a job when it is being scheduled. This paper therefore addresses the problem by developing a general framework for grid scheduling that uses dynamic information and an ant colony optimization algorithm to improve the scheduling decision. The performance of various dispatching rules, such as First Come First Served (FCFS), Earliest Due Date (EDD), Earliest Release Date (ERD), and Ant Colony Optimization (ACO), is compared. Moreover, the benefit of using Ant Colony Optimization to improve grid scheduling performance is also discussed. It is found that the scheduling system using an Ant Colony Optimization algorithm can efficiently and effectively allocate jobs to the proper resources.

Keywords: Grid computing, Distributed heterogeneous system, Ant colony optimization algorithm, Grid scheduling, Dispatching rules.

2913 Eclectic Rule-Extraction from Support Vector Machines

Authors: Nahla Barakat, Joachim Diederich

Abstract:

Support vector machines (SVMs) have shown superior performance compared to other machine learning techniques, especially in classification problems. Yet one limitation of SVMs is the lack of an explanation capability, which is crucial in some applications, e.g. in the medical and security domains. In this paper, a novel approach for eclectic rule-extraction from support vector machines is presented. This approach utilizes the knowledge acquired by the SVM and represented in its support vectors, as well as the parameters associated with them. The approach includes three stages: training, propositional rule-extraction and rule quality evaluation. Results from four different experiments have demonstrated the value of the approach for extracting comprehensible rules of high accuracy and fidelity.
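One common eclectic-style recipe, sketched with scikit-learn: use the knowledge held in the support vectors by relabeling them with the SVM's own predictions, fit a shallow decision tree whose paths read as propositional rules, and measure fidelity as agreement with the SVM. This illustrates the general idea, not necessarily the authors' exact algorithm.

import numpy as np
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier, export_text

def extract_rules_from_svm(X, y, max_depth=3):
    """Train an SVM, relabel its support vectors with the SVM's predictions,
    fit a shallow surrogate tree on them, and return its rules plus fidelity."""
    svm = SVC(kernel="rbf", gamma="scale").fit(X, y)
    sv = svm.support_vectors_                       # knowledge carried by the SVM
    sv_labels = svm.predict(sv)
    tree = DecisionTreeClassifier(max_depth=max_depth).fit(sv, sv_labels)
    fidelity = np.mean(tree.predict(X) == svm.predict(X))
    return export_text(tree), fidelity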

Keywords: Data mining, hybrid rule-extraction algorithms, medical diagnosis, SVMs

2912 Ontology-based Concept Weighting for Text Documents

Authors: Hmway Hmway Tar, Thi Thi Soe Nyaunt

Abstract:

Document clustering has become an essential technology with the popularity of the Internet, which means that fast, high-quality document clustering techniques play a core role. Text clustering, or simply clustering, is about discovering semantically related groups in an unstructured collection of documents. Clustering has been very popular for a long time because it provides unique ways of digesting and generalizing large amounts of information. One of the issues in clustering is extracting the proper features (concepts) of a problem domain. Existing clustering technology mainly focuses on term weight calculation; to achieve more accurate document clustering, more informative features, including concept weights, are important. Feature selection is important for the clustering process because irrelevant or redundant features may misguide the clustering results. To counteract this issue, the proposed system introduces concept weights for a text clustering system developed on the basis of a k-means algorithm in accordance with the principles of ontology, so that the importance of the words in a cluster can be identified by their weight values. To a certain extent, this has resolved the semantic problem in specific areas.
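A minimal sketch of the clustering backbone: TF-IDF term weights scaled by a per-term concept weight and fed to k-means with scikit-learn. The concept weights here are a placeholder vector; deriving them from an ontology, as the paper does, is not shown.

import numpy as np
from scipy.sparse import diags
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy documents; the paper targets larger unstructured collections.
docs = ["gene expression in cancer cells",
        "stock market prediction from financial news",
        "tumor growth and gene mutation",
        "news sentiment and stock markets"]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)                  # TF-IDF term weights

# Placeholder concept weights, one multiplier per term; an ontology-derived
# weighting would replace this uniform vector.
concept_weight = np.ones(X.shape[1])
X_weighted = X @ diags(concept_weight)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_weighted)
print(dict(zip(docs, labels)))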

Keywords: Clustering, Concept Weight, Document clustering, Feature Selection, Ontology

2911 Analysis of the Root Causes of Transformer Bushing Failures

Authors: E. A. Feilat, I. A. Metwally, S. Al-Matri, A. S. Al-Abri

Abstract:

This paper presents the results of a comprehensive investigation of five blackouts that occurred between 28 August and 8 September 2011 due to bushing failures of the 132/33 kV, 125 MVA transformers at JBB Ali Grid station. The investigation aims to explore the root causes of the bushing failures and come up with recommendations that help in rectifying the problem and avoiding the reoccurrence of similar incidents. The incident reports about the failed bushings and the SCADA reports at this grid station were examined and analyzed. Moreover, comprehensive power quality field measurements at ten 33/11 kV substations (S/Ss) in the JBB Ali area were conducted, and frequency scans were performed to verify any harmonic resonance frequencies due to power factor correction capacitors. Furthermore, the daily operations of the on-load tap changers (OLTCs) of both the 125 MVA and 20 MVA transformers at JBB Ali Grid station were analyzed. The investigation showed that the five bushing failures were due to a local problem, i.e. internal degradation of the bushing insulation. This has been confirmed by analyzing the time interval between successive OLTC operations of the faulty grid transformers. It was also found that monitoring the number of OLTC operations can help in predicting bushing failure.

Keywords: Modeling and simulation, power system, transformer, bushing, OLTC, power quality, partial discharge.

2910 Improvement of Central Composite Design in Modeling and Optimization of Simulation Experiments

Authors: A. Nuchitprasittichai, N. Lerdritsirikoon, T. Khamsing

Abstract:

Simulation modeling can be used to solve real-world problems and provides an understanding of a complex system. To develop a simplified model of a process simulation, a suitable experimental design is required to capture the surface characteristics. This paper presents the experimental design and algorithm used to model the process simulation for an optimization problem. CO2 liquefaction based on external refrigeration with two refrigeration circuits was used as the simulation case study. Latin Hypercube Sampling (LHS) was proposed to be combined with existing Central Composite Design (CCD) samples to improve the performance of the CCD in generating the second-order model of the system. The second-order model was then used as the objective function of the optimization problem. The results showed that adding LHS samples to the CCD samples can help capture surface curvature characteristics. A suitable number of LHS sample points should be considered in order to obtain an accurate nonlinear model with a minimum number of simulation experiments.
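A minimal sketch of the design-augmentation step: generate a rotatable central composite design in coded units and append Latin Hypercube samples scaled to the same range, using scipy.stats.qmc. The two-factor setting and the number of LHS points are illustrative choices.

import numpy as np
from scipy.stats import qmc

def ccd_points(n_factors=2):
    """Rotatable central composite design in coded units: factorial cube
    points, axial (star) points at +/- alpha, and a centre point."""
    corners = np.array(np.meshgrid(*[[-1.0, 1.0]] * n_factors)).T.reshape(-1, n_factors)
    alpha = (2 ** n_factors) ** 0.25          # rotatable alpha
    axial = np.vstack([alpha * np.eye(n_factors), -alpha * np.eye(n_factors)])
    centre = np.zeros((1, n_factors))
    return np.vstack([corners, axial, centre])

def augment_with_lhs(ccd, n_lhs=8, seed=0):
    """Add space-filling Latin Hypercube samples, scaled to the same coded
    range as the CCD runs; the number of LHS points is a tuning choice."""
    lo, hi = ccd.min(), ccd.max()
    lhs = qmc.LatinHypercube(d=ccd.shape[1], seed=seed).random(n_lhs)
    return np.vstack([ccd, lo + (hi - lo) * lhs])

design = augment_with_lhs(ccd_points(2), n_lhs=8)
print(design.shape)   # (9 + 8, 2) runs to feed the process simulator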

Keywords: Central composite design, CO2 liquefaction, Latin Hypercube Sampling, simulation-based optimization.
