Search results for: Data mining classification algorithms
7747 Improvement of the Q-System Using the Rock Engineering System: A Case Study of Water Conveyor Tunnel of Azad Dam
Authors: S. Golmohammadi, M. Noorian Bidgoli
Abstract:
Because the status and mechanical parameters of discontinuities in the rock mass enter the calculations, rock engineering classification methods are often used as a starting point for the design of different types of structures. The Q-system is one of the most frequently used methods for stability analysis and determination of support systems of underground structures in rock, including tunnels. In this method, six main parameters of the rock mass are required: the Rock Quality Designation (RQD), joint set number (Jn), joint roughness number (Jr), joint alteration number (Ja), joint water parameter (Jw) and Stress Reduction Factor (SRF). To achieve a reasonable and optimal design, identifying the parameters that govern the stability of such structures is therefore one of the most important goals in rock engineering, and it requires studying how the parameters of a system relate to and interact with each other and, ultimately, with the whole system. In this research, we attempt to determine the most effective parameters (key parameters) among the six Q-system parameters using the Rock Engineering System (RES) method, in order to improve the relationships between the parameters in the calculation of the Q value. The RES is a method by which the degree of cause and effect of a system's parameters can be determined by building an interaction matrix. Geomechanical data collected from the water conveyor tunnel of Azad Dam were used to construct the interaction matrix of the Q-system. For this purpose, instead of the conventional coding methods, which are always accompanied by defects such as uncertainty, the Q-system interaction matrix is coded through a statistical analysis of the data, namely the correlation coefficients between the parameters, so the effect of each parameter on the system is evaluated with greater certainty. The results of this study show that the resulting interaction matrix provides a reasonable estimate of the effective parameters in the Q-system. Among the six parameters, SRF and Jr have the maximum and minimum influence on the system (cause), respectively, while RQD and Jw are the most and least influenced by the system (effect), respectively. By developing this method, a more accurate rock mass classification relation can be obtained by weighting the parameters required in the Q-system.
Keywords: Q-system, Rock Engineering System, statistical analysis, rock mass, tunnel.
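As a rough, hedged illustration of the quantities involved (not the authors' analysis), the sketch below computes Q from the six parameters with the standard Barton relation Q = (RQD/Jn)·(Jr/Ja)·(Jw/SRF) and codes a simple RES-style interaction score from correlation coefficients. All sample values are hypothetical, and a plain correlation matrix is symmetric, so it only ranks overall interactive intensity rather than separating cause from effect.

```python
import numpy as np

def q_value(rqd, jn, jr, ja, jw, srf):
    """Standard Q-system relation: block size x inter-block shear strength x active stress."""
    return (rqd / jn) * (jr / ja) * (jw / srf)

# Hypothetical scan-line records: columns = RQD, Jn, Jr, Ja, Jw, SRF.
params = np.array([
    [75, 9.0, 1.5, 2.0, 1.00, 2.5],
    [60, 12.0, 1.0, 3.0, 0.66, 5.0],
    [85, 6.0, 3.0, 1.0, 1.00, 1.0],
    [55, 15.0, 1.5, 4.0, 0.50, 7.5],
])
print("Q values:", np.round(q_value(*params.T), 2))

# Statistical stand-in for RES coding: absolute correlation between parameters,
# summed per parameter as an overall "interactive intensity" score.
corr = np.abs(np.corrcoef(params.T))
np.fill_diagonal(corr, 0.0)
intensity = corr.sum(axis=1)
for name, s in zip(["RQD", "Jn", "Jr", "Ja", "Jw", "SRF"], intensity):
    print(f"{name}: interactive intensity = {s:.2f}")
```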
7746 A Robust Visual Tracking Algorithm with Low-Rank Region Covariance
Authors: Songtao Wu, Yuesheng Zhu, Ziqiang Sun
Abstract:
The region covariance (RC) descriptor is an effective and efficient feature for visual tracking. Current RC-based tracking algorithms use the whole RC matrix to track the target in video directly. However, there are some issues with these whole-RC-based algorithms. If some features are contaminated, the whole RC matrix becomes unreliable, which can result in losing the tracked object. In addition, even when a few features are sufficiently discriminative against the background, all features are still processed, which reduces efficiency. In this paper a new robust tracking method is proposed, in which the whole RC matrix is decomposed into several low-rank matrices. Those matrices are dynamically chosen and processed so as to achieve a good trade-off between discriminability and complexity. Experimental results have shown that our method is more robust to complex environment changes than other RC-based methods, especially when occlusion happens or when the background is similar to the target.
Keywords: Visual tracking, region covariance descriptor, low-rank region covariance.
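For readers unfamiliar with the descriptor, here is a minimal sketch (not the authors' implementation) of how a region covariance matrix is built from per-pixel features and how smaller feature-subset blocks of it can be extracted; the feature set and the subsets are assumptions chosen for illustration.

```python
import numpy as np

def region_covariance(patch):
    """Covariance descriptor of an image patch.

    Per-pixel feature vector: (x, y, I, |Ix|, |Iy|), a common choice in
    RC-based tracking; other feature sets are possible.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    iy, ix = np.gradient(patch.astype(float))
    feats = np.stack([xs.ravel(), ys.ravel(), patch.ravel(),
                      np.abs(ix).ravel(), np.abs(iy).ravel()], axis=0)
    return np.cov(feats)          # 5 x 5 region covariance matrix

rng = np.random.default_rng(0)
patch = rng.random((32, 32))
C_full = region_covariance(patch)

# The decomposition idea: instead of the full matrix, keep several small blocks
# built from feature subsets and pick the most discriminative ones dynamically.
subsets = [(0, 1, 2), (2, 3, 4), (0, 1, 3, 4)]
C_blocks = [C_full[np.ix_(s, s)] for s in subsets]
print(C_full.shape, [b.shape for b in C_blocks])
```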
7745 Error-Robust Nature of Genome Profiling Applied for Clustering of Species Demonstrated by Computer Simulation
Authors: Shamim Ahmed, Koichi Nishigaki
Abstract:
Genome profiling (GP), a genotype-based technology that exploits random PCR and temperature gradient gel electrophoresis, has been successful in the identification/classification of organisms. In this technology, spiddos (species identification dots) and PaSS (pattern similarity score) are employed for measuring the closeness (or distance) between genomes. Based on the closeness (PaSS), we can build up phylogenetic trees of the organisms. We noticed that the topology of the tree is rather robust against the experimental fluctuation conveyed by spiddos. This fact was confirmed quantitatively in this study by computer simulation, establishing the limits of reliability of this highly powerful methodology. As a result, we could demonstrate the effectiveness of the GP approach for identification/classification of organisms.
Keywords: Fluctuation, Genome profiling (GP), Pattern similarity score (PaSS), Robustness, Spiddos-shift.
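A minimal sketch of the clustering step described here, under strong simplifying assumptions: the PaSS and spiddos computations themselves are not reproduced, hypothetical pairwise PaSS values are converted to distances of 1 − PaSS and fed to average-linkage clustering, and a small random perturbation stands in for experimental spiddos fluctuation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

# Hypothetical pairwise PaSS values (1.0 = identical genome profiles).
pass_matrix = np.array([
    [1.00, 0.92, 0.75, 0.70],
    [0.92, 1.00, 0.78, 0.72],
    [0.75, 0.78, 1.00, 0.88],
    [0.70, 0.72, 0.88, 1.00],
])

# Distance = 1 - PaSS; UPGMA (average linkage) gives a simple phylogenetic tree.
tree = linkage(squareform(1.0 - pass_matrix, checks=False), method="average")

# Simulated spiddos-shift: perturb the PaSS values and check whether the topology
# (the order in which clusters merge) stays the same.
noisy = pass_matrix + np.random.default_rng(1).normal(0, 0.02, pass_matrix.shape)
noisy = np.clip((noisy + noisy.T) / 2, 0, 1)
np.fill_diagonal(noisy, 1.0)
tree_noisy = linkage(squareform(1.0 - noisy, checks=False), method="average")
print(tree[:, :2])
print(tree_noisy[:, :2])
```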
7744 Multiuser Detection in CDMA Fast Fading Multipath Channel using Heuristic Genetic Algorithms
Authors: Muhammad Naeem, Syed Ismail Shah, Habibullah Jamal
Abstract:
In this paper, a simple heuristic genetic algorithm is used for multistage multiuser detection in fast fading environments. Multipath channels, multiple access interference (MAI) and the near-far effect cause the performance of the conventional detector to degrade. Heuristic genetic algorithms, a rapidly growing area of artificial intelligence, use evolutionary programming for the initial search, which not only helps the solution converge towards near-optimal performance efficiently but also does so at much lower complexity than the optimal detector. This holds true for Additive White Gaussian Noise (AWGN) and multipath fading channels. Experimental results are presented to show the superior performance of the proposed technique over the existing methods.
Keywords: Genetic Algorithm (GA), Multiple Access Interference (MAI), Multistage Detectors (MSD), Successive Interference Cancellation.
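A minimal, hedged sketch of the idea (not the authors' multistage detector): a binary GA searches over candidate bipolar bit vectors using the correlation-type log-likelihood metric 2bᵀy − bᵀRb of maximum-likelihood multiuser detection. The channel model, cross-correlations and GA parameters below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
K, POP, GENS = 8, 40, 60            # users, population size, generations

# Hypothetical synchronous CDMA model: R = cross-correlation matrix,
# y = matched-filter outputs for the transmitted bits b_true.
R = np.eye(K) + 0.25 * (np.ones((K, K)) - np.eye(K))
b_true = rng.choice([-1, 1], size=K)
y = R @ b_true + 0.1 * rng.normal(size=K)

def fitness(b):
    # Log-likelihood metric maximized by the optimal (ML) detector.
    return 2 * b @ y - b @ R @ b

pop = rng.choice([-1, 1], size=(POP, K))
for _ in range(GENS):
    scores = np.array([fitness(b) for b in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]          # truncation selection
    cut = rng.integers(1, K, size=POP // 2)
    kids = np.array([np.concatenate([parents[i][:c], parents[(i + 1) % len(parents)][c:]])
                     for i, c in enumerate(cut)])           # one-point crossover
    flip = rng.random(kids.shape) < 0.05                    # mutation
    kids[flip] *= -1
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(b) for b in pop])]
print("bit errors:", int(np.sum(best != b_true)))
```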
7743 Educase – Intelligent System for Pedagogical Advising Using Case-Based Reasoning
Authors: Elionai Moura, José A. da Cunha, César Analide
Abstract:
This paper introduces a proposed scheme for an intelligent system for pedagogical advising using case-based reasoning, which retrieves previously consolidated solutions and reuses them for new problems, making the task of advising students easier for the pedagogical staff. We present the motivation behind the design choices for this system structure, justifying the development of an incremental, smart web system that learns the best solutions for new cases as it is used, and we describe the techniques and technology involved.
Keywords: Case-based Reasoning, Pedagogical Advising, Educational Data-Mining (EDM), Machine Learning.
7742 Segmentation of Korean Words on Korean Road Signs
Authors: Lae-Jeong Park, Kyusoo Chung, Jungho Moon
Abstract:
This paper introduces an effective method for segmenting Korean text (place names in Korean) from a Korean road sign image. A Korean advanced directional road sign is composed of several types of visual information, such as arrows, place names in Korean and English, and route numbers. Automatic classification of the visual information and extraction of Korean place names from road sign images make it possible to avoid a great deal of manual input to a database system for managing road signs nationwide. We propose a series of problem-specific heuristics that correctly segment Korean place names, the most crucial information, from the other information by effectively discarding non-text content. Experimental results on a dataset of 368 road sign images show a detection rate of 96% per Korean place name and 84% per road sign image.
Keywords: Segmentation, road signs, characters, classification.
7741 Optimizing Spatial Trend Detection By Artificial Immune Systems
Authors: M. Derakhshanfar, B. Minaei-Bidgoli
Abstract:
Spatial trends are among the valuable patterns in geographic databases. They play an important role in data analysis and knowledge discovery from spatial data. A spatial trend is a regular change of one or more non-spatial attributes when moving spatially away from a start object. Spatial trend detection is a graph-search problem; therefore, heuristic methods can be a good solution. The artificial immune system (AIS) is a particular method for searching and optimizing, a novel evolutionary paradigm inspired by the biological immune system. Models based on immune system principles, such as the clonal selection theory, the immune network model and the negative selection algorithm, have been finding increasing applications in science and engineering. In this paper, we develop a novel immunological algorithm based on the clonal selection algorithm (CSA) for spatial trend detection. We create a neighborhood graph and neighborhood paths, then select spatial trends with high affinity as antibodies. In an evolutionary process with the artificial immune algorithm, the affinity of low-affinity trends is increased by mutation until the stopping condition is satisfied.
Keywords: Spatial Data Mining, Spatial Trend Detection, Heuristic Methods, Artificial Immune System, Clonal Selection Algorithm (CSA).
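To make the clonal selection loop concrete, here is a minimal CLONALG-style sketch under simplifying assumptions: the candidates are plain real vectors with a hypothetical affinity function, whereas in the paper the antibodies would encode paths in the neighborhood graph. Selection, cloning, rank-dependent hypermutation and receptor editing are the steps illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)

def affinity(x):
    # Hypothetical affinity of a candidate "trend" encoded as a real vector;
    # in the paper the affinity would score a path in the neighborhood graph.
    return -np.sum((x - 0.7) ** 2)

DIM, N, CLONES, GENS = 5, 20, 5, 100
pop = rng.random((N, DIM))

for _ in range(GENS):
    aff = np.array([affinity(x) for x in pop])
    order = np.argsort(aff)[::-1]                 # best antibodies first
    new_pop = []
    for rank, idx in enumerate(order):
        # Hypermutation rate grows as the rank (i.e. affinity) gets worse.
        rate = 0.02 * (rank + 1)
        clones = pop[idx] + rng.normal(0, rate, size=(CLONES, DIM))
        best_clone = max(clones, key=affinity)
        new_pop.append(best_clone if affinity(best_clone) > aff[idx] else pop[idx])
    pop = np.array(new_pop)
    # Replace the worst few antibodies with fresh random ones (receptor editing).
    pop[np.argsort([affinity(x) for x in pop])[:2]] = rng.random((2, DIM))

print("best affinity:", max(affinity(x) for x in pop))
```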
7740 Energy Management System and Interactive Functions of Smart Plug for Smart Home
Authors: Win Thandar Soe, Innocent Mpawenimana, Mathieu Di Fazio, Cécile Belleudy, Aung Ze Ya
Abstract:
Intelligent electronic equipment and automation networks are the brain of high-tech energy management systems and play a critical role in smart homes. A smart home integrates technologies for greater comfort, autonomy, reduced cost and energy saving. These services let home owners manage their appliances locally or remotely and consequently allow them to automate their consumption intelligently and responsibly through individual or collective control systems. In this study, three smart plugs are described and one of them is tested on typical household appliances. This article proposes collecting data over the wireless technology and extracting smart data for an energy management system. These smart data quantify three kinds of load: intermittent load, phantom load and continuous load. Phantom load is wasted power drawn unnoticed by an appliance while it is connected to the mains but not in active use. Intermittent and continuous loads take into consideration the power and usage time of home appliances. By analysing this load classification, the smart data can be used to reduce the communication of the wireless sensor network used for energy management.
Keywords: Energy management, load profile, smart plug, wireless sensor network.
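A minimal sketch of how such a three-way load classification might be derived from a smart-plug power trace; the thresholds, sampling interval and sample profiles below are assumptions for illustration, not values from the study.

```python
import numpy as np

def classify_load(power_w, on_threshold_w=5.0, duty_high=0.9):
    """Rough load classification from a power time series (thresholds are assumptions).

    - phantom: the appliance never rises above a small standby level
    - continuous: it draws meaningful power almost all the time
    - intermittent: it alternates between on and off periods
    """
    power_w = np.asarray(power_w, dtype=float)
    duty_cycle = (power_w > on_threshold_w).mean()
    if duty_cycle == 0 and power_w.mean() > 0:
        return "phantom"
    if duty_cycle >= duty_high:
        return "continuous"
    return "intermittent"

# Hypothetical 24-sample (hourly) power profiles in watts.
tv_standby = [2.5] * 24                          # plugged in, never used
fridge     = [90, 85, 88, 92] * 6                # compressor essentially always drawing
kettle     = [0, 0, 1500, 0, 0, 0, 1800, 0] * 3  # short bursts

for name, profile in [("tv_standby", tv_standby), ("fridge", fridge), ("kettle", kettle)]:
    print(name, "->", classify_load(profile))
```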
7739 Self Organizing Mixture Network in Mixture Discriminant Analysis: An Experimental Study
Authors: Nazif Çalış, Murat Erişoğlu, Hamza Erol, Tayfun Servi
Abstract:
In recent works related to mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the initial values of the EM algorithm affect the final parameter estimates. Moreover, when the EM algorithm is applied twice to the same data set, it can give different parameter estimates, which affects the classification accuracy of MDA. To overcome this problem, we use the Self-Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when random initial parameter values are used [5]. We show the effectiveness of this method on the popular simulated waveform datasets and the real glass data set.
Keywords: Self Organizing Mixture Network, Mixture Discriminant Analysis, Waveform Datasets, Glass Identification, Mixture of Multivariate Normal Distributions.
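A minimal sketch of the EM-based MDA baseline that the abstract argues is initialization-sensitive (SOMN itself is not shown): one Gaussian mixture is fitted per class with scikit-learn's EM implementation and the set of models is used as a likelihood-based classifier. The synthetic data, component counts and seeds are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.mixture import GaussianMixture

X, y = make_classification(n_samples=600, n_features=6, n_informative=3,
                           n_classes=3, n_clusters_per_class=2, random_state=0)

# MDA baseline: fit one Gaussian mixture per class with EM, then classify by the
# largest class-conditional log-likelihood (equal priors assumed).
def fit_mda(X, y, n_components=2, seed=None):
    return {c: GaussianMixture(n_components=n_components, random_state=seed).fit(X[y == c])
            for c in np.unique(y)}

def predict_mda(models, X):
    scores = np.column_stack([models[c].score_samples(X) for c in sorted(models)])
    return np.array(sorted(models))[scores.argmax(axis=1)]

# Two EM runs with different random initializations can land on different
# parameter estimates, which is exactly the sensitivity the abstract discusses.
for seed in (1, 2):
    models = fit_mda(X, y, seed=seed)
    acc = (predict_mda(models, X) == y).mean()
    print(f"seed {seed}: training accuracy = {acc:.3f}")
```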
7738 Preliminary Evaluation of Decommissioning Wastes for the First Commercial Nuclear Power Reactor in South Korea
Authors: Kyomin Lee, Joohee Kim, Sangho Kang
Abstract:
Kori Unit 1, the first commercial nuclear power reactor in South Korea, a 587 MWe pressurized water reactor that started operation in 1978, was permanently shut down in June 2017 without a further operating license extension. Kori Unit 1 is thus scheduled to become the first nuclear power unit in the country to enter the decommissioning phase. In this study, a preliminary evaluation of the decommissioning wastes for Kori Unit 1 was performed through the following series of steps. First, the plant inventory was investigated based on various documents (i.e., equipment/component lists, construction records, general arrangement drawings). Second, the radiological conditions of systems, structures and components (SSCs) were established to estimate the amount of radioactive waste by waste classification. Third, the waste management strategies for Kori Unit 1, including waste packaging, were established. Fourth, proper decontamination and dismantling (D&D) technologies were selected considering various factors. Finally, the amount of decommissioning waste by classification for Kori Unit 1 was estimated using the DeCAT program, which was developed by KEPCO-E&C for decommissioning cost estimation. The preliminary evaluation showed that the expected amounts of radioactive decommissioning wastes were less than about 2% and 8% of the total wastes generated (i.e., the sum of clean wastes and radwastes) before and after waste processing, respectively, and that the majority of contaminated material was carbon or alloy steel and stainless steel. In addition, within the limits of the available information, the results were compared with data from various decommissioning experiences and international/national decommissioning studies. The comparison showed that the radioactive waste amounts from the Kori Unit 1 decommissioning were much less than those from plants decommissioned in the U.S. and comparable to those from plants in Europe; this result stems from differences in disposal cost and clearance criteria (i.e., free-release levels) between the U.S. and other countries. The preliminary evaluation performed with the methodology established in this study will provide important information for decommissioning planning, including the decommissioning schedule and the waste management strategy covering transportation, packaging, handling and disposal of radioactive wastes.
Keywords: Characterization, classification, decommissioning, decontamination and dismantling, Kori 1, radioactive waste.
7737 The Impact of Trade on Social Development
Authors: Umut Gunduz, Mehtap Hisarciklilar, Tolga Kaya
Abstract:
Studies revealing the positive relationship between trade and income are often criticized with the argument that "development should mean more than rising incomes". Taking this argument as a base and utilizing panel data, Davies and Quinlivan [1] demonstrated that increases in trade are positively associated with future increases in social welfare as measured by the Human Development Index (HDI). The purpose of this study is twofold. First, using an income-based country classification, we investigate whether the positive association between foreign trade and HDI holds within all country groups. Second, keeping the same categorization, we examine whether the positive link between trade and HDI still exists when the income components of the index are excluded. Employing a panel data framework of 106 countries, this study reveals that the positive link between trade and human development is valid only for high- and medium-income countries. Moreover, the positive link between trade and human development diminishes in lower-medium-income countries when only the non-income components of the index are taken into consideration.
Keywords: HDI, foreign trade, development, panel data.
7736 Computing Maximum Uniquely Restricted Matchings in Restricted Interval Graphs
Authors: Swapnil Gupta, C. Pandu Rangan
Abstract:
A uniquely restricted matching is defined to be a matching M whose matched vertices induce a subgraph that has only one perfect matching. In this paper, we make progress on the open question of the status of this problem on interval graphs (graphs obtained as the intersection graph of intervals on a line). We give an algorithm to compute maximum cardinality uniquely restricted matchings on certain subclasses of interval graphs. We consider two subclasses of interval graphs, the former contained in the latter, and give O(|E|^2) time algorithms for both of them. It is to be noted that both subclasses are incomparable to proper interval graphs (graphs obtained as the intersection graph of intervals in which no interval completely contains another interval), on which the problem can be solved in polynomial time.
Keywords: Uniquely restricted matching, interval graph, design and analysis of algorithms, matching, induced matching, witness counting.
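To make the definition concrete, here is a brute-force sketch (feasible only for small instances and unrelated to the authors' O(|E|^2) algorithms) that checks whether a given matching is uniquely restricted by counting perfect matchings of the subgraph induced by its matched vertices; the example graphs are assumptions.

```python
def count_perfect_matchings(vertices, edges):
    """Count perfect matchings of the graph induced on `vertices` by brute force."""
    vertices = sorted(vertices)
    adj = {v: set() for v in vertices}
    for u, v in edges:
        if u in adj and v in adj:
            adj[u].add(v)
            adj[v].add(u)

    def rec(remaining):
        if not remaining:
            return 1
        v = min(remaining)
        rest = remaining - {v}
        return sum(rec(rest - {u}) for u in adj[v] if u in rest)

    return rec(set(vertices))

def is_uniquely_restricted(matching, edges):
    """M is uniquely restricted iff the subgraph induced by its matched vertices
    has exactly one perfect matching (namely M itself)."""
    matched = {v for e in matching for v in e}
    return count_perfect_matchings(matched, edges) == 1

path_edges = [(0, 1), (1, 2), (2, 3)]
cycle_edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(is_uniquely_restricted([(0, 1), (2, 3)], path_edges))   # True: the path P4 has one perfect matching
print(is_uniquely_restricted([(0, 1), (2, 3)], cycle_edges))  # False: the 4-cycle has two
```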
7735 Attention Multiple Instance Learning for Cancer Tissue Classification in Digital Histopathology Images
Authors: Afaf Alharbi, Qianni Zhang
Abstract:
The identification of malignant tissue in histopathological slides holds significant importance in both clinical settings and pathology research. This paper presents a methodology aimed at automatically categorizing cancerous tissue through the utilization of a multiple instance learning framework. This framework is specifically developed to acquire knowledge of the Bernoulli distribution of the bag label probability by employing neural networks. Furthermore, we put forward a neural network-based permutation-invariant aggregation operator, equivalent to attention mechanisms, which is applied to the multi-instance learning network. Through empirical evaluation on an openly available colon cancer histopathology dataset, we provide evidence that our approach surpasses various conventional deep learning methods.
Keywords: Attention Multiple Instance Learning, Multiple Instance Learning, transfer learning, histopathological slides, cancer tissue classification.
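A minimal numpy sketch of the permutation-invariant attention pooling operator described here, forward pass only: in the actual network the parameters V, w and the bag-level classifier would be learned end to end, and the instance embeddings would come from a CNN over tissue patches; everything below is randomly initialized for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_mil_pool(H, V, w):
    """Permutation-invariant attention pooling over a bag of instance embeddings H (K x d).

    a_k = softmax_k( w^T tanh(V h_k) ),   z = sum_k a_k h_k
    """
    scores = w @ np.tanh(V @ H.T)              # shape (K,)
    a = np.exp(scores - scores.max())
    a = a / a.sum()
    return a @ H, a                            # pooled embedding z (d,), attention weights

K, d, L = 12, 16, 8                            # instances per bag, embedding dim, attention dim
H = rng.normal(size=(K, d))                    # instance embeddings (e.g. CNN features of patches)
V = rng.normal(size=(L, d)) * 0.1              # attention parameters (would be learned)
w = rng.normal(size=L) * 0.1
w_cls, b_cls = rng.normal(size=d) * 0.1, 0.0   # bag-level classifier parameters

z, a = attention_mil_pool(H, V, w)
p_malignant = 1.0 / (1.0 + np.exp(-(w_cls @ z + b_cls)))   # Bernoulli bag-label probability
print("attention weights sum to", round(a.sum(), 3), "; p(bag positive) =", round(p_malignant, 3))
```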
7734 Clustered Signatures for Modeling and Recognizing 3D Rigid Objects
Authors: H. B. Darbandi, M. R. Ito, J. Little
Abstract:
This paper describes a probabilistic method for three-dimensional object recognition using a shared pool of surface signatures. This technique uses flatness, orientation, and convexity signatures that encode the surface of a free-form object into three discriminative vectors, and then creates a shared pool of data by clustering the signatures using a distance function. The method applies Bayes' rule for the recognition process, and it is extensible to a large collection of three-dimensional objects.
Keywords: Object recognition, modeling, classification, computer vision.
7733 Foot Recognition Using Deep Learning for Knee Rehabilitation
Authors: Rakkrit Duangsoithong, Jermphiphut Jaruenpunyasak, Alba Garcia
Abstract:
Foot recognition can be applied in many medical fields, such as gait pattern analysis and the knee exercises of patients in rehabilitation. Generally, a camera-based foot recognition system captures a patient image in a controlled room and background to recognize the foot from limited views. However, such a system can be inconvenient for monitoring knee exercises at home. In order to overcome these problems, this paper proposes a deep learning method based on Convolutional Neural Networks (CNNs) for foot recognition. The results are compared with a traditional classification method using LBP and HOG features with kNN and SVM classifiers. According to the results, the deep learning method provides better accuracy in recognizing foot images from online databases than the traditional classification method, but at a higher complexity.
Keywords: Convolutional neural networks, deep learning, foot recognition, knee rehabilitation.
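A minimal sketch of the traditional HOG-plus-SVM baseline mentioned here (not the CNN and not the authors' pipeline): the images are random stand-ins, so the reported accuracy is meaningless, and the HOG and SVM parameters are assumptions.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: 64x64 grayscale "foot" / "non-foot" images (random here; a real
# experiment would load labelled frames from the rehabilitation recordings).
X_img = rng.random((200, 64, 64))
y = rng.integers(0, 2, size=200)

# HOG feature extraction, as in the traditional baseline the paper compares against.
X_feat = np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                       cells_per_block=(2, 2)) for img in X_img])

X_tr, X_te, y_tr, y_te = train_test_split(X_feat, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("SVM accuracy on the stand-in data:", round(clf.score(X_te, y_te), 3))
```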
7732 Optimizing Mobile Agents Migration Based on Decision Tree Learning
Authors: Yasser K. Ali, Hesham N. Elmahdy, Sanaa El Olla Hanfy Ahmed
Abstract:
Mobile agents are a powerful approach for developing distributed systems, since they migrate to hosts on which they have the resources to execute individual tasks. In a dynamic environment like a peer-to-peer network, agents have to be generated frequently and dispatched to the network. They therefore consume a certain amount of bandwidth on each link; if too many agents migrate through one or several links at the same time, they introduce too much transfer overhead, and those links eventually become busy and indirectly block network traffic. There is thus a need to develop routing algorithms that take traffic load into account. In this paper, we seek to combine a probabilistic assessment of the network traffic situation, based on a quality measure, with the agent's decision about migration to the next hop, using decision tree learning algorithms.
Keywords: Agent Migration, Decision Tree learning, ID3 algorithm, Naive Bayes Classifier
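A minimal sketch of a next-hop decision learned from link measurements, using an information-gain (entropy) decision tree in the spirit of ID3 alongside a naive Bayes classifier. The features, the labelling rule and all data below are hypothetical, and scikit-learn's tree is a CART implementation with an entropy criterion rather than textbook ID3.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

# Hypothetical per-link measurements an agent could observe before migrating:
# [bandwidth_mbps, latency_ms, queue_length, recent_agent_count]
X = rng.random((500, 4)) * [100, 200, 50, 20]
# Hypothetical label: 1 = link is a good next hop, 0 = congested (synthetic rule + noise).
y = ((X[:, 0] > 40) & (X[:, 1] < 120) & (X[:, 3] < 12)).astype(int)
y = y ^ (rng.random(500) < 0.05).astype(int)      # label noise

# criterion="entropy" gives an information-gain (ID3-style) split rule.
tree = DecisionTreeClassifier(criterion="entropy", max_depth=4).fit(X, y)
nb = GaussianNB().fit(X, y)

candidate_links = rng.random((3, 4)) * [100, 200, 50, 20]
print("tree choice :", tree.predict(candidate_links))
print("naive Bayes :", nb.predict(candidate_links))
```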
7731 Multi-Temporal Urban Land Cover Mapping Using Spectral Indices
Authors: Mst Ilme Faridatul, Bo Wu
Abstract:
Multi-temporal urban land cover mapping is of paramount importance for monitoring urban sprawl and managing the ecological environment. With diversified urban activities, it is challenging to map land covers in a complex urban environment. Spectral indices have proved effective for mapping urban land covers. To improve multi-temporal urban land cover classification and mapping, we evaluate the performance of three spectral indices: the modified normalized difference bare-land index (MNDBI), the tasseled cap water and vegetation index (TCWVI) and the shadow index (ShDI). The MNDBI is developed to enhance urban impervious areas by separating out bare land. The tasseled cap index TCWVI is developed to detect vegetation and water simultaneously. The ShDI is developed to maximize the spectral difference between the shadows of skyscrapers and water and thus enhance water detection. First, this paper presents a comparative analysis of the three spectral indices using Landsat Enhanced Thematic Mapper (ETM), Thematic Mapper (TM) and Operational Land Imager (OLI) data. Second, optimized thresholds of the spectral indices are applied to classify land covers, and finally, their performance in enhancing multi-temporal urban land cover mapping is assessed. The results indicate that the spectral indices are competent to enhance multi-temporal urban land cover mapping, achieving an overall classification accuracy of 93-96%.
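A minimal sketch of the index-and-threshold workflow: a generic normalized-difference index is computed from two bands and land-cover labels are assigned by thresholding. The band combinations stand in for, but are not, the exact MNDBI/TCWVI/ShDI formulas, and the thresholds and raster data are assumptions.

```python
import numpy as np

def normalized_difference(band_a, band_b):
    """Generic normalized-difference index, (A - B) / (A + B)."""
    band_a, band_b = np.asarray(band_a, float), np.asarray(band_b, float)
    return (band_a - band_b) / (band_a + band_b + 1e-10)

rng = np.random.default_rng(0)
nir, swir, red = (rng.random((100, 100)) for _ in range(3))   # stand-in reflectance bands

# A bare-land style index (the exact band combination of MNDBI is an assumption here).
bare_index = normalized_difference(swir, nir)
ndvi = normalized_difference(nir, red)

# Threshold-based land-cover labelling; thresholds would be optimized per scene and date.
landcover = np.full(bare_index.shape, "other", dtype=object)
landcover[ndvi > 0.3] = "vegetation"
landcover[(bare_index > 0.1) & (ndvi <= 0.3)] = "bare/impervious"
print({c: int((landcover == c).sum()) for c in ("vegetation", "bare/impervious", "other")})
```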
Keywords: Land cover, mapping, multi-temporal, spectral indices.
7730 Rule Insertion Technique for Dynamic Cell Structure Neural Network
Authors: Osama Elsarrar, Marjorie Darrah, Richard Devin
Abstract:
This paper discusses the idea of capturing an expert’s knowledge in the form of human understandable rules and then inserting these rules into a dynamic cell structure (DCS) neural network. The DCS is a form of self-organizing map that can be used for many purposes, including classification and prediction. This particular neural network is considered to be a topology preserving network that starts with no pre-structure, but assumes a structure once trained. The DCS has been used in mission and safety-critical applications, including adaptive flight control and health-monitoring in aerial vehicles. The approach is to insert expert knowledge into the DCS before training. Rules are translated into a pre-structure and then training data are presented. This idea has been demonstrated using the well-known Iris data set and it has been shown that inserting the pre-structure results in better accuracy with the same training.
Keywords: Neural network, rule extraction, rule insertion, self-organizing map.
7729 Partial 3D Reconstruction using Evolutionary Algorithms
Authors: Mónica Pérez-Meza, Rodrigo Montúfar-Chaveznava
Abstract:
When reconstructing a scenario, it is necessary to know the structure of the elements present in the scene in order to interpret it. In this work we link 3D scene reconstruction to evolutionary algorithms through stereo vision theory. We consider stereo vision as a method that reconstructs a scene using only a pair of images of the scene and some computation. From several images of a scene captured from different positions, stereo vision can give us an idea of the three-dimensional characteristics of the world. Stereo vision usually requires two cameras, in analogy to the mammalian visual system. In this work we employ only one camera, which is translated along a path, capturing images at fixed intervals. As we cannot perform all the computations required for an exhaustive reconstruction, we employ an evolutionary algorithm to partially reconstruct the scene in real time. The algorithm employed is the fly algorithm, which uses "flies" to reconstruct the principal characteristics of the world following certain evolutionary rules.
Keywords: 3D Reconstruction, Computer Vision, Evolutionary Algorithms, Vision Stereo.
7728 Development of a Pipeline Monitoring System by Bio-mimetic Robots
Authors: Seung You Na, Daejung Shin, Jin Young Kim, Joo Hyun Jung, Yong-Gwan Won
Abstract:
Exploring pipelines is one of various bio-mimetic robot applications. The robot may work in common buildings, for example between ceilings and inside ducts, in addition to the complicated and massive pipeline systems of large industrial plants. The bio-mimetic robot finds any troubled area or malfunction and then reports its data. Importantly, it can not only prepare for but also react to any abnormal routes in the pipeline. Pipeline monitoring tasks require special types of mobile robots; for effective movement along a pipeline, the robot moves similarly to insects or crawling animals. During its movement along the pipelines, a pipeline monitoring robot has the important task of recognizing the shape of the approaching path on the pipes. In this paper we propose an effective solution to pipeline pattern recognition, based on fuzzy classification rules applied to the measured IR distance data.
Keywords: Bio-mimetic robots, plant pipe monitoring, pipe pattern recognition.
7727 Using Swarm Intelligence for Improving Accuracy of Fuzzy Classifiers
Authors: Hassan M. Elragal
Abstract:
This paper discusses a method for improving the accuracy of fuzzy-rule-based classifiers using particle swarm optimization (PSO). Two different fuzzy classifiers are considered and optimized. The first classifier is based on the Mamdani fuzzy inference system (M_PSO fuzzy classifier). The second classifier is based on the Takagi-Sugeno fuzzy inference system (TS_PSO fuzzy classifier). The parameters of the proposed fuzzy classifiers, including premise (antecedent) parameters, consequent parameters and the structure of the fuzzy rules, are optimized using PSO. Experimental results show that higher classification accuracy can be obtained with a lower number of fuzzy rules by using the proposed PSO fuzzy classifiers. The performances of the M_PSO and TS_PSO fuzzy classifiers are compared to other fuzzy-based classifiers.
Keywords: Fuzzy classifier, optimization of fuzzy system parameters, particle swarm optimization, pattern classification.
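As a minimal, hedged illustration of the optimization loop (not the authors' Mamdani or TS systems), a global-best PSO tunes the Gaussian premise parameters of a toy two-rule zero-order Takagi-Sugeno classifier on synthetic 1-D data; the rule structure, fixed consequents and PSO constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D two-class data (stand-in for a real feature).
x_data = np.concatenate([rng.normal(2.0, 0.7, 100), rng.normal(5.0, 0.7, 100)])
y_data = np.concatenate([np.zeros(100), np.ones(100)])

def ts_fuzzy_predict(params, x):
    """Two-rule zero-order Takagi-Sugeno classifier: rule i = Gaussian(m_i, s_i) -> class i."""
    m1, s1, m2, s2 = params
    w1 = np.exp(-0.5 * ((x - m1) / (abs(s1) + 1e-6)) ** 2)
    w2 = np.exp(-0.5 * ((x - m2) / (abs(s2) + 1e-6)) ** 2)
    score = (w1 * 0.0 + w2 * 1.0) / (w1 + w2 + 1e-12)   # consequents fixed at 0 and 1
    return (score > 0.5).astype(int)

def fitness(params):
    return (ts_fuzzy_predict(params, x_data) == y_data).mean()

# Standard global-best PSO over the premise (membership) parameters.
N, D, ITERS = 20, 4, 50
pos = rng.uniform(0, 7, size=(N, D))
vel = np.zeros((N, D))
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((N, D)), rng.random((N, D))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([fitness(p) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("best accuracy:", round(pbest_f.max(), 3), "params:", np.round(gbest, 2))
```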
7726 A Taxonomy of Routing Protocols in Wireless Sensor Networks
Authors: A. Kardi, R. Zagrouba, M. Alqahtani
Abstract:
The Internet of Everything (IoE) presents today a very attractive and motivating field of research. It is basically based on Wireless Sensor Networks (WSNs), in which routing is the major analysis topic; in fact, it directly affects the effectiveness and the lifetime of the network. This paper, developed from recent works and based on extensive research, proposes a taxonomy of routing protocols in WSNs. Our main contribution is a classification model based on nine classes, namely application type, delivery mode, initiator of communication, network architecture, path establishment (route discovery), network topology (structure), protocol operation, next hop selection, and latency-awareness and energy-efficiency. In order to provide a complete classification pattern to serve as a reference for network designers, each class is subdivided into possible subclasses, presented, and discussed using different parameters such as purposes and characteristics.
Keywords: WSNs, sensor, routing protocols, survey.
7725 Reformulations of Big Bang-Big Crunch Algorithm for Discrete Structural Design Optimization
Authors: O. Hasançebi, S. Kazemzadeh Azad
Abstract:
In the present study, the efficiency of the Big Bang-Big Crunch (BB-BC) algorithm is investigated in discrete structural design optimization. It is shown that a standard version of the BB-BC algorithm is sometimes unable to produce reasonable solutions to problems from discrete structural design optimization. Two reformulations of the algorithm, referred to as modified BB-BC (MBB-BC) and exponential BB-BC (EBB-BC), are introduced to enhance the capability of the standard algorithm in locating good solutions for steel truss and frame type structures, respectively. The performance of the proposed algorithms is evaluated and compared to the standard version as well as some other algorithms over several practical design examples. In these examples, steel structures are sized for minimum weight subject to stress, stability and displacement limitations according to the provisions of AISC-ASD.
Keywords: Structural optimization, discrete optimization, metaheuristics, big bang-big crunch (BB-BC) algorithm, design optimization of steel trusses and frames.
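A minimal sketch of the standard BB-BC loop on a stand-in objective (the paper's actual objective is the penalized member weight from a structural analysis, and its reformulations are not shown): a random Big Bang, a fitness-weighted centre of mass as the Big Crunch, and a new population scattered around it with a spread that shrinks with the iteration count. The bounds, population size and the final rounding to discrete sections are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def weight(x):
    # Stand-in objective: in the paper this would be the penalized truss/frame weight
    # evaluated from a structural analysis under AISC-ASD constraints.
    return np.sum((x - 3.0) ** 2) + 1.0

D, N, ITERS = 6, 30, 80
lo, hi, alpha = 0.0, 10.0, 1.0
pop = rng.uniform(lo, hi, size=(N, D))          # initial Big Bang

best_x, best_f = None, np.inf
for k in range(1, ITERS + 1):
    f = np.array([weight(x) for x in pop])
    if f.min() < best_f:
        best_f, best_x = f.min(), pop[f.argmin()].copy()

    # Big Crunch: fitness-weighted centre of mass (smaller weight -> larger influence).
    inv = 1.0 / f
    xc = (pop * inv[:, None]).sum(axis=0) / inv.sum()

    # New Big Bang around the centre of mass; spread shrinks as iteration k grows.
    pop = np.clip(xc + rng.normal(size=(N, D)) * alpha * (hi - lo) / k, lo, hi)
    # For discrete sizing, each coordinate would be mapped to the nearest available
    # section index at this point (an assumption about the mapping).

print("best objective:", round(best_f, 4), "at", np.round(best_x, 2))
```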
7724 Delineating Concern Ground in Block Caving – Underground Mine Using Ground Penetrating Radar
Authors: Eric Sitorus, Septian Prahastudhi, Turgod Nainggolan, Erwin Riyanto
Abstract:
Mining by block or panel caving is a method that takes advantage of fractures within an ore body, coupled with gravity, to extract material from a predetermined column of ore. The caving column is weakened from beneath through undercutting, after which the ore breaks up and is extracted from below in a continuous cycle. The nature of this method induces cyclical stresses on the pillars of excavations as stress is built up and released over time, which has a detrimental effect on both the installed ground support and the rock mass itself. The ground support, especially on the production level, where the excavation void ratio is highest, is subjected to heavy loading. Strain above the elongation threshold of the support capacity can cause yielding, resulting in damage to excavations. Geotechnical engineers must not only evaluate the remnant capacity of ground support systems but also investigate the depth of rock mass yield within pillars, backs and floors. Ground Penetrating Radar (GPR) is a geophysical method that can evaluate rock mass damage using electromagnetic waves. This paper illustrates a case study from the Grasberg mining complex where non-invasive information on the depth of damage and the condition of the remaining rock mass was required. GPR with a 100 MHz antenna resolution was used to obtain images of the subsurface and determine rehabilitation requirements prior to recommencing production activities. The GPR surveys were used to calibrate the reflection coefficient response of varying rock mass conditions against known Rock Quality Designation (RQD) parameters observed at the mine. The calibrated GPR survey allowed site engineers to map subsurface conditions and plan rehabilitation accordingly.
Keywords: Block caving, ground penetrating radar, reflectivity, RQD.
7723 A New Method for Contour Approximation Using Basic Ramer Idea
Authors: Ali Abdrhman Ukasha
Abstract:
This paper presents two new efficient algorithms for contour approximation. The proposed algorithms are compared with the Ramer (good quality), Triangle (faster) and Trapezoid (fastest) methods, which are briefly described. The Cartesian coordinates of an input contour are processed in such a manner that the contour is finally represented by a set of selected vertices of its edge. The main idea of the analyzed procedures for contour compression is presented. For comparison, the mean square error and signal-to-noise ratio criteria are used. The computational time of the analyzed methods is estimated based on the number of numerical operations. Experimental results are reported in terms of image quality, compression ratio, and speed. The main advantage of the analyzed algorithm is the small number of arithmetic operations compared to the existing algorithms.
Keywords: Polygonal approximation, Ramer, Triangle and Trapezoid methods.
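The basic Ramer idea referenced in the title is the familiar recursive split at the point farthest from the chord; a minimal sketch follows (the authors' two new algorithms are not reproduced), with the tolerance value and the test contour chosen arbitrarily.

```python
import numpy as np

def ramer(points, epsilon):
    """Classic Ramer polyline simplification: keep the point farthest from the
    start-end chord if its distance exceeds epsilon, and recurse on both halves."""
    points = np.asarray(points, dtype=float)
    if len(points) < 3:
        return points
    start, end = points[0], points[-1]
    chord = end - start
    norm = np.hypot(*chord) + 1e-12
    # Perpendicular distance of every intermediate point to the chord (2-D cross product).
    diffs = points[1:-1] - start
    dists = np.abs(chord[0] * diffs[:, 1] - chord[1] * diffs[:, 0]) / norm
    idx = np.argmax(dists) + 1
    if dists[idx - 1] > epsilon:
        left = ramer(points[:idx + 1], epsilon)
        right = ramer(points[idx:], epsilon)
        return np.vstack([left[:-1], right])
    return np.vstack([start, end])

t = np.linspace(0, 2 * np.pi, 200)
contour = np.column_stack([np.cos(t), np.sin(t)])          # a dense circular contour
approx = ramer(contour, epsilon=0.05)
print(len(contour), "points reduced to", len(approx))
```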
7722 Improving Spatiotemporal Change Detection: A High Level Fusion Approach for Discovering Uncertain Knowledge from Satellite Image Database
Authors: Wadii Boulila, Imed Riadh Farah, Karim Saheb Ettabaa, Basel Solaiman, Henda Ben Ghezala
Abstract:
This paper investigates the problem of tracking spatiotemporal changes of a satellite image through the use of Knowledge Discovery in Databases (KDD). The purpose of this study is to help a given user effectively discover interesting knowledge and then build prediction and decision models. Unfortunately, the KDD process for spatiotemporal data is always marked by several types of imperfections. In our paper, we take these imperfections into consideration in order to provide more accurate decisions. To achieve this objective, different KDD methods are used to discover knowledge in satellite image databases. Each method presents a different point of view of the spatiotemporal evolution of a query model (which represents an extracted object from a satellite image). In order to combine these methods, we use evidence fusion theory, which considerably improves the spatiotemporal knowledge discovery process and increases our belief in the spatiotemporal model change. Experimental results on satellite images representing the region of Auckland in New Zealand show the improvement in overall change detection compared to classical methods.
Keywords: Knowledge discovery in satellite databases, knowledge fusion, data imperfection, data mining, spatiotemporal change detection.
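Assuming the evidence fusion referred to here is Dempster-Shafer-style combination (the exact formalism is not spelled out in the abstract), the sketch below combines two hypothetical mass functions about the change state of a query model with Dempster's rule; all mass values are made up for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Masses are dicts mapping frozenset hypotheses (subsets of the frame) to belief mass.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Hypothetical evidence from two KDD methods about the change state of a query model.
frame = {"changed", "unchanged"}
m_method1 = {frozenset({"changed"}): 0.6, frozenset(frame): 0.4}            # 0.4 = ignorance
m_method2 = {frozenset({"changed"}): 0.5, frozenset({"unchanged"}): 0.2,
             frozenset(frame): 0.3}

fused = dempster_combine(m_method1, m_method2)
print({tuple(sorted(h)): round(v, 3) for h, v in fused.items()})
```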
7721 A Specification-Based Approach for Retrieval of Reusable Business Component for Software Reuse
Authors: Meng Fanchao, Zhan Dechen, Xu Xiaofei
Abstract:
Software reuse can be considered the most realistic and promising way to improve software engineering productivity and quality. Automated assistance for software reuse involves the representation, classification, retrieval and adaptation of components. The representation and retrieval of components are important to software reuse in Component-Based Software Development (CBSD). However, current industrial component models mainly focus on implementation techniques and ignore the semantic information about components, so it is difficult to retrieve the components that satisfy users' requirements. This paper presents a method of business component retrieval based on specification matching, to support the software reuse of enterprise information systems. First, a reuse-oriented business component model is proposed. In our model, the business data type is represented as a sign data type based on XML, which can express the variable business data types that describe the variety of business operations. Based on this model, we propose specification match relationships at two levels: the business operation level and the business component level. At the business operation level, we use the input business data types, the output business data types and the taxonomy of business operations to evaluate the similarity between business operations. At the business component level, we propose five specification matches between business components. To retrieve reusable business components, we propose similarity degree measures to calculate the similarities between business components. Finally, an SQL-like business component retrieval command is proposed to help users retrieve approximate business components from the component repository.
Keywords: Business component, business operation, business data type, specification matching.
7720 Wavelet Feature Selection Approach for Heart Murmur Classification
Authors: G. Venkata Hari Prasad, P. Rajesh Kumar
Abstract:
Phonocardiography is important in the appraisal of congenital heart disease and pulmonary hypertension, as it reflects the duration of right ventricular systoles. The systolic murmur in patients with an intra-cardiac shunt decreases as pulmonary hypertension develops and may eventually disappear completely as the pulmonary pressure reaches systemic level. Phonocardiography and auscultation are non-invasive, low-cost, and accurate methods to assess heart disease. In this work, an objective signal processing tool that extracts information from the phonocardiography signal using wavelets is proposed to classify murmurs as normal or abnormal. Since the feature vector is large, a Binary Particle Swarm Optimization (PSO) with mutation is proposed for feature selection. The extracted features improve the classification accuracy and were tested across various classifiers including Naïve Bayes, kNN, C4.5, and SVM.
Keywords: Phonocardiography, Coiflet, feature selection, particle swarm optimization.
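A minimal sketch of the wavelet feature-extraction stage (the binary PSO feature-selection step is omitted): relative sub-band energies from a Coiflet decomposition are fed to a kNN classifier. The synthetic phonocardiogram segments, sampling rate, wavelet level and classifier choice are all assumptions.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def wavelet_energy_features(signal, wavelet="coif1", level=5):
    """Relative sub-band energies from a Coiflet wavelet decomposition of a PCG segment."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / (energies.sum() + 1e-12)

# Stand-in phonocardiogram segments (1 s at 2 kHz); real data would come from recordings.
def fake_pcg(murmur):
    t = np.linspace(0, 1, 2000)
    s1s2 = np.sin(2 * np.pi * 40 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.9)
    murmur_band = 0.5 * np.sin(2 * np.pi * 220 * t) if murmur else 0.0
    return s1s2 + murmur_band + 0.3 * rng.normal(size=t.size)

labels = [0] * 40 + [1] * 40
X = np.array([wavelet_energy_features(fake_pcg(m)) for m in labels])
y = np.array(labels)

clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print("training accuracy on the stand-in data:", round(clf.score(X, y), 3))
```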
7719 A New Biologically Inspired Pattern Recognition Approach for Face Recognition
Authors: V. Kabeer, N.K.Narayanan
Abstract:
This paper reports a new pattern recognition approach for face recognition. The biological model of light receptors, the cones and rods in human eyes, and the way they are associated with pattern vision forms the basis of this approach. The functional model is simulated using CWD and WPD. The paper also discusses the experiments performed for face recognition using features extracted from images in the AT&T face database. Artificial Neural Network and k-Nearest Neighbour classifier algorithms are employed for recognition. A feature vector is formed for each face image in the database, and recognition accuracies are computed and compared across the classifiers. Simulation results show that the proposed method outperforms the traditional feature extraction methods prevailing in pattern recognition in terms of recognition accuracy for face images with pose and illumination variations.
Keywords: Face recognition, Image analysis, Wavelet feature extraction, Pattern recognition, Classifier algorithms
7718 Wavelet Entropy Based Algorithm for Fault Detection and Classification in FACTS Compensated Transmission Line
Authors: Amany M. El-Zonkoly, Hussein Desouki
Abstract:
Distance protection of transmission lines that include advanced flexible AC transmission system (FACTS) devices has been a very challenging task. The FACTS devices of interest in this paper are static synchronous series compensators (SSSC) and the unified power flow controller (UPFC). In this paper, a new algorithm is proposed to detect and classify the fault and identify the fault position in a transmission line with respect to a FACTS device placed at the midpoint of the line. Discrete wavelet transformation and wavelet entropy calculations are used to analyze the during-fault current and voltage signals of the compensated transmission line. The proposed algorithm is very simple and accurate in fault detection and classification. A variety of fault cases and simulation results are presented to show the effectiveness of the algorithm.
Keywords: Entropy calculation, FACTS, SSSC, UPFC, wavelet transform.
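A minimal sketch of the wavelet-entropy computation itself (the FACTS simulation model and the detection/classification logic are not reproduced): a signal is decomposed with a discrete wavelet transform and the Shannon entropy of the relative sub-band energies is computed for a crude stand-in healthy and faulted phase current. The wavelet, level, sampling rate and waveforms are assumptions.

```python
import numpy as np
import pywt

def wavelet_entropy(signal, wavelet="db4", level=4):
    """Wavelet energy entropy: Shannon entropy of the relative sub-band energies."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / (energies.sum() + 1e-12)
    return -np.sum(p * np.log(p + 1e-12))

fs, f0 = 5000, 50                       # sampling rate and power frequency (assumptions)
t = np.arange(0, 0.2, 1 / fs)
healthy = np.sin(2 * np.pi * f0 * t)
# Crude stand-in for a during-fault phase current: amplitude jump plus a decaying
# high-frequency transient starting at t = 0.1 s.
fault = np.where(t < 0.1, healthy,
                 6 * np.sin(2 * np.pi * f0 * t) +
                 0.8 * np.sin(2 * np.pi * 900 * t) * np.exp(-40 * (t - 0.1)))

print("entropy healthy:", round(wavelet_entropy(healthy), 3))
print("entropy fault  :", round(wavelet_entropy(fault), 3))
# A detection/classification rule would compare per-phase entropies against thresholds.
```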