Search results for: random forest algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6104

5294 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has transformed the IT industry through its robustness and decision-support capabilities. It applies a set of data processing techniques and algorithms to build predictive models, based on finding relationships between explanatory variables and predicted variables: past occurrences are exploited to predict unknown outcomes. With the advent of big data, many studies have suggested using predictive analytics to process and analyze big data. Nevertheless, these efforts have been curbed by the limits of classical predictive analysis methods on large amounts of data. Because of its volume, its nature (semi-structured or unstructured) and its variety, big data cannot be analyzed efficiently by classical methods of predictive analysis. The authors attribute this weakness to the fact that classical predictive analysis algorithms do not allow calculation to be parallelized and distributed. In this paper, we propose to extend the predictive analysis algorithm Classification and Regression Trees (CART) to adapt it for big data analysis. The major changes to the algorithm are presented, and a version of the extended algorithm is then defined that is applicable to very large quantities of data.
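The paper's distributed extension of CART is not reproduced in the abstract. As a point of reference, the serial core that such an extension parallelizes, namely an exhaustive search for the split minimizing weighted Gini impurity, can be sketched as follows (function names and the toy data are illustrative, not the authors'):

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Exhaustive search over (feature, threshold) pairs for the split
    that minimizes the weighted Gini impurity of the two children."""
    n, d = X.shape
    best = (None, None, np.inf)
    for j in range(d):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue  # degenerate split: one child empty
            score = (left.sum() * gini(y[left])
                     + (~left).sum() * gini(y[~left])) / n
            if score < best[2]:
                best = (j, t, score)
    return best

# Toy data: one feature, perfectly separable at threshold 1.0.
print(best_split(np.array([[0.0], [1.0], [2.0], [3.0]]),
                 np.array([0, 0, 1, 1])))
```

A big-data variant would distribute the two loops, for example by evaluating candidate splits for different features or data blocks on different workers and reducing to the globally best split.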

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 135
5293 Seismic Response Mitigation of Structures Using Base Isolation System Considering Uncertain Parameters

Authors: Rama Debbarma

Abstract:

The present study deals with the performance of a linear base isolation system in mitigating the seismic response of structures characterized by random system parameters. This involves optimizing the tuning ratio and damping properties of the base isolation system while considering uncertain system parameters. The efficiency of a base isolator may be reduced if, owing to the unavoidable presence of parameter uncertainty, it is not tuned to the vibrating mode it is designed to suppress. With the aid of matrix perturbation theory and a first-order Taylor series expansion, the total probability concept is used to evaluate the unconditional response of the primary structure considering random system parameters. For this, the conditional second-order information of the response quantities is obtained in a random vibration framework using a state-space formulation. Subsequently, the maximum unconditional root mean square displacement of the primary structure is used as the objective function to obtain the optimum damping parameters. A numerical study is performed to elucidate the effect of parameter uncertainties on the optimization of the linear base isolator parameters and on system performance.

Keywords: linear base isolator, earthquake, optimization, uncertain parameters

Procedia PDF Downloads 422
5292 Collocation Method Using Quartic B-Splines for Solving the Modified RLW Equation

Authors: A. A. Soliman

Abstract:

The Modified Regularized Long Wave (MRLW) equation is solved numerically by a new algorithm based on the collocation method, using quartic B-splines at the mid-knot points as element shape functions. The resulting system of first-order ordinary differential equations is solved with the fourth-order Runge-Kutta method instead of the finite difference method. Test problems, including the migration and interaction of solitary waves, are used to validate the algorithm, which is found to be accurate and efficient. The three invariants of the motion are evaluated to determine the conservation properties of the algorithm. The temporal evolution of a Maxwellian initial pulse is then studied.
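The time integrator named above is the classical fourth-order Runge-Kutta scheme: once collocation reduces the MRLW equation to a system of first-order ODEs y' = f(t, y), each time step advances the B-spline coefficients as sketched below (a generic RK4 step, not the authors' code):

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y).
    Works unchanged for scalar y or (with NumPy arrays) for the vector
    of collocation unknowns."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Sanity check on y' = y, y(0) = 1: one step of size 0.1 should be
# within O(h^5) of exp(0.1).
print(rk4_step(lambda t, y: y, 0.0, 1.0, 0.1), math.exp(0.1))
```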

Keywords: collocation method, MRLW equation, Quartic B-splines, solitons

Procedia PDF Downloads 299
5291 Educational Data Mining: The Case of the Department of Mathematics and Computing in the Period 2009-2018

Authors: Mário Ernesto Sitoe, Orlando Zacarias

Abstract:

University education is influenced by several factors, ranging from the adoption of strategies that strengthen the whole process to improvements in the academic performance of the students themselves. This work uses data mining techniques to develop a predictive model that identifies students with a tendency toward evasion and retention. To this end, a database of real student data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used, comprising 388 undergraduate students admitted between 2009 and 2014. The Weka tool was used for model building with three different techniques: k-nearest neighbors, random forest, and logistic regression. To allow training on multiple train-test splits, cross-validation was employed with a varying number of folds. To reduce bias and variance and improve the performance of the models, the ensemble methods bagging and stacking were used. After comparing the results of the three classifiers, logistic regression with bagging and seven folds performed best, scoring above 90% on all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
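The paper's experiments were run in Weka; the same best-performing configuration, bagged logistic regression evaluated with 7-fold cross-validation, can be sketched in scikit-learn. The synthetic data below merely stands in for the real student records, which are not available here:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the 388 student records used in the paper.
X, y = make_classification(n_samples=388, n_features=10, random_state=0)

# Bagging around logistic regression, scored with 7-fold cross-validation.
model = BaggingClassifier(LogisticRegression(max_iter=1000),
                          n_estimators=25, random_state=0)
scores = cross_val_score(model, X, y, cv=7, scoring="accuracy")
print(scores.mean())
```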

Keywords: evasion and retention, cross-validation, bagging, stacking

Procedia PDF Downloads 77
5290 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method

Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri

Abstract:

Recent research in neural networks and neuroscience on modeling complex time series and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series, and locally weighted projection regression (LWPR) is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, learning scenarios for one- and two-dimensional data series with different distributions are investigated by simulation; noise is then added to the data to create differently disordered distributions, in order to evaluate the algorithm's ability to predict nonlinearity locally. The performance of the algorithm is simulated, and its sensitivity to the data distribution, in particular when the data are widely spread or scarce, is explained, together with the influence of the algorithm's important local-validity parameter under different data distributions.

Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method

Procedia PDF Downloads 495
5289 Performance Comparison of Cooperative Banks in the EU, USA and Canada

Authors: Matěj Kuc

Abstract:

This paper compares profitability measures of cooperative banks from two developed regions: the European Union, and the United States of America together with Canada. We created a balanced dataset of more than 200 cooperative banks covering the 2011-2016 period, performed a series of tests, and ran random-effects estimation on the panel data. We found that American and Canadian cooperatives are more profitable in terms of return on assets (ROA) and return on equity (ROE), while there is no significant difference in net interest margin (NIM). Our results show that the North American cooperative banks have accommodated better to the current market environment.

Keywords: cooperative banking, panel data, profitability measures, random effects

Procedia PDF Downloads 109
5288 Joint Path and Push Planning among Moveable Obstacles

Authors: Victor Emeli, Akansel Cosgun

Abstract:

This paper explores the navigation among movable obstacles (NAMO) problem and proposes joint path and push planning: deciding which path to take and in which directions the obstacles should be pushed, given start and goal positions. We present a planning algorithm for selecting a path and the obstacles to be pushed, where a rapidly-exploring random tree (RRT)-based heuristic is employed to calculate a minimal-collision path. When a pushing force must be applied to slide an obstacle out of the way, the planner leverages means-end analysis through a dynamic physics simulation to determine the sequence of linear pushes that clears the necessary space. Simulation experiments show that our approach finds solutions at higher clutter percentages (up to 49%) than a straight-line push planner (37%) or RRT without pushing (18%).
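The RRT heuristic at the core of the planner can be sketched in its simplest form. The version below is a minimal 2D RRT in a 10x10 workspace with a pluggable collision check and no push actions; the paper's obstacle-pushing and physics-simulation layers are omitted, and all names and constants are illustrative:

```python
import math
import random

def rrt(start, goal, is_free, step=0.5, iters=2000, goal_tol=0.5):
    """Minimal 2D RRT: grow a tree from `start` by steering each new
    sample toward its nearest tree node; return a path when a node
    lands within `goal_tol` of `goal`, else None."""
    nodes, parent = [start], {0: None}
    rng = random.Random(0)  # fixed seed for reproducibility
    for _ in range(iters):
        # Goal-biased sampling: 10% of the time aim straight at the goal.
        sample = goal if rng.random() < 0.1 else (rng.uniform(0, 10),
                                                  rng.uniform(0, 10))
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        # Steer a fixed step from the nearest node toward the sample.
        new = (nx + step * (sample[0] - nx) / d,
               ny + step * (sample[1] - ny) / d)
        if not is_free(new):
            continue
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            # Walk parent links back to reconstruct the path.
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

# Obstacle-free workspace: every point is collision-free.
path = rrt((1.0, 1.0), (9.0, 9.0), lambda p: True)
print(len(path), "waypoints")
```

In the paper's setting, `is_free` would consult the clutter map, and a failed extension would trigger the push planner instead of being discarded.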

Keywords: motion planning, path planning, push planning, robot navigation

Procedia PDF Downloads 160
5287 A Hybrid Distributed Algorithm for Solving Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a distributed hybrid algorithm is proposed for solving the job shop scheduling problem. The suggested method executes different artificial neural networks, heuristics, and meta-heuristics simultaneously on more than one machine. The neural networks handle the constraints of the problem, the meta-heuristics search the global space, and the heuristics prevent premature convergence. To obtain an efficient distributed intelligent method for solving large, distributed job shop scheduling problems, the Apache Spark and Hadoop frameworks are used, and new approaches are applied in the design and implementation of the algorithm. Comparison with other efficient algorithms from the literature demonstrates the efficiency of the proposed algorithm, which is able to solve large problems in a short time.

Keywords: distributed algorithms, Apache Spark, Hadoop, job shop scheduling, neural network

Procedia PDF Downloads 382
5286 Wait-Optimized Scheduler Algorithm for Efficient Process Scheduling in Computer Systems

Authors: Md Habibur Rahman, Jaeho Kim

Abstract:

Efficient process scheduling is crucial to optimal system performance and resource utilization in computer systems. While various algorithms have been proposed over the years, their effectiveness is still limited. This paper introduces a new Wait-Optimized Scheduler (WOS) algorithm that aims to minimize process waiting time by dividing processes into two layers and considering both processing time and waiting time. The WOS algorithm is non-preemptive and prioritizes processes with the shortest WOS value. In the first layer, each process runs for a predetermined duration, and any unfinished process is moved to the second layer, which decreases response time. Whenever the first layer is free, or the number of processes in the second layer is twice that in the first layer, the algorithm sorts the processes in the second layer by their remaining time minus waiting time and sends one process to the first layer to run. This ensures that all processes eventually run, optimizing waiting time. To evaluate the performance of the WOS algorithm, we conducted experiments comparing it with traditional scheduling algorithms such as First-Come-First-Serve (FCFS) and Shortest-Job-First (SJF). The results showed that the WOS algorithm outperformed the traditional algorithms in reducing process waiting time, particularly in scenarios with many short tasks and long wait times. Our study highlights the effectiveness of the WOS algorithm in improving process scheduling efficiency: by reducing process waiting time, it can improve system performance and resource utilization. These findings provide valuable insights for researchers and practitioners developing and implementing efficient process scheduling algorithms.
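The WOS algorithm itself is specified only in prose above, but the two baselines it is compared against are easy to make concrete. The sketch below computes average waiting time for FCFS and non-preemptive SJF on a set of hypothetical CPU bursts (all jobs arriving at time 0); the burst values are invented for illustration:

```python
def avg_waiting_time(burst_times, order):
    """Average waiting time when jobs (all arriving at t=0) run
    non-preemptively in the given order of indices."""
    t, total_wait = 0, 0
    for i in order:
        total_wait += t          # job i waits until all earlier jobs finish
        t += burst_times[i]
    return total_wait / len(burst_times)

bursts = [8, 1, 3, 2, 6]  # hypothetical CPU burst times

# FCFS runs jobs in arrival order; SJF sorts them by burst length.
fcfs = avg_waiting_time(bursts, range(len(bursts)))
sjf = avg_waiting_time(bursts, sorted(range(len(bursts)),
                                      key=lambda i: bursts[i]))
print(fcfs, sjf)  # SJF minimizes average waiting time for this setting
```

SJF is provably optimal for average waiting time when all jobs arrive together; the paper's contribution is in also bounding how long a long job can starve, which the two-layer rule above addresses.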

Keywords: process scheduling, wait-optimized scheduler, response time, non-preemptive, waiting time, traditional scheduling algorithms, first-come-first-serve, shortest-job-first, system performance, resource utilization

Procedia PDF Downloads 82
5285 Comparison of ANFIS Update Methods Using Genetic Algorithm, Particle Swarm Optimization, and Artificial Bee Colony

Authors: Michael R. Phangtriastu, Herriyandi Herriyandi, Diaz D. Santika

Abstract:

This paper presents a comparison of metaheuristic algorithms for training the antecedent and consequent parameters of an adaptive network-based fuzzy inference system (ANFIS). The algorithms compared are the genetic algorithm (GA), particle swarm optimization (PSO), and artificial bee colony (ABC). The objective is to benchmark these well-known metaheuristics on several data sets of different natures, testing combinations of the algorithms' parameters. In all algorithms, different population sizes are tested; in PSO, combinations of velocity parameters are tested; and in ABC, different abandonment limits are tested. The experiments show that ABC is more reliable than the other algorithms, achieving a better mean square error (MSE) on all data sets.

Keywords: ANFIS, artificial bee colony, genetic algorithm, metaheuristic algorithm, particle swarm optimization

Procedia PDF Downloads 346
5284 A Sequential Approach for Random-Effects Meta-Analysis

Authors: Samson Henry Dogo, Allan Clark, Elena Kulinskaya

Abstract:

The objective of meta-analysis is to combine the results of several independent studies in order to generalize and provide an evidence base for decision making. Recent studies, however, show that in many areas of research the magnitude of reported effect size estimates changes with the year of publication, which can impair the results and conclusions of a meta-analysis. A number of sequential methods have been proposed for monitoring effect size estimates in meta-analysis, but they are based on statistical theory applicable to the fixed-effect model (FEM). For the random-effects model (REM), the analysis incorporates the heterogeneity variance, tau-squared, whose estimation creates complications. This paper proposes the use of the truncated CUSUM-type test of Gombay and Serban (2005), with asymptotically valid critical values, for sequential monitoring under the REM. Simulation results show that the test does not control the Type I error well and is therefore not recommended; further work is required to derive an appropriate test in this important area of application.
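The abstract does not state which tau-squared estimator is used; the standard choice in random-effects meta-analysis is the DerSimonian-Laird moment estimator, sketched below together with the resulting random-effects pooled estimate (a generic textbook formula, not code from the paper):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird estimate of the heterogeneity variance tau^2
    and the resulting random-effects pooled effect."""
    w = 1.0 / variances                      # fixed-effect weights
    mu_fe = np.sum(w * effects) / np.sum(w)  # fixed-effect pooled mean
    q = np.sum(w * (effects - mu_fe) ** 2)   # Cochran's Q statistic
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)            # truncate at zero
    w_re = 1.0 / (variances + tau2)          # random-effects weights
    return tau2, np.sum(w_re * effects) / np.sum(w_re)

# Three hypothetical studies with identical effects: no heterogeneity.
print(dersimonian_laird(np.array([0.2, 0.2, 0.2]),
                        np.array([0.1, 0.1, 0.1])))
```

A sequential monitoring scheme of the kind discussed above would recompute this estimate as each new study arrives and feed the updated pooled effect into the CUSUM-type statistic.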

Keywords: meta-analysis, random-effects model, sequential test, temporal changes in effect sizes

Procedia PDF Downloads 461
5283 Impacts of Land Use and Land Cover Change on Stream Flow and Sediment Yield of Genale Dawa Dam III Watershed, Ethiopia

Authors: Aklilu Getahun Sulito

Abstract:

Land use and land cover change dynamics result from complex interactions between several bio-physical and socio-economic conditions. The impacts of land cover change on stream flow and sediment yield were analyzed statistically using the hydrological model SWAT. The Genale Dawa Dam III watershed is highly affected by deforestation, overgrazing, and agricultural land expansion. This study aimed to use the SWAT model to assess the impacts of land use and land cover change on sediment yield, to evaluate stream flow in the wet and dry seasons, and to map the spatial distribution of sediment yield from the sub-basins of the Genale Dawa Dam III watershed. Land use land cover (LULC) maps of 2000, 2008, and 2016 were used with the corresponding climate data. During the study period, most of the forest, dense evergreen forest, and grassland changed to cultivated land: cultivated land increased by 26.2%, while forest land, evergreen forest, and grassland decreased by 21.33%, 11.59%, and 7.28%, respectively. Accordingly, the mean annual sediment yield of the watershed increased by 7.37 ton/ha over the 16-year period (2000-2016). The analysis of stream flow for the wet and dry seasons showed that stream flow increased by 25.5% in the wet season but decreased by 29.6% in the dry season, and the average annual spatial distribution of sediment yield increased by 7.73 ton/ha per year from 2000 to 2016. The calibration results for stream flow and sediment yield showed good agreement between observed and simulated data, with coefficients of determination of 0.87 and 0.84, Nash-Sutcliffe efficiencies of 0.83 and 0.78, and percentage biases of -7.39% and -10.90%, respectively. Validation for stream flow and sediment also gave good results, with coefficients of determination of 0.83 and 0.80, Nash-Sutcliffe efficiencies of 0.78 and 0.75, and percentage biases of 7.09% and 3.95%.
The model thus indicates that the mean annual sediment load in the Genale Dawa Dam III watershed increased from 2000 to 2016 because of land use change, so land use management practices are needed in the future to prevent further increases in the sediment yield of the watershed.

Keywords: Genale Dawa Dam III watershed, land use land cover change, SWAT, spatial distribution, sediment yield, stream flow

Procedia PDF Downloads 47
5282 An Efficient Strategy for Relay Selection in Multi-Hop Communication

Authors: Jung-In Baik, Seung-Jun Yu, Young-Min Ko, Hyoung-Kyu Song

Abstract:

This paper proposes an efficient relaying algorithm that obtains diversity to improve the reliability of a signal. The algorithm achieves time or space diversity gain from multiple versions of the same signal transmitted through two routes, with relays placed between the source and destination. The routes between the source and destination are set adaptively in order to cope with different channels and noise. Each route consists of one or more relays, and the source transmits its signal to the destination through the routes; the signals from the relays are combined and detected at the destination. The proposed algorithm achieves a better bit error rate (BER) than conventional algorithms.

Keywords: multi-hop, OFDM, relay, relaying selection

Procedia PDF Downloads 440
5281 An Automated Optimal Robotic Assembly Sequence Planning Using Artificial Bee Colony Algorithm

Authors: Balamurali Gunji, B. B. V. L. Deepak, B. B. Biswal, Amrutha Rout, Golak Bihari Mohanta

Abstract:

Robots play an important role in manufacturing operations such as pick and place, assembly, spot welding, and more. Among these, assembly is a particularly important process: about 20% of manufacturing cost is accounted for by assembly. To perform the assembly task effectively, assembly sequence planning (ASP) is required. ASP is a multi-objective, non-deterministic optimization problem; achieving the optimal assembly sequence involves a huge search space and is highly complex in nature. Many researchers have applied different algorithms to the ASP problem, but these have several limitations, such as convergence to local optima, huge search spaces, long execution times, and complexity of application. With these limitations in mind, this paper proposes a new automated optimal robotic assembly sequence planning method using the artificial bee colony (ABC) algorithm. In this method, assembly predicates are extracted automatically through a computer-aided design (CAD) interface instead of manually, which reduces the time needed to obtain a feasible assembly sequence. The fitness of the obtained feasible sequences is then evaluated with the ABC algorithm to generate the optimal assembly sequence. The proposed methodology is applied to different industrial products, and the results are compared with past literature.

Keywords: assembly sequence planning, CAD, artificial Bee colony algorithm, assembly predicates

Procedia PDF Downloads 233
5278 Valorization of a Forest Waste, Modified P-Brutia Cones, by Biosorption of Methyl Green

Authors: Derradji Chebli, Abdallah Bouguettoucha, Abdelbaki Reffas, Khalil Guediri, Abdeltif Amrane

Abstract:

The removal of the dye Methyl Green (MG) from aqueous solutions using modified P. brutia cones (PBH and PBN) has been investigated in this work. Physical parameters such as pH, temperature, initial MG concentration, and ionic strength were examined in batch sorption experiments. Adsorption was conducted at the natural pH of 4.5 because the dye is only stable in the pH range 3.8 to 5. The experiments showed that P. brutia cones treated with NaOH (PBN) exhibited higher affinity and adsorption capacity for MG than cones treated with HCl (PBH), and the biosorption capacity of both modified cones (PBN and PBH) was enhanced by increasing the temperature. This is confirmed by the thermodynamic parameters (ΔG° and ΔH°), which show that the adsorption of MG was spontaneous and endothermic in nature, while the positive values of ΔS° suggest an increase in randomness for both adsorbents during the adsorption process. The pseudo-first-order and pseudo-second-order kinetic models and the intraparticle diffusion coefficient were examined to analyze the sorption process; the pseudo-second-order model best describes the adsorption of MG on PBN and PBH, with a correlation coefficient R² > 0.999. Ionic strength had a negative impact on the adsorption of MG on PBH, with a 68.5% reduction in adsorption capacity at Ce = 30 mg/L, whereas PBN showed no significant influence of ionic strength on adsorption, especially in the presence of NaCl. Among the tested isotherm models, the Langmuir isotherm was found to be the most relevant for describing MG sorption onto the modified P. brutia cones, with a correlation factor R² > 0.999. The capacity of P. brutia cones to remove the dye MG from aqueous solution was thus confirmed; the cones are, moreover, a widely available, low-cost forest biomaterial.

Keywords: adsorption, p-brutia cones, forest wastes, dyes, isotherm

Procedia PDF Downloads 371
5279 A New Optimization Algorithm for Operation of a Microgrid

Authors: Sirus Mohammadi, Rohala Moghimi

Abstract:

The main advantages of microgrids are high energy efficiency through the application of combined heat and power (CHP), high quality and reliability of the delivered electric energy, and environmental and economic benefits. This study presents an energy management system (EMS) to optimize the operation of a microgrid (MG). An adaptive modified firefly algorithm (AMFA) is presented for the optimal operation of a typical MG with renewable energy sources (RESs), accompanied by a back-up micro-turbine/fuel cell/battery hybrid power source to level power mismatches or store surplus energy when needed. The problem is formulated as a nonlinear constrained problem to minimize the total operating cost. The management of the energy storage system (ESS), economic load dispatch, and operation optimization of distributed generation (DG) are combined into a single-objective optimization problem in the EMS. The proposed algorithm is tested on a typical grid-connected MG including WT/PV/micro-turbine/fuel cell and energy storage devices (ESDs), and its superior performance is compared with that of other evolutionary algorithms such as the genetic algorithm (GA), particle swarm optimization (PSO), fuzzy self-adaptive PSO (FSAPSO), chaotic particle PSO (CPSO), adaptive modified PSO (AMPSO), and the firefly algorithm (FA).

Keywords: microgrid, operation management, optimization, firefly algorithm (AMFA)

Procedia PDF Downloads 335
5278 Discontinuous Spacetime with Vacuum Holes as Explanation for Gravitation, Quantum Mechanics and Teleportation

Authors: Constantin Z. Leshan

Abstract:

Hole Vacuum theory is based on a discontinuous spacetime that contains vacuum holes. Vacuum holes can explain gravitation and some laws of quantum mechanics, and they allow teleportation of matter. All massive bodies emit a flux of holes which curves spacetime; increasing the concentration of holes leads to length contraction and time dilation, because the holes do not have the properties of extension and duration. In the limiting case where space consists of holes only, the distance between any two points is equal to zero and time stops: outside of the Universe, the properties of extension and duration do not exist. For this reason, the vacuum hole is the only particle in physics capable of describing gravitation by its own properties alone. All microscopic particles must 'jump' continually and 'vibrate' due to the appearance of holes (impassable microscopic 'walls' in space), and this is the cause of quantum behavior. Vacuum holes can explain entanglement, non-locality, the wave properties of matter, tunneling, the uncertainty principle, and so on. Particles do not have trajectories because spacetime is discontinuous and contains impassable microscopic 'walls': simple mechanical motion is impossible at small scales, and it is impossible to 'trace' a straight line in a discontinuous spacetime because it contains impassable holes. Spacetime 'boils' continually due to the appearance of vacuum holes. For teleportation to be possible, we must send a body outside of the Universe by enveloping it with a closed surface consisting of vacuum holes. Since a material body cannot exist outside of the Universe, it reappears instantaneously at a random point of the Universe. Because the body disappears in one volume and reappears in another random volume without traversing the physical space between them, this transportation method can be called teleportation (or Hole Teleportation).
It is shown that Hole Teleportation does not violate causality or special relativity, owing to its random nature and other properties. Although Hole Teleportation is random, it could be used for the colonization of extrasolar planets by a method of 'random jumps': after a large number of random teleportation jumps, there is a probability that a spaceship may appear near a habitable planet. We can create vacuum holes experimentally using the method proposed by Descartes: remove a body from a vessel without permitting another body to occupy its volume.

Keywords: border of the Universe, causality violation, perfect isolation, quantum jumps

Procedia PDF Downloads 418
5277 Development of Geo-computational Model for Analysis of Lassa Fever Dynamics and Lassa Fever Outbreak Prediction

Authors: Adekunle Taiwo Adenike, I. K. Ogundoyin

Abstract:

Lassa fever, a neglected tropical viral disease, has become a significant public health issue in Nigeria, the country with the greatest burden in Africa. This paper presents a geo-computational model for the analysis and prediction of Lassa fever dynamics and outbreaks in Nigeria. The model investigates the dynamics of the virus with respect to environmental factors and human populations, confirms the role of the rodent host in virus transmission, and identifies how transmission is affected by climate and human population. The proposed methodology is implemented on a Linux operating system using the OSGeoLive virtual machine for geographical computing, which serves as a base for spatial ecology computing. The model is designed with the Unified Modeling Language (UML), and performance is evaluated with machine learning algorithms such as random forest, fuzzy logic, and neural networks. The study aims to contribute to the control of Lassa fever, which is achievable through the combined efforts of public health professionals and geo-computational and machine learning tools. The research findings will potentially be more readily accepted and utilized by decision-makers in the pursuit of Lassa fever elimination.

Keywords: geo-computational model, lassa fever dynamics, lassa fever, outbreak prediction, nigeria

Procedia PDF Downloads 86
5276 Examining the Role of Tree Species in Absorption of Heavy Metals; Case Study: Abidar Forest Park

Authors: Jahede Tekeykhah, Seyed Mohsen Hossini, Gholamali Jalali

Abstract:

Industrial and traffic activities cause large amounts of heavy metals to enter the atmosphere, and plant species can be effective in assessing and reducing metal air pollution. This study investigated the level of heavy metal adsorption in the leaves of Fraxinus rotundifolia, Robinia, Platanus orientalis, Platycladus orientalis, and Pinus eldarica trees in Abidar forest park. Leaf samples of the trees were collected in mid-August from the contaminated and control areas, at 3 stations per area with 3 replicates, and the resulting 90 samples were sent to the laboratory, where the concentrations of heavy metals were measured by graphite furnace. A factorial experiment based on a completely randomized design was used, with two factors: location at two levels (contaminated area and control area) and species at five levels (Fraxinus rotundifolia, Robinia, Platanus orientalis, Platycladus orientalis, and Pinus eldarica), with three replications. The collected data were analyzed with SPSS, and Duncan's multiple range test was used to compare the means. The results showed that the accumulation of all metals in the leaves of most species in the contaminated area was significantly higher (at the 95% level) than in the control area. In the contaminated area, with significant differences at the 5% level, the highest accumulations of metals were observed as follows: lead, cadmium, zinc, and manganese in Platanus orientalis; nickel in Fraxinus rotundifolia; and copper in Platycladus orientalis.

Keywords: airborne, tree species, heavy metals, absorption, Abidar Forest Park

Procedia PDF Downloads 297
5275 A Fast Parallel and Distributed Type-2 Fuzzy Algorithm Based on Cooperative Mobile Agents Model for High Performance Image Processing

Authors: Fatéma Zahra Benchara, Mohamed Youssfi, Omar Bouattane, Hassan Ouajji, Mohamed Ouadi Bensalah

Abstract:

The aim of this paper is to present a distributed implementation of the type-2 fuzzy algorithm in a parallel and distributed computing environment based on mobile agents. The proposed algorithm is implemented on an SPMD (Single Program, Multiple Data) architecture based on cooperative mobile agents, using the AVPE (Agent Virtual Processing Element) model, in order to improve the processing resources needed for big data image segmentation. In this work, we focus on applying the algorithm to a big data MRI (magnetic resonance imaging) image of size n x m. The image is encapsulated in the mobile agent team leader so that it can be split into (m x n) pixels, one per AVPE. Each AVPE performs and exchanges its segmentation results and maintains asynchronous communication with the team leader until the algorithm converges. Interesting experimental results are obtained in the accuracy and efficiency analysis of the proposed implementation, thanks to the several interesting skills of mobile agents introduced in this distributed computational model.

Keywords: distributed type-2 fuzzy algorithm, image processing, mobile agents, parallel and distributed computing

Procedia PDF Downloads 419
5274 Effect of Human Use, Season and Habitat on Ungulate Densities in Kanha Tiger Reserve

Authors: Neha Awasthi, Ujjwal Kumar

Abstract:

Density of large carnivores is primarily dictated by the density of their prey. Therefore, optimal management of ungulates populations permits harbouring of viable large carnivore populations within protected areas. Ungulate density is likely to respond to regimes of protection and vegetation types. This has generated the need among conservation practitioners to obtain strata specific seasonal species densities for habitat management. Kanha Tiger Reserve (KTR) of 2074 km2 area comprises of two distinct management strata: The core (940 km2), devoid of human settlements and buffer (1134 km2) which is a multiple use area. In general, four habitat strata, grassland, sal forest, bamboo-mixed forest and miscellaneous forest are present in the reserve. Stratified sampling approach was used to access a) impact of human use and b) effect of habitat and season on ungulate densities. Since 2013 to 2016, ungulates were surveyed in winter and summer of each year with an effort of 1200 km walk in 200 spatial transects distributed throughout Kanha Tiger Reserve. We used a single detection function for each species within each habitat stratum for each season for estimating species specific seasonal density, using program DISTANCE. Our key results state that the core area had 4.8 times higher wild ungulate biomass compared with the buffer zone, highlighting the importance of undisturbed area. Chital was found to be most abundant, having a density of 30.1(SE 4.34)/km2 and contributing 33% of the biomass with a habitat preference for grassland. Unlike other ungulates, Gaur being mega herbivore, showed a major seasonal shift in density from bamboo-mixed and sal forest in summer to miscellaneous forest in winter. Maximum diversity and ungulate biomass were supported by grassland followed by bamboo-mixed habitat. Our study stresses the importance of inviolate core areas for achieving high wild ungulate densities and for maintaining populations of endangered and rare species. 
Grasslands account for 9% of the core area of KTR and are maintained in an arrested stage of succession; enhancing this habitat would therefore maintain ungulate diversity and density and cater to the needs of the only surviving population of the endangered barasingha and of the grassland specialist, the blackbuck. We show the relevance of different habitat types for differential seasonal use by ungulates and attempt to interpret this in the context of the nutrition and cover needs of wild ungulates. Management for an optimal habitat mosaic that maintains ungulate diversity and maximizes ungulate biomass is recommended.
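The core estimator behind the line-transect densities computed in program DISTANCE can be sketched as follows. This is a minimal illustration with a half-normal detection function; the function names and the example values of sigma and the truncation distance are hypothetical, not the study's fitted parameters.

```python
import math

def half_normal_g(x, sigma):
    # Half-normal detection function: probability of detecting an
    # animal at perpendicular distance x from the transect line.
    return math.exp(-x**2 / (2 * sigma**2))

def effective_strip_width(sigma, w, steps=10000):
    # ESW = integral of g(x) from 0 to the truncation distance w,
    # evaluated here with the trapezoidal rule.
    dx = w / steps
    return sum(0.5 * (half_normal_g(i * dx, sigma)
                      + half_normal_g((i + 1) * dx, sigma)) * dx
               for i in range(steps))

def line_transect_density(n, total_length_km, esw_km):
    # Conventional line-transect estimator: D = n / (2 * L * ESW).
    return n / (2.0 * total_length_km * esw_km)
```

With, say, sigma = 0.05 km and truncation at 0.2 km, the ESW approaches the analytic value sigma * sqrt(pi/2), and the density follows from the number of detections and the total transect length walked.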

Keywords: distance sampling, habitat management, ungulate biomass, diversity

Procedia PDF Downloads 299
5273 RAPD Analysis of the Genetic Polymorphism in the Collection of Rye Cultivars

Authors: L. Petrovičová, Ž. Balážová, Z. Gálová, M. Wójcik-Jagła, M. Rapacz

Abstract:

In the present study, RAPD-PCR was used to assess the genetic diversity of rye, including landraces and new rye cultivars from Central Europe and the Union of Soviet Socialist Republics (SUN). Five arbitrary random primers were used to determine RAPD polymorphism in a set of 38 rye genotypes. These primers amplified altogether 43 different DNA fragments, with an average of 8.6 fragments per genotype. The number of fragments ranged from 7 (RLZ 8, RLZ 9 and RLZ 10) to 12 (RLZ 6). DI and PIC values of all RAPD markers were higher than 0.8, which generally indicates a high level of polymorphism among the rye genotypes. A dendrogram based on hierarchical cluster analysis with the UPGMA algorithm was prepared; the cultivars were grouped into two main clusters. In this experiment, RAPD proved to be a rapid, reliable, and practicable method for revealing polymorphism in rye cultivars.
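The genetic-distance step behind such a dendrogram can be sketched as follows. Each RAPD lane is scored as a binary band-presence profile and pairwise distances are computed, here with the Jaccard coefficient; the profiles shown are hypothetical, not the study's data.

```python
from itertools import combinations

def jaccard_distance(a, b):
    # Jaccard distance between two binary RAPD band-presence profiles:
    # 1 - (shared bands / bands present in either genotype).
    both = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    either = sum(1 for x, y in zip(a, b) if x == 1 or y == 1)
    return 1.0 - (both / either) if either else 0.0

def distance_matrix(profiles):
    # Symmetric pairwise distance matrix over all genotype profiles.
    n = len(profiles)
    d = [[0.0] * n for _ in range(n)]
    for i, j in combinations(range(n), 2):
        d[i][j] = d[j][i] = jaccard_distance(profiles[i], profiles[j])
    return d
```

The resulting matrix can then be clustered with average linkage (UPGMA), for example via `scipy.cluster.hierarchy.linkage` with `method='average'`, to produce the dendrogram.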

Keywords: genetic diversity, polymorphism, RAPD markers, Secale cereale L.

Procedia PDF Downloads 437
5272 Novel Algorithm for Restoration of Retina Images

Authors: P. Subbuthai, S. Muruganand

Abstract:

Diabetic retinopathy is a complicated disease caused by changes in the blood vessels of the retina. Retina images extracted through a fundus camera sometimes have poor contrast and noise, which makes detection of the blood vessels very complicated, so preprocessing is needed. In this paper, a novel algorithm is implemented to remove noisy pixels from the retina image. The proposed algorithm, the Extended Median Filter, is applied to the green channel of the retina image because green-channel vessels are brighter than the background. The proposed Extended Median Filter is compared with the existing standard median filter using performance metrics such as PSNR, MSE, and RMSE. Experimental results show that the proposed Extended Median Filter gives better results than the existing standard median filter in terms of noise suppression and detail preservation.
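The exact windowing of the Extended Median Filter is specific to the paper, but the baseline it is compared against, a standard 3x3 median filter on the green channel, and the PSNR metric can be sketched as follows (a minimal pure-Python illustration; borders are left unfiltered here for simplicity):

```python
import math

def median_filter_3x3(img):
    # img: 2D list of pixel intensities (e.g. the green channel).
    # Each interior pixel is replaced by the median of its 3x3 window.
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]  # median of 9 values
    return out

def psnr(ref, noisy, peak=255.0):
    # Peak signal-to-noise ratio in dB between two images.
    h, w = len(ref), len(ref[0])
    mse = sum((ref[y][x] - noisy[y][x]) ** 2
              for y in range(h) for x in range(w)) / (h * w)
    return float('inf') if mse == 0 else 10 * math.log10(peak**2 / mse)
```

A salt-and-pepper impulse (e.g. an isolated 255 in a patch of 10s) is replaced by the neighbourhood median, which is why the median family of filters preserves vessel edges better than linear smoothing.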

Keywords: fundus retina image, diabetic retinopathy, median filter, microaneurysms, exudates

Procedia PDF Downloads 334
5271 A Research and Application of Feature Selection Based on IWO and Tabu Search

Authors: Laicheng Cao, Xiangqian Su, Youxiao Wu

Abstract:

Feature selection is one of the important problems in network security, pattern recognition, data mining, and other fields. In order to remove redundant features and effectively improve the detection speed of intrusion detection systems, this paper proposes a new feature selection method based on the invasive weed optimization (IWO) algorithm and the tabu search (TS) algorithm. IWO is used for global search and tabu search for local search, improving on the results of IWO alone. The experimental results show that the feature selection method can effectively remove redundant features from network data, reduce processing time, and, while guaranteeing an accurate detection rate, effectively improve the speed of the detection system.
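The tabu-search half of the hybrid can be sketched as follows, using a single-bit-flip neighbourhood over a binary feature mask. The scoring function, tabu list length, and iteration budget are illustrative placeholders, not the paper's settings; in the paper the starting point would come from the IWO global search rather than a random mask.

```python
import random

def tabu_search(score, n_features, iters=50, tabu_len=3, seed=0):
    # Local search over binary feature masks: flip one bit per move and
    # keep recently flipped indices in a tabu list to escape cycles.
    rng = random.Random(seed)
    current = [rng.randint(0, 1) for _ in range(n_features)]
    best = current[:]
    tabu = []
    for _ in range(iters):
        candidates = []
        for i in range(n_features):
            if i in tabu:
                continue
            neighbour = current[:]
            neighbour[i] ^= 1
            candidates.append((score(neighbour), i, neighbour))
        if not candidates:       # everything tabu: free the oldest move
            tabu.pop(0)
            continue
        _, i, current = max(candidates)   # best non-tabu single flip
        tabu.append(i)
        if len(tabu) > tabu_len:
            tabu.pop(0)
        if score(current) > score(best):
            best = current[:]
    return best
```

Here `score` would wrap the detection-accuracy/feature-count objective of the intrusion detection system; the tabu list lets the search accept temporarily worse masks instead of cycling around a local optimum.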

Keywords: intrusion detection, feature selection, IWO, tabu search

Procedia PDF Downloads 520
5270 Evolving Convolutional Filter Using Genetic Algorithm for Image Classification

Authors: Rujia Chen, Ajit Narayanan

Abstract:

Convolutional neural networks (CNN), as typically applied in deep learning, use layer-wise backpropagation (BP) to construct filters and kernels for feature extraction. Such filters are 2D or 3D groups of weights for constructing feature maps at subsequent layers of the CNN and are shared across the entire input. BP, as a gradient descent algorithm, has well-known problems of getting stuck at local optima. The use of genetic algorithms (GAs) for evolving weights between layers of standard artificial neural networks (ANNs) is a well-established area of neuroevolution. In particular, the use of crossover techniques when optimizing weights can help to overcome problems of local optima. However, the application of GAs for evolving the weights of filters and kernels in CNNs is not yet an established area of neuroevolution. In this paper, a GA-based filter development algorithm is proposed. The results of the proof-of-concept experiments described in this paper show that the proposed GA can find filter weights through evolutionary techniques rather than BP learning. For some simple classification tasks, like geometric shape recognition, the proposed algorithm can achieve 100% accuracy. The results for MNIST classification, while not as good as those obtained through standard BP filter learning, show that filter and kernel evolution warrants further investigation as a new subarea of neuroevolution for deep architectures.
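A minimal sketch of the idea, evolving a 3x3 filter's nine weights with selection, crossover, and mutation instead of backpropagation, might look like this. The toy vertical-vs-horizontal edge patches and all GA hyperparameters below are illustrative, not the paper's setup.

```python
import random

# Toy training patches (flattened 3x3): the evolved filter should
# respond strongly to a vertical edge and weakly to a horizontal one.
VERTICAL   = [0, 0, 1, 0, 0, 1, 0, 0, 1]
HORIZONTAL = [0, 0, 0, 0, 0, 0, 1, 1, 1]

def fitness(genome):
    # Filter response = dot product of the weights with each patch.
    v = sum(w * p for w, p in zip(genome, VERTICAL))
    h = sum(w * p for w, p in zip(genome, HORIZONTAL))
    return abs(v) - abs(h)

def crossover(p1, p2, rng):
    cut = rng.randint(1, len(p1) - 1)   # one-point crossover
    return p1[:cut] + p2[cut:]

def mutate(genome, rng, rate=0.2, scale=0.5):
    # Gaussian perturbation of each weight with probability `rate`.
    return [w + rng.gauss(0, scale) if rng.random() < rate else w
            for w in genome]

def evolve(pop_size=30, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 2]     # truncation selection (elitist)
        children = [mutate(crossover(*rng.sample(elite, 2), rng), rng)
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)
```

Because the elite half is carried over unchanged, the best fitness is monotonically non-decreasing, which is the property that lets evolution substitute for gradient descent here.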

Keywords: neuroevolution, convolutional neural network, genetic algorithm, filters, kernels

Procedia PDF Downloads 180
5269 A Preliminary Survey on Butterfly Fauna at Rajagala Archaeological Site, Ampara, Sri Lanka

Authors: D. Eranda N. Mandawala, P. A. D. Mokshi V. Perera

Abstract:

The Rajagala Archaeological Site (RAS) is located 26 km from Ampara town (7°29'25.22" N, 81°36'59.05" E), accessible through the Ampara-Uhana-Maha Oya highway in the Eastern province of Sri Lanka. This site has recently been added to the tentative list of UNESCO World Heritage Sites and is also a forest reserve. This dry-zone forest consists of tropical mixed evergreen vegetation and scrublands on a rocky outcrop at an elevation of about 350 meters above mean sea level. It is also scattered with several ponds of differing sizes on rocky outcrops, rocky cliffs, and about 50 cave dwellings. No comprehensive biodiversity survey of any sort has been conducted at the RAS so far. Therefore, a preliminary survey was conducted to determine its butterfly fauna diversity. An opportunistic visual encounter survey method was used to observe butterfly species in the morning between 8:00 am-12:00 noon and in the evening between 2:00-6:00 pm on 3 site visits in October 2017, February 2018, and November 2019. All encountered species were photographed using a Nikon D750 camera with a Sigma 105mm f/2.8 EX DG OS HSM macro lens, and field guide books were used to identify them. Sri Lanka is home to 248 species of butterflies, of which 26 are endemic. At RAS, we observed a total of 39 species (15%) of butterflies belonging to 5 Lepidoptera families. Out of these, one endemic species (4%) and 9 endemic subspecies were also identified. The former was Troides darsius, also known as the Sri Lanka birdwing, which is the national butterfly and the largest butterfly in Sri Lanka; the latter were the Plains Cupid (Chilades pandava lanka), Yamfly (Loxura atymnus arcuata), Common Cerulean (Jamides celeno tissama), Tawny Rajah (Charaxes psaphon psaphon), Tamil Yeoman (Cirrochroa thais lanka), Angled Castor (Ariadne ariadne minorata), Gladeye Bushbrown (Mycalesis patnia patnia), Common Crow (Euploea core asela), and Blue Mormon (Papilio polymnestor parinda).
The endemic subspecies belonged to 3 Lepidoptera families (3 from Lycaenidae, 5 from Nymphalidae, and 1 from Papilionidae). Anthropogenic activities such as unauthorized cattle farming, forest clearance, and man-made forest fires currently threaten this site. If such trends continue, they may lead to a reduction of butterfly fauna diversity within this area in the future.

Keywords: Lepidoptera, Rajagala, Sri Lanka birdwing, endemic

Procedia PDF Downloads 153
5268 Artificial Intelligence Approach to Manage Human Resources Information System Process in the Construction Industry

Authors: Ahmed Emad Ahmed

Abstract:

This paper aims to address the concept of human resources information systems (HRIS) and how to link them to new technologies such as artificial intelligence (AI), to be implemented in two human resources processes. A literature review has been conducted to cover the main points related to HRIS, AI, and BC. A case study has been presented by generating a random HRIS database and applying some AI operations to it. Then, an algorithm was applied to the database to complete some human resources processes, including training and performance appraisal, using a pre-trained AI model. After that, the outputs and results are presented and discussed briefly. Finally, a conclusion is introduced to show the ability of new technologies such as AI and ML to be applied to human resources management processes.

Keywords: human resources new technologies, HR artificial intelligence, HRIS AI models, construction AI HRIS

Procedia PDF Downloads 163
5267 An Accurate Method for Phylogeny Tree Reconstruction Based on a Modified Wild Dog Algorithm

Authors: Essam Al Daoud

Abstract:

This study solves a phylogeny problem by using modified wild dog pack optimization. The least-squares error is taken as the cost function to be minimized. Therefore, in each iteration, new distance matrices based on the constructed trees are calculated and used to select the alpha dog. To test the suggested algorithm, ten homologous genes (16S, 18S, 28S, Cox 1, ITS1, ITS2, ETS, ATPB, Hsp90, and STN) are selected and collected from the National Center for Biotechnology Information (NCBI) databanks. The data are divided into three categories: 50 taxa, 100 taxa, and 500 taxa. The empirical results show that the proposed algorithm is more reliable and accurate than the other implemented methods.
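The least-squares cost evaluated at each iteration can be sketched as follows: compute the leaf-to-leaf path distances implied by a candidate tree and compare them against the observed distance matrix. The edge-list tree encoding and the 4-leaf example are hypothetical, not the paper's representation.

```python
from collections import deque

def tree_distances(n_leaves, edges):
    # edges: (u, v, branch_length); nodes 0..n_leaves-1 are leaves,
    # higher ids are internal nodes. BFS accumulates path lengths.
    adj = {}
    for u, v, w in edges:
        adj.setdefault(u, []).append((v, w))
        adj.setdefault(v, []).append((u, w))
    dist = [[0.0] * n_leaves for _ in range(n_leaves)]
    for s in range(n_leaves):
        seen = {s: 0.0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v, w in adj[u]:
                if v not in seen:
                    seen[v] = seen[u] + w
                    q.append(v)
        for t in range(n_leaves):
            dist[s][t] = seen[t]
    return dist

def least_squares_error(observed, tree_dist):
    # Cost minimized by the search: sum over leaf pairs of the squared
    # difference between observed and tree-implied distances.
    n = len(observed)
    return sum((observed[i][j] - tree_dist[i][j]) ** 2
               for i in range(n) for j in range(i + 1, n))
```

In the pack-optimization loop, the candidate tree with the smallest cost would be selected as the alpha dog for the next iteration.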

Keywords: least square, neighbor joining, phylogenetic tree, wild dog pack

Procedia PDF Downloads 315
5266 Identification of Soft Faults in Branched Wire Networks by Distributed Reflectometry and Multi-Objective Genetic Algorithm

Authors: Soumaya Sallem, Marc Olivas

Abstract:

This contribution presents a method for detecting, locating, and characterizing soft faults in a complex wired network. The proposed method is based on multi-carrier reflectometry MCTDR (Multi-Carrier Time Domain Reflectometry) combined with a multi-objective genetic algorithm. In order to ensure complete network coverage and eliminate diagnosis ambiguities, the MCTDR test signal is injected at several points on the network, and the data are merged between the different reflectometers (sensors) distributed on the network. An adapted multi-objective genetic algorithm is used to merge the data in order to obtain more accurate fault location and characterization. The performance of the proposed method is evaluated on numerical and experimental results.

Keywords: wired network, reflectometry, network distributed diagnosis, multi-objective genetic algorithm

Procedia PDF Downloads 187
5265 Estimating Air Particulate Matter 10 Using Satellite Data and Analyzing Its Annual Temporal Pattern over Gaza Strip, Palestine

Authors: Abdallah A. A. Shaheen

Abstract:

Gaza Strip faces economic and political issues such as conflict, siege, and urbanization; all of these have led to an increase in air pollution over Gaza Strip. In this study, particulate matter 10 (PM10) concentration over Gaza Strip has been estimated from Landsat Thematic Mapper (TM) and Landsat Enhanced Thematic Mapper Plus (ETM+) data, based on a multispectral algorithm. Simultaneously, in-situ measurements of the corresponding particulate were acquired for the selected time period. Landsat and ground data for eleven years were used to develop the algorithm, while four years of data (2002, 2006, 2010, and 2014) were used to validate its results. The developed algorithm gives a high regression coefficient (R = 0.86), an RMSE of 9.71 µg/m³, and a p-value of 0. Validation shows that calculated PM10 correlates strongly with measured PM10, indicating the high efficiency of the algorithm for mapping PM10 concentration during the years 2000 to 2014. Overall results show an increase in the minimum, maximum, and average yearly PM10 concentrations, with a similar trend over urban areas. The rate of urbanization has been evaluated by supervised classification of the Landsat images. Urban sprawl from 2000 to 2014 resulted in a high concentration of PM10 in the study area.
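The regression and error statistics reported above (R, RMSE) can be reproduced for a simple one-predictor model as follows. This is a minimal least-squares sketch on hypothetical reflectance/PM10 pairs; the paper's actual algorithm is multispectral, i.e. it regresses on several Landsat bands at once.

```python
import math

def fit_linear(x, y):
    # Ordinary least squares for y = a + b*x (single predictor).
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx
    return my - b * mx, b

def r_and_rmse(x, y, a, b):
    # Correlation coefficient R and root-mean-square error of the fit.
    n = len(y)
    pred = [a + b * xi for xi in x]
    my = sum(y) / n
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return math.sqrt(max(0.0, 1 - ss_res / ss_tot)), math.sqrt(ss_res / n)
```

Fitting atmospheric reflectance against ground-station PM10 in this way, and then applying the fitted coefficients pixel by pixel, yields the PM10 concentration maps whose agreement is summarized by R and RMSE.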

Keywords: PM10, Landsat, atmospheric reflectance, Gaza Strip, urbanization

Procedia PDF Downloads 244