Search results for: efficient crow search algorithm
8590 Use of Indigenous Knowledge System (IKS) by Farmers for Selected Arable Crops Production in Ondo State
Authors: A. M. Omoare, E. O. Fakoya
Abstract:
This study sought to determine the use of indigenous knowledge for selected arable crops production in Ondo State. A multistage sampling method was used and 112 arable crop farmers were systematically selected. Data were analyzed using both descriptive and inferential statistics. The results showed that the majority of the sampled farmers were male (75.90%). About 75% were married with children. A large proportion of them (62.61%) were within the ages of 30-49 years. Most of them had spent about 10 years in farming (58.92%). The highest raw scores of use of indigenous knowledge were found in planting on mounds in yam production, use of native medicine and the scarecrow method in controlling birds in rice production, timely planting of locally developed resistant varieties in cassava production and soaking of maize seeds in water to determine their viability, with raw scores of 313, 310, 305, 303, and 300 respectively, while the lowest raw score was obtained in use of the bell method in controlling birds in rice production, with a raw score of 210. The findings established that proverbs (59.8%) and taboos (55.36%) were the most commonly used media in transmitting indigenous knowledge by arable crop farmers. The multiple regression analysis revealed that the age of the farmers and farming experience had a significant relationship with the use of indigenous knowledge, giving R2 = 0.83 for the semi-log functional form of the equation, which is the lead equation. The policy implication is that indigenous knowledge should provide a basis for designing modern technologies to enhance sustainable agricultural development.
Keywords: crop production, extent of use, indigenous knowledge, arable crops
Procedia PDF Downloads 659
8589 Genetic Algorithm and Multi-Parametric Programming Based Cascade Control System for Unmanned Aerial Vehicles
Authors: Dao Phuong Nam, Do Trong Tan, Pham Tam Thanh, Le Duy Tung, Tran Hoang Anh
Abstract:
This paper considers the problem of a cascade control system for unmanned aerial vehicles (UAVs). Due to the complicated modelling of UAVs, it is necessary to separate the model into two subsystems. The proposed cascade control structure is a hierarchical scheme including a robust controller for the inner subsystem based on H-infinity theory with a trajectory generator using a genetic algorithm (GA), and an outer-loop control law based on the multi-parametric programming (MPP) technique to overcome the disadvantage of a large amount of calculation. Simulation results are presented to show that the equivalent path has been found and obtained by the proposed cascade control scheme.
Keywords: genetic algorithm, GA, H infinity, multi-parametric programming, MPP, unmanned aerial vehicles, UAVs
Procedia PDF Downloads 212
8588 Application of Modified Vermiculite for Cationic Textile Dyestuffs Removal: Sorption and Regeneration Studies
Authors: W. Stawiński, A. Wegrzyn, O. M. Freitas, S. A. Figueiredo
Abstract:
Water is a life-supporting resource, crucial for humanity and essential for natural ecosystems, which have been endangered by developing industry and an increasing human population. Dyes are common in effluents discharged by various industries such as paper, plastics, food, cosmetics, and textiles. They produce toxic effects on animals and disturb natural biological processes in receiving waters. Having complex molecular structures and resistance to biological decomposition, they are problematic and difficult to treat by conventional methods. In the search for an efficient and sustainable method, sorption has been attracting more interest for application to wastewater treatment. Clays are minerals that have a layer structure based on phyllosilicate sheets that may carry a charge, which is balanced by ions located between the sheets. These charge-balancing ions can be exchanged, resulting in very good ion-exchange properties of the material. Modifications of clays enhance their properties, producing a good and inexpensive sorbent for the removal of pollutants from wastewaters. The presented work proves that the treatment of a clay, vermiculite, with nitric acid followed by washing in citric acid strongly increases the sorption of two cationic dyes, methylene blue (C.I. 52015) and astrazon red (C.I. 110825). Desorption studies showed that the best eluent for regeneration is a solution of NaCl in ethanol. Cycles of sorption and desorption in a column system showed no significant deterioration of sorption capacity and proved that the material performs very well as a sorbent, which can be recycled and reused. The results obtained open new possibilities for further modifications of vermiculite and modifications of other materials in order to obtain very efficient sorbents useful for wastewater treatment.
Keywords: cationic dyestuffs, sorption and regeneration, vermiculite, wastewater treatment
Procedia PDF Downloads 262
8587 Adaptive Online Object Tracking via Positive and Negative Models Matching
Authors: Shaomei Li, Yawen Wang, Chao Gao
Abstract:
To address the tracking drift that often occurs in adaptive tracking, an algorithm based on the fusion of tracking and detection is proposed in this paper. Firstly, object tracking is posed as a binary classification problem and is modeled by partial least squares (PLS) analysis. Secondly, the object is tracked frame by frame via particle filtering. Thirdly, tracking reliability is validated by matching against both positive and negative models. Finally, the object is relocated based on SIFT feature matching and voting when drift occurs, and the object appearance model is updated at the same time. The algorithm can not only sense tracking drift but also relocate the object whenever needed. Experimental results demonstrate that this algorithm outperforms state-of-the-art algorithms on many challenging sequences.
Keywords: object tracking, tracking drift, partial least squares analysis, positive and negative models matching
Procedia PDF Downloads 529
8586 iCount: An Automated Swine Detection and Production Monitoring System Based on Sobel Filter and Ellipse Fitting Model
Authors: Jocelyn B. Barbosa, Angeli L. Magbaril, Mariel T. Sabanal, John Paul T. Galario, Mikka P. Baldovino
Abstract:
The use of technology has become ubiquitous in different areas of business today. With the advent of digital imaging and database technology, business owners have been motivated to integrate technology into their business operations, from small and medium to large enterprises. Technology has been found to bring many benefits that can make a business grow. Hog or swine raising, for example, is a very popular enterprise in the Philippines, whose challenges in production monitoring can be addressed through technology integration. Swine production monitoring can become a tedious task as the enterprise grows larger. Specifically, problems like delayed and inconsistent reports are likely to happen if counting the swine in each pen of each building is done manually. In this study, we present iCount, which aims to ensure efficient swine detection and counting to hasten the swine production monitoring task. We develop a system that automatically detects and counts swine based on a Sobel filter and an ellipse fitting model, given still photos of the group of swine captured in a pen. We improve the Sobel filter detection result through an 8-neighborhood rule implementation. An ellipse fitting technique is then employed for proper swine detection. Furthermore, the system can generate periodic production reports and can identify the specific consumables to be served to the swine according to schedules. Experiments reveal that our algorithm provides an efficient way of detecting swine, thereby providing a significant amount of accuracy in production monitoring.
Keywords: automatic swine counting, swine detection, swine production monitoring, ellipse fitting model, Sobel filter
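As an illustration only (not the authors' implementation), the following minimal Python/OpenCV sketch shows the kind of Sobel-plus-ellipse-fitting pipeline the abstract describes; the area threshold, aspect-ratio gate and morphological cleanup are illustrative assumptions.

```python
import cv2
import numpy as np

def count_swine(image_path, min_area=500):
    """Rough sketch: Sobel edges -> contours -> ellipse fitting -> count."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

    # Sobel gradients in x and y, combined into an edge magnitude map
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    mag = cv2.convertScaleAbs(np.sqrt(gx ** 2 + gy ** 2))

    # Binarize and close small gaps (a stand-in for the paper's 8-neighborhood cleanup)
    _, edges = cv2.threshold(mag, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    edges = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, np.ones((3, 3), np.uint8))

    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    count = 0
    for c in contours:
        if len(c) >= 5 and cv2.contourArea(c) > min_area:
            (cx, cy), axes, angle = cv2.fitEllipse(c)
            major, minor = max(axes), min(axes)
            # accept blobs whose fitted ellipse is roughly pig-shaped (elongated)
            if minor > 0 and 1.2 < major / minor < 4.0:
                count += 1
    return count
```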
Procedia PDF Downloads 311
8585 Islamophobia, Years After 9/11: An Assessment of the American Media
Authors: Nasa'i Muhammad Gwadabe
Abstract:
This study seeks to find the extent to which the old Islamophobic prejudice tilted in a more negative direction in the United States following the 9/11 terrorist attacks. It is hypothesized that the 9/11 attacks in the United States reshaped the old Islamophobic prejudice through the reinforcement of a strong social identity construction of Muslims as an "out-group". The "social identity" and "discourse representation" theories are used as the framework for analysis. To test the hypothesis, two categories were created: the prejudice (out-group) and the tolerance (in-group) categories. The prejudice (out-group) against Muslims category was coded to include six attributes (terrorist, threat, women's rights violation, undemocratic, backward and intolerant), while the tolerance (in-group) for Muslims category was also coded to include six attributes (peaceful, civilized, educated, partners, trustworthy and honest). Data are generated from the archives of three American newspapers: The Los Angeles Times, The New York Times and USA Today, using specific search terms and a specific date range, from 9/11/1996 to 9/11/2006, that is, five years before and five years after 9/11. An aggregate of 20,595 articles was generated from the search of the three newspapers throughout the search periods. For both the pre- and post-9/11 periods, the articles generated under the category of prejudice (out-group) against Muslims showed a higher frequency than those under tolerance (in-group) for them. Finally, the comparison between the pre- and post-9/11 periods showed that the increased prejudice (out-group) against Muslims was most influenced by labeling them as terrorists, which showed a dramatic increase from pre- to post-9/11.
Keywords: in-group, Islam, Islamophobia, Muslims, out-group, prejudice, terrorism, 9/11, tolerance
Procedia PDF Downloads 305
8584 An Improved K-Means Algorithm for Gene Expression Data Clustering
Authors: Billel Kenidra, Mohamed Benmohammed
Abstract:
Data mining techniques used in the field of clustering are a subject of active research and assist in biological pattern recognition and the extraction of new knowledge from raw data. Clustering means the act of partitioning an unlabeled dataset into groups of similar objects. Each group, called a cluster, consists of objects that are similar among themselves and dissimilar to objects of other groups. Several clustering methods are based on partitional clustering. This category attempts to directly decompose the dataset into a set of disjoint clusters, leading to an integer number of clusters that optimizes a given criterion function. The criterion function may emphasize a local or a global structure of the data, and its optimization is an iterative relocation procedure. The K-Means algorithm is one of the most widely used partitional clustering techniques. Since K-Means is extremely sensitive to the initial choice of centers, and a poor choice of centers may lead to a local optimum that is quite inferior to the global optimum, we propose a strategy to initialize the K-Means centers. The improved K-Means algorithm is compared with the original K-Means, and the results show how significantly the efficiency has been improved.
Keywords: microarray data mining, biological pattern recognition, partitional clustering, k-means algorithm, centroid initialization
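The abstract does not spell out the proposed initialization strategy; as a hedged illustration, the sketch below seeds the centers with a k-means++-style farthest-point rule before running the standard Lloyd iterations.

```python
import numpy as np

def seed_centers(X, k, rng=np.random.default_rng(0)):
    """k-means++-style seeding: spread initial centers far apart
    (an illustrative stand-in for the paper's initialization strategy)."""
    centers = [X[rng.integers(len(X))]]
    for _ in range(k - 1):
        # squared distance of every point to its nearest chosen center
        d2 = np.min(((X[:, None, :] - np.array(centers)[None]) ** 2).sum(-1), axis=1)
        centers.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    return np.array(centers)

def kmeans(X, k, n_iter=100):
    centers = seed_centers(X, k)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):   # converged
            break
        centers = new_centers
    return labels, centers
```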
Procedia PDF Downloads 190
8583 The Algorithm of Semi-Automatic Thai Spoonerism Words for Bi-Syllable
Authors: Nutthapat Kaewrattanapat, Wannarat Bunchongkien
Abstract:
The purposes of this research are to study and develop an algorithm for Thai spoonerism words using a semi-automatic computer program; that is, in the data-input part the syllables are already separated, and in the spoonerism part the developed algorithm is utilized. The algorithm establishes rules and mechanisms for bi-syllable Thai spoonerism by analyzing the elements of the syllables, namely the consonant cluster, vowel, intonation mark and final consonant. From the study, it is found that bi-syllable Thai spoonerism has one spoonerism mechanism, namely transposition of the vowel, intonation mark and final consonant between the two syllables while keeping the initial consonant and consonant cluster (if any). The rules and mechanisms of Thai spoonerism were then applied to develop Thai spoonerism software using PHP. The software was subjected to a performance test of its execution; it was found that the program performs bi-syllable Thai spoonerism correctly for 99% of all words used in the test, with faults in 1% of the words, as the words obtained from spoonerism may not be spelled in conformity with Thai grammar and a Thai spoonerism can have more than one valid answer.
Keywords: algorithm, spoonerism, computational linguistics, Thai spoonerism
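A minimal sketch of the swap rule described in the abstract, assuming the syllables have already been decomposed into their elements during the semi-automatic input step; the dictionary representation is an assumption for illustration, and re-composing the result into Thai orthography is not shown.

```python
def spoonerize(syl1, syl2):
    """Bi-syllable Thai spoonerism sketch.

    Each syllable is assumed to be already decomposed into its elements
    (as in the paper's semi-automatic input step):
        {"initial": ..., "vowel": ..., "tone": ..., "final": ...}
    Rule from the abstract: swap the vowel, intonation (tone) mark and final
    consonant between the two syllables, keeping each initial consonant/cluster.
    """
    swapped1 = {"initial": syl1["initial"], "vowel": syl2["vowel"],
                "tone": syl2["tone"], "final": syl2["final"]}
    swapped2 = {"initial": syl2["initial"], "vowel": syl1["vowel"],
                "tone": syl1["tone"], "final": syl1["final"]}
    return swapped1, swapped2
```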
Procedia PDF Downloads 236
8582 Digital Platform for Psychological Assessment Supported by Sensors and Efficiency Algorithms
Authors: Francisco M. Silva
Abstract:
Technology is evolving, creating an impact on our everyday lives and on the telehealth industry. Telehealth encapsulates the provision of healthcare services and information via a technological approach. There are several benefits of using web-based methods to provide healthcare help. Nonetheless, few health and psychological help approaches combine this method with wearable sensors. This paper aims to create an online platform for users to receive self-care help and information using wearable sensors. In addition, researchers developing a similar project obtain a solid foundation as a reference. This study provides descriptions and analyses of the software and hardware architecture. It exhibits and explains a dynamic and efficient heart-rate algorithm that continuously calculates the desired sensor values, and presents diagrams that illustrate the website deployment process and the web server's means of handling the sensor data. The goal is to create a working project using Arduino-compatible hardware. Heart rate sensors send their data values to an online platform. A microcontroller board uses an algorithm to calculate the sensor heart rate values and outputs them to a web server. The platform visualizes the sensor's data, summarizes it in a report, and creates alerts for the user. Results showed a solid project structure and communication between the hardware and software. The web server displays the conveyed heart rate sensor data on the online platform, presenting observations and evaluations.
Keywords: Arduino, heart rate BPM, microcontroller board, telehealth, wearable sensors, web-based healthcare
Procedia PDF Downloads 126
8581 Efficient Synthesis of Calix[4]Pyrroles Catalyzed by Powerful and Magnetically Recoverable Fe3O4 Nanoparticles
Authors: Renu Gautam, S. M. S. Chauhan
Abstract:
Magnetic Fe3O4 nanoparticles have been used as an efficient and facile acid catalyst for the synthesis of calix[4]pyrroles in moderate to excellent yields by the one-pot condensation of different ketones with pyrrole. The catalyst was easily recovered using an external magnet and reused over several cycles without losing its catalytic activity.
Keywords: calix[4]pyrrole, magnetic, Fe3O4 nanoparticles, catalysis
Procedia PDF Downloads 437
8580 Vision-Based Hand Segmentation Techniques for Human-Computer Interaction
Abstract:
This work is part of a vision-based hand gesture recognition system for a natural human-computer interface. Hand tracking and segmentation are the primary steps for any hand gesture recognition system. The aim of this paper is to develop a robust and efficient hand segmentation algorithm to serve as input to another system that attempts to bring HCI performance close to human-human interaction, by modeling an intelligent sign language recognition system based on prediction in the context of a dialogue between the system (avatar) and the interlocutor. For the purpose of hand segmentation, an occlusion-handling approach has been proposed that gives superior results for detecting the hand in an image.
Keywords: HCI, sign language recognition, object tracking, hand segmentation
Procedia PDF Downloads 412
8579 Production Optimization under Geological Uncertainty Using Distance-Based Clustering
Authors: Byeongcheol Kang, Junyi Kim, Hyungsik Jung, Hyungjun Yang, Jaewoo An, Jonggeun Choe
Abstract:
It is important to figure out reservoir properties for better production management. Due to limited information, there are geological uncertainties in very heterogeneous or channel reservoirs. One of the solutions is to generate multiple equi-probable realizations using geostatistical methods. However, some models have wrong properties, which need to be excluded for simulation efficiency and reliability. We propose a novel model selection scheme, based on distance-based clustering, for reliable application of a production optimization algorithm. Distance is defined as a degree of dissimilarity between the data. We calculate the Hausdorff distance to classify the models based on their similarity. The Hausdorff distance is useful for shape matching of the reservoir models. We use multi-dimensional scaling (MDS) to describe the models in a two-dimensional space and group them by K-means clustering. Rather than simulating all models, we choose one representative model from each cluster and find the best model, which has production rates similar to the true values. From this process, we can select good reservoir models near the best model with high confidence. We make 100 channel reservoir models using single normal equation simulation (SNESIM). Since oil and gas prefer to flow through the sand facies, it is critical to characterize the pattern and connectivity of the channels in the reservoir. After calculating Hausdorff distances and projecting the models by MDS, we can see that the models assemble depending on their channel patterns. These channel distributions affect the operation controls of each production well, so the model selection scheme improves the management optimization process. We use one of the useful global search algorithms, particle swarm optimization (PSO), for our production optimization. PSO is good at finding the global optimum of an objective function, but it takes too much time due to its use of many particles and iterations. In addition, if we use multiple reservoir models, the simulation time for PSO will soar. By using the proposed method, we can select good and reliable models that already match the production data. Considering the geological uncertainty of the reservoir, we can obtain well-optimized production controls for maximum net present value. The proposed method shows a novel solution for selecting good cases among the various possibilities. The model selection scheme can be applied not only to production optimization but also to history matching or other ensemble-based methods for efficient simulations.
Keywords: distance-based clustering, geological uncertainty, particle swarm optimization (PSO), production optimization
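A compact sketch of the selection pipeline described above, assuming each realization is available as a binary channel-facies map; the Hausdorff distance, MDS projection and K-means grouping follow the abstract, while the array format and cluster count are illustrative.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

def select_representatives(models, n_clusters=10):
    """models: list of binary 2-D facies maps (1 = channel sand, assumed format).
    Returns one representative model index per cluster."""
    pts = [np.argwhere(m == 1) for m in models]        # channel cell coordinates
    n = len(models)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            # symmetric Hausdorff distance between channel point sets
            d = max(directed_hausdorff(pts[i], pts[j])[0],
                    directed_hausdorff(pts[j], pts[i])[0])
            D[i, j] = D[j, i] = d

    # project the models onto a 2-D space preserving the distance structure
    xy = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(xy)

    reps = []
    for c in range(n_clusters):
        idx = np.where(km.labels_ == c)[0]
        # representative = cluster member closest to its centroid
        reps.append(idx[np.argmin(np.linalg.norm(xy[idx] - km.cluster_centers_[c], axis=1))])
    return reps
```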
Procedia PDF Downloads 143
8578 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data
Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou
Abstract:
In this paper, we have analyzed maximum precipitation data during a particular period of time obtained from different stations in the Global Historical Climatology Network of the USA. One important point to mention is that some stations are shut down on certain days for one reason or another. Hence, the maximum values are recorded by excluding those readings. It is assumed that the number of stations that operate follows a zero-truncated Poisson random variable, and the daily precipitation follows a lognormal random variable. We call this model a compound truncated Poisson lognormal model. The proposed model has three unknown parameters, and it can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm. Approximate maximum likelihood estimators are also derived. The associated confidence intervals can also be obtained from the observed Fisher information matrix. Simulations have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When we analyze the precipitation data set using the proposed model, it is observed that the proposed model provides a better fit than some of the existing models.
Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution
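For illustration, the sketch below simulates the compound zero-truncated Poisson lognormal maximum and fits its three parameters. The paper uses an EM algorithm; this sketch instead maximizes the closed-form log-likelihood of the maximum numerically, which is a stand-in rather than the authors' procedure.

```python
import numpy as np
from scipy import stats, optimize

def rand_ctpl(lam, mu, sigma, size, rng=np.random.default_rng(1)):
    """Simulate Y = max of N lognormal(mu, sigma) draws, N ~ zero-truncated Poisson(lam)."""
    y = np.empty(size)
    for i in range(size):
        n = 0
        while n == 0:                      # rejecting zeros gives the zero-truncated Poisson
            n = rng.poisson(lam)
        y[i] = rng.lognormal(mu, sigma, n).max()
    return y

def neg_loglik(params, y):
    lam, mu, sigma = params
    if lam <= 0 or sigma <= 0:
        return np.inf
    z = (np.log(y) - mu) / sigma
    F = stats.norm.cdf(z)                                  # lognormal CDF at y
    logf = stats.norm.logpdf(z) - np.log(sigma * y)        # lognormal log-pdf at y
    # density of the maximum: lam * f(y) * exp(lam * F(y)) / (exp(lam) - 1)
    return -np.sum(np.log(lam) + logf + lam * F - np.log(np.expm1(lam)))

y = rand_ctpl(lam=5.0, mu=1.0, sigma=0.5, size=2000)
fit = optimize.minimize(neg_loglik, x0=[2.0, 0.0, 1.0], args=(y,), method="Nelder-Mead")
print(fit.x)   # estimates of (lambda, mu, sigma)
```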
Procedia PDF Downloads 108
8577 Comparison Between Genetic Algorithms and Particle Swarm Optimization Optimized Proportional Integral Derivative and PSS for Single Machine Infinite System
Authors: Benalia Nadia, Zerzouri Nora, Ben Si Ali Nadia
Abstract:
Among the many different modern heuristic optimization methods, genetic algorithms (GA) and the particle swarm optimization (PSO) technique have been attracting a lot of interest. The GA has gained popularity in academia and business mostly because of its simplicity, its ability to solve highly nonlinear mixed-integer optimization problems that are typical of complex engineering systems, and its intuitiveness. The mechanics of the PSO methodology, a relatively recent heuristic search tool, are modeled after the swarming or cooperative behavior of biological groups. It is suitable to compare the performance of the two techniques since they both aim to optimize a particular objective function but make use of distinct computing methods. In this article, PSO and GA optimization approaches are used for the parameter tuning of the power system stabilizer and the proportional-integral-derivative regulator. Load angle and rotor speed variations in the single-machine infinite-bus system are used to measure the performance of the suggested solution.
Keywords: SMIB, genetic algorithm, PSO, transient stability, power system stabilizer, PID
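A hedged sketch of PSO-based PID tuning of the kind compared in the paper; the second-order plant and ITAE cost below are placeholders for the single-machine infinite-bus model, and the gain bounds and PSO coefficients are illustrative assumptions.

```python
import numpy as np

def itae_cost(gains, t_end=5.0, dt=1e-3):
    """ITAE of the step response of a toy second-order plant under PID control
    (a placeholder for the paper's single-machine infinite-bus model)."""
    kp, ki, kd = gains
    x = v = integ = prev_x = 0.0
    cost = 0.0
    for k in range(int(t_end / dt)):
        t = k * dt
        e = 1.0 - x                              # unit step reference
        integ += e * dt
        deriv = -(x - prev_x) / dt               # derivative on measurement (no setpoint kick)
        u = kp * e + ki * integ + kd * deriv
        prev_x = x
        a = 4.0 * u - 0.4 * v - 4.0 * x          # plant: x'' + 0.4 x' + 4 x = 4 u
        v += a * dt
        x += v * dt
        cost += t * abs(e) * dt                  # ITAE criterion
    return cost if np.isfinite(cost) else 1e9

def pso(cost, bounds, n_particles=20, n_iter=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([cost(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest, pbest_f.min()

gains, best = pso(itae_cost, bounds=[(0, 20), (0, 10), (0, 5)])
print("Kp, Ki, Kd =", gains, "ITAE =", best)
```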
Procedia PDF Downloads 83
8576 Space Telemetry Anomaly Detection Based On Statistical PCA Algorithm
Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar
Abstract:
The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, there is continuous improvement of TM monitoring systems in order to reduce the time required to respond to changes in a satellite's state of health. A fast conception of the current state of the satellite is thus very important in order to respond to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle the aforementioned problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper, we present a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between these operating conditions. Furthermore, the algorithm provides competent information for prediction as well as adding more insight and physical interpretation to the ADCS operation.
Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations
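As a sketch of the general idea (not the authors' code), the snippet below fits a PCA model to nominal telemetry and flags new samples whose squared prediction error exceeds an empirical control limit; the component count and quantile are illustrative.

```python
import numpy as np

class PCAMonitor:
    """Train a PCA model on nominal telemetry and flag samples whose
    squared prediction error (SPE / Q statistic) exceeds a threshold."""

    def fit(self, X_nominal, n_components=3, quantile=0.99):
        self.mean = X_nominal.mean(axis=0)
        self.std = X_nominal.std(axis=0) + 1e-12
        Z = (X_nominal - self.mean) / self.std
        # principal directions from the SVD of the standardized nominal data
        _, _, vt = np.linalg.svd(Z, full_matrices=False)
        self.P = vt[:n_components].T                  # retained loadings
        spe = self._spe(Z)
        self.threshold = np.quantile(spe, quantile)   # empirical control limit
        return self

    def _spe(self, Z):
        residual = Z - Z @ self.P @ self.P.T          # part not explained by the model
        return (residual ** 2).sum(axis=1)

    def is_anomalous(self, X):
        Z = (X - self.mean) / self.std
        return self._spe(Z) > self.threshold

# usage sketch (arrays are placeholders for ADCS telemetry matrices):
# monitor = PCAMonitor().fit(normal_adcs_data)
# flags = monitor.is_anomalous(new_adcs_data)
```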
Procedia PDF Downloads 415
8575 Motion Estimator Architecture with Optimized Number of Processing Elements for High Efficiency Video Coding
Authors: Seongsoo Lee
Abstract:
Motion estimation occupies the heaviest computation in HEVC (high efficiency video coding). Many fast algorithms such as TZS (test zone search) have been proposed to reduce the computation. Still, the huge computation of motion estimation is a critical issue in the implementation of an HEVC video codec. In this paper, a motion estimator architecture with an optimized number of PEs (processing elements) is presented by exploiting early termination. It also reduces hardware size by exploiting parallel processing. The presented motion estimator architecture has 8 PEs, and it can efficiently perform TZS with very high utilization of the PEs.
Keywords: motion estimation, test zone search, high efficiency video coding, processing element, optimization
Procedia PDF Downloads 363
8574 A Fast Version of the Generalized Multi-Directional Radon Transform
Authors: Ines Elouedi, Atef Hammouda
Abstract:
This paper presents a new fast version of the generalized multi-directional Radon transform method. The new method uses the inverse Fast Fourier Transform to obtain the generalized Radon projections faster. We prove in this paper that the fast algorithm leads to almost the same results as the original one but at a considerably lower computation cost. The end result of the fast method's projection is a parameterized Radon space where a high-valued pixel allows the detection of a curve in the original image. The proposed fast inversion algorithm leads to an exact reconstruction of the initial image from the Radon space. We show examples of the impact of this algorithm on the pattern recognition domain.
Keywords: fast generalized multi-directional Radon transform, curve, exact reconstruction, pattern recognition
Procedia PDF Downloads 278
8573 Reducing Total Harmonic Content of 9-Level Inverter by Use of Cuckoo Algorithm
Authors: Mahmoud Enayati, Sirous Mohammadi
Abstract:
In this paper, a novel procedure to find the firing angles of multilevel inverters and, consequently, to reduce the total harmonic distortion (THD) of the supply voltage is presented. In order to eliminate more harmonics in a multilevel inverter, the number of levels can be changed, or a pulse width modulation waveform, in which more than one switching occurs in each level, can be used. Both cases complicate the non-algebraic equations, and their solution cannot be obtained by the conventional methods for the numerical solution of nonlinear equations, such as the Newton-Raphson method. In this paper, the Cuckoo algorithm is used to compute the optimal firing angles of the pulse width modulation voltage waveform in the multilevel inverter. These angles should be calculated in such a way that the desired voltage amplitude of the fundamental frequency is generated while the total harmonic distortion of the output voltage remains small. The simulation and theoretical results for the 9-level inverter show the high applicability of the proposed algorithm for identifying suitable firing angles that reduce the low-order harmonics and generate a waveform whose total harmonic distortion is very small, i.e., almost a sinusoidal waveform.
Keywords: evolutionary algorithms, multilevel inverters, total harmonic content, Cuckoo algorithm
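A simplified sketch of the approach, assuming the standard staircase-harmonic expression for a cascaded 9-level inverter (four firing angles per quarter wave); the modulation-index target, penalty weight and the pared-down cuckoo search below are illustrative assumptions rather than the authors' settings.

```python
import numpy as np
from math import gamma, pi, sin

S = 4                 # a 9-level cascaded inverter has 4 firing angles per quarter wave
TARGET_M = 0.8        # desired per-unit fundamental (illustrative value)

def thd_objective(theta, n_max=49):
    """Penalized THD of the staircase waveform: odd harmonics of a cascaded
    multilevel inverter are proportional to (1/n) * sum(cos(n*theta_i))."""
    theta = np.sort(theta)
    if theta[0] <= 0 or theta[-1] >= pi / 2:
        return 1e6
    n = np.arange(3, n_max + 1, 2)
    v1 = np.cos(theta).sum()
    vn = np.cos(np.outer(n, theta)).sum(axis=1) / n
    thd = np.sqrt((vn ** 2).sum()) / v1
    return thd + 10.0 * abs(v1 / S - TARGET_M)   # penalty keeps the fundamental on target

def levy(size, beta=1.5, rng=None):
    # Mantegna's algorithm for Levy-distributed step lengths
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    return rng.normal(0, sigma, size) / np.abs(rng.normal(0, 1, size)) ** (1 / beta)

def cuckoo_search(obj, n_nests=25, n_iter=500, pa=0.25, seed=0):
    rng = np.random.default_rng(seed)
    nests = rng.uniform(0.01, pi / 2 - 0.01, (n_nests, S))
    fitness = np.array([obj(x) for x in nests])
    best = nests[np.argmin(fitness)].copy()
    for _ in range(n_iter):
        # new solutions via Levy flights biased toward the current best nest
        new = np.clip(nests + 0.01 * levy((n_nests, S), rng=rng) * (nests - best),
                      1e-3, pi / 2 - 1e-3)
        f_new = np.array([obj(x) for x in new])
        improved = f_new < fitness
        nests[improved], fitness[improved] = new[improved], f_new[improved]
        # abandon a fraction pa of nests and rebuild them at random
        abandon = rng.random(n_nests) < pa
        nests[abandon] = rng.uniform(0.01, pi / 2 - 0.01, (abandon.sum(), S))
        fitness[abandon] = [obj(x) for x in nests[abandon]]
        best = nests[np.argmin(fitness)].copy()
    return np.sort(best), fitness.min()

angles, score = cuckoo_search(thd_objective)
print("firing angles (rad):", angles)
```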
Procedia PDF Downloads 532
8572 Overcoming the Problems Affecting Drip Irrigation System through the Design of an Efficient Filtration and Flushing System
Authors: Stephen A. Akinlabi, Esther T. Akinlabi
Abstract:
The drip irrigation system is one of the important areas that directly affect the livelihood of farmers. Drip irrigation has been the most efficient system compared to other types of irrigation systems because it helps to save water and increase the productivity of crops. But like any other system, it can be considered inefficient when the filters and the emitters get clogged while in operation. The efficiency of the entire system is reduced when the emitters are clogged and blocked. This consequently affects farm operations, which may result in scarcity of farm products and increased demand. This design work focuses on how to overcome some of the challenges affecting the drip irrigation system through the design of an efficient filtration and flushing system.
Keywords: drip irrigation system, filters, soil texture, mechanical engineering design, analysis
Procedia PDF Downloads 383
8571 A Case Study of Bee Algorithm for Ready Mixed Concrete Problem
Authors: Wuthichai Wongthatsanekorn, Nuntana Matheekrieangkrai
Abstract:
This research proposes the Bee Algorithm (BA) to optimize the Ready Mixed Concrete (RMC) truck scheduling problem from a single batch plant to multiple construction sites. This problem is considered an NP-hard constrained combinatorial optimization problem. This paper provides the details of the RMC dispatching process and its related constraints. BA was then developed to minimize the total waiting time of RMC trucks while satisfying all constraints. The performance of BA is then evaluated on two benchmark problems (3 and 5 construction sites) from previous research. The simulation results of BA are compared in terms of efficiency and accuracy with a Genetic Algorithm (GA), and all problems show that the BA approach outperforms GA in efficiency and accuracy in obtaining the optimal solution. Hence, the BA approach could be practically implemented to obtain the best schedule.
Keywords: bee colony optimization, ready mixed concrete problem, truck scheduling, multiple construction sites
Procedia PDF Downloads 385
8570 GA3C for Anomalous Radiation Source Detection
Authors: Chia-Yi Liu, Bo-Bin Xiao, Wen-Bin Lin, Hsiang-Ning Wu, Liang-Hsun Huang
Abstract:
In order to reduce the risk of radiation damage that personnel may suffer during operations in a radiation environment, the use of automated guided vehicles to assist or replace on-site personnel in the radiation environment has become a key technology and an important trend. In this paper, we demonstrate our proof of concept for an autonomous, self-learning radiation source searcher in an unknown environment without a map. The research uses the GPU version of the Asynchronous Advantage Actor-Critic network (GA3C) of deep reinforcement learning to search for radiation sources. The searcher network, based on the GA3C architecture, has learned in a self-directed manner and improved how to search for the anomalous radiation source by training for 1 million episodes in three simulation environments. In each training episode, the radiation source position, the radiation source intensity and the starting position are all set randomly in one simulation environment. The input to the searcher network is the fused data from a 2D laser scanner and an RGB-D camera as well as the value of the radiation detector. The output actions are the linear and angular velocities. The searcher network is trained in a simulation environment to accelerate the learning process. The well-performing searcher network is deployed on the real unmanned vehicle, Dashgo E2, which mounts a YDLIDAR G4 LIDAR, an Intel D455 RGB-D camera, and a radiation detector made by the Institute of Nuclear Energy Research. In the field experiment, the unmanned vehicle is able to search out the 18.5 MBq Na-22 radiation source by itself and avoid obstacles simultaneously without human interference.
Keywords: deep reinforcement learning, GA3C, source searching, source detection
Procedia PDF Downloads 114
8569 A Hybrid Classical-Quantum Algorithm for Boundary Integral Equations of Scattering Theory
Authors: Damir Latypov
Abstract:
A hybrid classical-quantum algorithm to solve boundary integral equations (BIE) arising in problems of electromagnetic and acoustic scattering is proposed. The quantum speed-up is due to a Quantum Linear System Algorithm (QLSA). The original QLSA of Harrow et al. provides an exponential speed-up over the best-known classical algorithms, but only in the case of sparse systems. Due to the non-local nature of integral operators, matrices arising from the discretization of BIEs are, however, dense. A QLSA for dense matrices was introduced in 2017. Its runtime as a function of the system's size N is bounded by O(√N polylog(N)). The runtime of the best-known classical algorithm for an arbitrary dense matrix scales as O(N².³⁷³). Instead of the exponential speed-up obtained for sparse matrices, here we have only a polynomial speed-up. Nevertheless, the sufficiently high power of this polynomial, ~4.7, should make QLSA an appealing alternative. Unfortunately for the QLSA, the asymptotic separability of the Green's function leads to high compressibility of the BIE matrices. Classical fast algorithms such as the Multilevel Fast Multipole Method (MLFMM) take advantage of this fact and reduce the runtime to O(N log(N)), i.e., the QLSA is only quadratically faster than the MLFMM. To be truly impactful for computational electromagnetics and acoustics engineers, the QLSA must provide a more substantial advantage than that. We propose a computational scheme which combines elements of the classical fast algorithms with the QLSA to achieve the required performance.
Keywords: quantum linear system algorithm, boundary integral equations, dense matrices, electromagnetic scattering theory
Procedia PDF Downloads 154
8568 Farmers’ Use of Indigenous Knowledge System (IKS) for Selected Arable Crops Production in Ondo State
Authors: A. M. Omoare, E. O. Fakoya
Abstract:
This study sought to determine the use of indigenous knowledge for selected arable crops production in Ondo State. A multistage sampling method was used and 112 arable crop farmers were systematically selected. Data were analyzed using both descriptive and inferential statistics. The results showed that the majority of the sampled farmers were male (75.90%). About 75% were married with children. A large proportion of them (62.61%) were within the ages of 30-49 years. Most of them had spent about 10 years in farming (58.92%). The highest raw scores of use of indigenous knowledge were found in planting on mounds in yam production, use of native medicine and the scarecrow method in controlling birds in rice production, timely planting of locally developed resistant varieties in cassava production and soaking of maize seeds in water to determine their viability, with raw scores of 313, 310, 305, 303, and 300 respectively, while the lowest raw score was obtained in use of the bell method in controlling birds in rice production, with a raw score of 210. The findings established that proverbs (59.8%) and taboos (55.36%) were the most commonly used media in transmitting indigenous knowledge by arable crop farmers. The multiple regression analysis revealed that the age of the farmers and farming experience had a significant relationship with the use of indigenous knowledge, giving R2 = 0.83 for the semi-log functional form of the equation, which is the lead equation. The policy implication is that indigenous knowledge should provide a basis for designing modern technologies to enhance sustainable agricultural development.
Keywords: arable crop production, extent of use, indigenous knowledge, farming experience
Procedia PDF Downloads 571
8567 Energy Efficient Alternate Hydraulic System Called TejHydroLift
Authors: Tejinder Singh
Abstract:
This paper describes a new, more efficient hydraulic system which uses less work to produce more output. Conventional hydraulic systems like hydraulic lifts and rams require a lot of water to be pumped to produce output. TejHydroLift produces an equal amount of force with a smaller input of water. The paper shows that the applied force can be increased manifold without requiring a smaller force to be moved over a greater distance, as used to be required in conventional hydraulic lifts. The paper describes one of the configurations of the TejHydroLift system, called the "Slim Antenna TejHydroLift Configuration". The TejHydroLift uses less water and hence demands less work to move the same load.
Keywords: alternate, hydraulic system, efficient, TejHydroLift
Procedia PDF Downloads 260
8566 Introduction to Multi-Agent Deep Deterministic Policy Gradient
Authors: Xu Jie
Abstract:
As a key network security method, cryptographic services must fully cope with problems such as the wide variety of cryptographic algorithms, high concurrency requirements, random job crossovers, and instantaneous surges in workloads. Their complexity and dynamics also make it difficult for traditional static security policies to cope with ever-changing cyber threats and environments. Traditional resource scheduling algorithms are inadequate when facing complex decision-making problems in dynamic environments. A network cryptographic resource allocation algorithm based on reinforcement learning is proposed, aiming to optimize task energy consumption, migration cost, and the fitness of differentiated services (including user, data, and task security). By modeling the multi-job collaborative cryptographic service scheduling problem as a multi-objective optimized job flow scheduling problem, and using a multi-agent reinforcement learning method, efficient scheduling and optimal configuration of cryptographic service resources are achieved. By introducing reinforcement learning, resource allocation strategies can be adjusted in real time in a dynamic environment, improving resource utilization and achieving load balancing. Experimental results show that this algorithm has significant advantages in path planning length, system delay and network load balancing, and effectively solves the problem of complex resource scheduling in cryptographic services.
Keywords: multi-agent reinforcement learning, non-stationary dynamics, multi-agent systems, cooperative and competitive agents
Procedia PDF Downloads 23
8565 Comparison of Crossover Types to Obtain Optimal Queries Using Adaptive Genetic Algorithm
Authors: Wafa’ Alma'Aitah, Khaled Almakadmeh
Abstract:
This study presents an information retrieval system that uses a genetic algorithm to increase information retrieval efficiency. Using the vector space model, information retrieval is based on the similarity measurement between a query and documents. Documents with high similarity to the query are judged more relevant to the query and should be retrieved first. Using genetic algorithms, each query is represented by a chromosome; these chromosomes are fed into the genetic operator process of selection, crossover, and mutation until an optimized query chromosome is obtained for document retrieval. Results show that information retrieval with adaptive crossover probability, single-point crossover, and roulette-wheel selection gives the highest recall. The proposed approach is verified using 242 proceedings abstracts collected from a Saudi Arabian national conference.
Keywords: genetic algorithm, information retrieval, optimal queries, crossover
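A hedged sketch of the GA loop described above: chromosomes are query term-weight vectors, selection is roulette-wheel and crossover is single-point, as in the abstract; the recall-at-k fitness and the fixed (rather than adaptive) crossover probability are simplifications for illustration.

```python
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def recall_fitness(query_vec, doc_vectors, relevant, top_k=10):
    """Fraction of relevant documents retrieved in the top-k ranking."""
    scores = [cosine(query_vec, d) for d in doc_vectors]
    ranked = np.argsort(scores)[::-1][:top_k]
    return len(set(ranked) & relevant) / max(len(relevant), 1)

def ga_optimize_query(q0, doc_vectors, relevant, pop_size=30, generations=50,
                      pc=0.8, pm=0.05, seed=0):
    rng = np.random.default_rng(seed)
    dim = len(q0)
    # each chromosome is a real-valued term-weight vector seeded around the query
    pop = np.abs(q0 + 0.1 * rng.standard_normal((pop_size, dim)))
    for _ in range(generations):
        fit = np.array([recall_fitness(c, doc_vectors, relevant) for c in pop])
        probs = (fit + 1e-9) / (fit + 1e-9).sum()      # roulette-wheel selection
        parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):
            if rng.random() < pc:                      # single-point crossover
                cut = rng.integers(1, dim)
                children[i, cut:], children[i + 1, cut:] = \
                    parents[i + 1, cut:].copy(), parents[i, cut:].copy()
        mutate = rng.random(children.shape) < pm       # mutation: small perturbation
        children[mutate] += 0.1 * rng.standard_normal(mutate.sum())
        pop = np.abs(children)
    fit = np.array([recall_fitness(c, doc_vectors, relevant) for c in pop])
    return pop[np.argmax(fit)]
```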
Procedia PDF Downloads 292
8564 Application of the Discrete Rationalized Haar Transform to Distributed Parameter System
Authors: Joon-Hoon Park
Abstract:
In this paper, the rationalized Haar transform is applied to distributed parameter system identification and estimation. A distributed parameter system is a dynamical and mathematical model described by a partial differential equation. System identification concerns the problem of determining mathematical models from observed data. The Haar function has some computational disadvantages because it contains irrational numbers; for this reason, the rationalized Haar function, which contains only rational numbers, is used. The algorithm adopted in this paper is based on the transform and the operational matrix of the rationalized Haar function. This approach provides more convenient and efficient computational results.
Keywords: distributed parameter system, rationalized Haar transform, operational matrix, system identification
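For reference, the sketch below builds the standard rationalized Haar matrix, whose entries are only 0 and ±1, and shows a forward transform and its inversion via the row norms; the operational-matrix machinery for the identification problem itself is not reproduced.

```python
import numpy as np

def rationalized_haar(m):
    """Rationalized Haar matrix of order m (m must be a power of two).
    Entries are only 0 and +/-1, avoiding the irrational scale factors of the
    classical Haar matrix."""
    assert m >= 1 and m & (m - 1) == 0, "order must be a power of two"
    rh = np.array([[1]])
    while rh.shape[0] < m:
        n = rh.shape[0]
        top = np.kron(rh, [1, 1])                            # dilated (coarse) functions
        bottom = np.kron(np.eye(n, dtype=int), [1, -1])      # shifted difference functions
        rh = np.vstack([top, bottom])
    return rh

rh = rationalized_haar(8)
f = np.arange(8, dtype=float)                # a sampled signal
c = rh @ f                                   # forward transform coefficients
f_back = (rh.T / (rh ** 2).sum(axis=1)) @ c  # rows are orthogonal, so invert by row norms
print(np.allclose(f, f_back))                # True
```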
Procedia PDF Downloads 509
8563 The Characteristics of a Fair and Efficient Tax Auditing Information System as a Tool against Tax Evasion: A Theoretical Framework
Authors: Dimitris Balios, Stefanos Tantos
Abstract:
Economic growth and social evolution are connected to trust relationships in a society. The quality of the accounting information, the tax information system and the tax audit mechanism yield multiple benefits in an economy. Tax evasion, the illegal practice where people and companies do not pay taxes, is a crime because of its negative effect on the economy and society. In this paper, we describe a theoretical framework on the characteristics of a fair and efficient tax auditing information system, which could be a tool against tax evasion and a tool for an economy to grow, especially in countries that face fluctuations in economic activity. We conclude that a fair and efficient tax auditing information system increases the reliability of tax administration, improves taxpayers' tax compliance and sets the economy on a developmental trajectory.
Keywords: auditing information system, auditing mechanism, tax evasion, taxation
Procedia PDF Downloads 154
8562 Classification Rule Discovery by Using Parallel Ant Colony Optimization
Authors: Waseem Shahzad, Ayesha Tahir Khan, Hamid Hussain Awan
Abstract:
The Ant-Miner algorithm, which belongs to the family of ACO algorithms, is used to extract knowledge from data in the form of rules. A variant of the Ant-Miner algorithm named cAnt-MinerPB is used to generate a list of rules using the Pittsburgh approach in order to maintain the interaction among the rules that are generated. In this paper, we propose a parallel cAnt-MinerPB in which the ant colony optimization algorithm runs in parallel. In this technique, a data set is divided vertically (i.e., by attributes) into different subsets. These subsets are created based on the correlation among attributes using Mutual Information (MI). Rules are generated in a parallel manner and then merged to form a final list of rules. The results have shown that the proposed technique achieves higher accuracy when compared with the original cAnt-MinerPB, and the execution time is also reduced.
Keywords: ant colony optimization, parallel Ant-MinerPB, vertical partitioning, classification rule discovery
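The abstract does not detail how the MI-based subsets are formed; the sketch below is one plausible illustration that clusters attributes by their pairwise mutual-information profiles, assuming discrete attributes, and is not the authors' scheme.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import mutual_info_classif

def vertical_partitions(X, n_subsets=4, seed=0):
    """Group attributes into vertical subsets by clustering their pairwise
    mutual-information profiles, so each parallel cAnt-MinerPB worker mines
    rules on a correlated group of attributes (illustrative scheme; attributes
    are assumed to be discrete/nominal, as Ant-Miner expects)."""
    n_attr = X.shape[1]
    mi = np.zeros((n_attr, n_attr))
    for j in range(n_attr):
        # MI of every attribute against attribute j (treated as the target)
        mi[:, j] = mutual_info_classif(X, X[:, j].astype(int),
                                       discrete_features=True, random_state=seed)
    mi = (mi + mi.T) / 2                       # symmetrize the estimates
    labels = KMeans(n_clusters=n_subsets, n_init=10,
                    random_state=seed).fit_predict(mi)
    return [np.where(labels == g)[0] for g in range(n_subsets)]

# each returned index set would then be mined by one parallel worker:
# subsets = vertical_partitions(X_train)
```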
Procedia PDF Downloads 295
8561 Optimization Analysis of a Concentric Tube Heat Exchanger with Field Synergy Principle
Abstract:
The paper investigates optimization analysis of the heat exchanger design, mainly with the response surface method and a genetic algorithm, to explore the relationship between the optimal fluid flow velocity and the temperature of the heat exchanger using the field synergy principle. First, the finite volume method is used to calculate the flow temperature and flow rate distribution for numerical analysis. We identify the most suitable simulation equations by response surface methodology. Furthermore, a genetic algorithm approach is applied to optimize the relationship between the fluid flow velocity and the flow temperature of the heat exchanger. The results show that the field synergy angle plays a vital role in the performance of a true heat exchanger.
Keywords: optimization analysis, field synergy, heat exchanger, genetic algorithm
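A small sketch of how the field synergy angle mentioned above is evaluated from velocity and temperature fields on a grid; the analytic test fields are placeholders for the finite-volume CFD solution.

```python
import numpy as np

def field_synergy_angle(u, v, T, dx, dy):
    """Local field synergy angle (degrees) between the velocity vector (u, v)
    and the temperature gradient: beta = arccos(U.gradT / (|U| |gradT|))."""
    dTdy, dTdx = np.gradient(T, dy, dx)          # derivatives along axis 0 (y) and axis 1 (x)
    dot = u * dTdx + v * dTdy
    mag = np.hypot(u, v) * np.hypot(dTdx, dTdy) + 1e-12
    return np.degrees(np.arccos(np.clip(dot / mag, -1.0, 1.0)))

# check on an analytic field: uniform flow in +x over a temperature field rising
# in +x gives a synergy angle of about 0 degrees everywhere (best synergy).
x, y = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
beta = field_synergy_angle(u=np.ones_like(x), v=np.zeros_like(x), T=x, dx=1/49, dy=1/49)
print(beta.mean())   # ~0
```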
Procedia PDF Downloads 307