Search results for: ant colony algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2151

1971 Heterogeneous Dimensional Super Resolution of 3D CT Scans Using Transformers

Authors: Helen Zhang

Abstract:

Accurate segmentation of the airways from CT scans is crucial for early diagnosis of lung cancer. However, the existing airway segmentation algorithms often rely on thin-slice CT scans, which can be inconvenient and costly. This paper presents a set of machine learning-based 3D super-resolution algorithms along heterogeneous dimensions to improve the resolution of thicker CT scans to reduce the reliance on thin-slice scans. To evaluate the efficacy of the super-resolution algorithms, quantitative assessments using PSNR (Peak Signal to Noise Ratio) and SSIM (Structural SIMilarity index) were performed. The impact of super-resolution on airway segmentation accuracy is also studied. The proposed approach has the potential to make airway segmentation more accessible and affordable, thereby facilitating early diagnosis and treatment of lung cancer.
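
For readers who want to reproduce the kind of quantitative assessment described above, the following is a minimal sketch of computing PSNR and SSIM with scikit-image; the volume shapes and intensity scaling are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Hypothetical volumes: a reference thin-slice CT volume and a super-resolved
# reconstruction of the same scan, both scaled to [0, 1].
rng = np.random.default_rng(0)
reference = rng.random((64, 128, 128)).astype(np.float32)
super_resolved = np.clip(
    reference + 0.05 * rng.standard_normal(reference.shape), 0, 1
).astype(np.float32)

# PSNR over the whole volume.
psnr = peak_signal_noise_ratio(reference, super_resolved, data_range=1.0)

# SSIM computed slice by slice along the through-plane axis and averaged.
ssim_per_slice = [
    structural_similarity(reference[k], super_resolved[k], data_range=1.0)
    for k in range(reference.shape[0])
]
print(f"PSNR: {psnr:.2f} dB, mean SSIM: {np.mean(ssim_per_slice):.3f}")
```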

Keywords: 3D super-resolution, airway segmentation, thin-slice CT scans, machine learning

Procedia PDF Downloads 80
1970 Host Preference, Impact of Host Transfer and Insecticide Susceptibility among Aphis gossypii Group (Order: Hemiptera) in Jamaica

Authors: Desireina Delancy, Tannice Hall, Eric Garraway, Dwight Robinson

Abstract:

Aphis gossypii, as a pest, damages its host plant directly by extracting phloem sap (sucking) and indirectly by transmitting viruses, ultimately affecting the yield of the host. Due to its polyphagous nature, this species affects a wide range of host plants, some of which may serve as reservoirs for the colonisation of important crops. In Jamaica, there have been outbreaks of viral plant pathogens transmitted by Aphis gossypii; three examples are Citrus tristeza virus, Watermelon mosaic virus, and Papaya ringspot virus. Aphis gossypii also heavily colonizes economically significant host plants, including pepper, eggplant, watermelon, cucumber, and hibiscus. To facilitate integrated pest management, it is imperative to understand the biology of the aphid and its host preference. Preliminary work in Jamaica has indicated differences in biology and host preference, as well as host variety within the species. However, specific details of the fecundity, colony growth, host preference, distribution, and insecticide resistance of Aphis gossypii were, to the best of our knowledge, unknown. The aim was to investigate the following in relation to Aphis gossypii: the influence of the host plant on colonization, life span, fecundity, population size, and morphology; the impact of host transfer on fecundity and population size, as a measure of host preference and host transfer success; and susceptibility to four commonly used insecticides. Fecundity and colony size were documented daily for aphids acclimatized on Capsicum chinense Jacquin 1776, Cucumis sativus Linnaeus 1630, Gossypium hirsutum Linnaeus 1751 and Abelmoschus esculentus (L.) Moench 1794 for three generations. The same measures were used after third-instar aphids were transferred among the hosts as a measure of host suitability and transfer success. Mortality, and the fecundity of survivors, were determined after aphids were exposed to varying concentrations of Actara®, Diazinon™, Karate Zeon®, and Pegasus®. Host preference results indicated that, over a 24-day period, Aphis gossypii reached its largest colony size on G. hirsutum (x̄ 381.80), with January to February being the most fecund period. Host transfer experiments were all significantly different, with the greatest difference occurring for transfers from C. chinense to C. sativus (p < 0.05). Colony sizes were found to increase significantly every 5 days, which has implications for the regimes implemented to monitor and evaluate plots. Ranked by lethality, the insecticides were Karate Zeon® > Actara® > Pegasus® > Diazinon™. The highest LC50 values were obtained with Pegasus® for aphids on G. hirsutum and C. chinense, and with Diazinon™ for those on C. sativus. Survivors of insecticide treatments had colony sizes that were, on average, 98% smaller than those of untreated aphids. Cotton was preferred both in the field and in the glasshouse; it was on cotton that the aphids settled first, had the highest fecundity, and showed the lowest mortality. Cotton can therefore serve as a reservoir for (re)populating other cotton or different host species through migration driven by overcrowding, heavy showers, high wind, or ant attendance. Host transfer success between all three hosts is highly probable within an intercropping system, and survivors of insecticide treatments can successfully repopulate host plants.

Keywords: Aphis gossypii, host-plant preference, colonization sequence, host transfers, insecticide susceptibility

Procedia PDF Downloads 54
1969 Virtual 3D Environments for Image-Based Navigation Algorithms

Authors: V. B. Bastos, M. P. Lima, P. R. G. Kurka

Abstract:

This paper addresses the creation of virtual 3D environments for the study and development of mobile robot image-based navigation algorithms and techniques, which need to operate robustly and efficiently. These algorithms can be tested physically, by conducting experiments on a prototype, or by numerical simulation. Current simulation platforms for robotic applications do not have flexible and up-to-date models for image rendering and are unable to reproduce complex light effects and materials. Thus, it is necessary to create a test platform that integrates sophisticated simulated representations of real environments for navigation with data and image processing. This work proposes the development of a high-level platform for building 3D model environments and testing image-based navigation algorithms for mobile robots. Texture and lighting effects were applied so that the rendered images accurately represent their real-world counterparts. The application integrates image processing scripts, trajectory control, dynamic modeling, and simulation techniques for physics representation and picture rendering with the open-source 3D creation suite Blender.

Keywords: simulation, visual navigation, mobile robot, data visualization

Procedia PDF Downloads 226
1968 Transfer Knowledge From Multiple Source Problems to a Target Problem in Genetic Algorithm

Authors: Terence Soule, Tami Al Ghamdi

Abstract:

To study how knowledge can be transferred from multiple source problems to a target problem, we modeled the Transfer Learning (TL) process using Genetic Algorithms as the model solver. TL is the process of transferring learned data from one problem to another in order to help Machine Learning (ML) algorithms find a solution. Genetic Algorithms (GA) give researchers access to the information we have about how an old problem was solved. In this paper, we use five different source problems and transfer their knowledge to the target problem, studying different scenarios of the target problem. The results showed that combining knowledge from multiple source problems improves GA performance. In addition, the process of combining knowledge from several problems promotes diversity in the transferred population.
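
A minimal sketch of the knowledge-transfer idea follows: the best individuals from previously solved source problems seed part of the target problem's initial GA population, and the rest is filled randomly to preserve diversity. The bit-string encoding, OneMax-style fitness, and operator settings are assumptions for illustration, not the authors' experimental setup.

```python
import random

random.seed(1)
GENOME_LEN, POP_SIZE, GENERATIONS = 40, 60, 100

def fitness(ind):
    # Placeholder target problem: maximize the number of 1s (OneMax).
    return sum(ind)

def random_individual():
    return [random.randint(0, 1) for _ in range(GENOME_LEN)]

def seeded_population(source_solutions, fraction=0.5):
    # Transfer step: copy individuals evolved on the source problems,
    # then fill the rest of the population randomly to keep diversity.
    seeds = [list(s) for s in source_solutions][: int(POP_SIZE * fraction)]
    return seeds + [random_individual() for _ in range(POP_SIZE - len(seeds))]

def evolve(population):
    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[: POP_SIZE // 2]
        children = []
        while len(children) < POP_SIZE - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENOME_LEN)
            child = a[:cut] + b[cut:]          # one-point crossover
            if random.random() < 0.1:          # bit-flip mutation
                i = random.randrange(GENOME_LEN)
                child[i] ^= 1
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

# Hypothetical best individuals taken from five previously solved source problems.
source_best = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(5)]
best = evolve(seeded_population(source_best))
print("best fitness with transferred knowledge:", fitness(best))
```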

Keywords: transfer learning, genetic algorithm, evolutionary computation, source and target

Procedia PDF Downloads 112
1967 Design and Performance Analysis of Resource Management Algorithms in Response to Emergency and Disaster Situations

Authors: Volkan Uygun, H. Birkan Yilmaz, Tuna Tugcu

Abstract:

This study focuses on the development and use of algorithms that address the issue of resource management in response to emergency and disaster situations. The presented system, named Disaster Management Platform (DMP), takes the data from the data sources of service providers and distributes the incoming requests accordingly, both to manage load balancing and to minimize service time, which results in improved user satisfaction. Three resource management algorithms, which give different levels of importance to load balancing and service time, are proposed in the study. The first is the Minimum Distance algorithm, which assigns the request to the closest resource. The second is the Minimum Load algorithm, which assigns the request to the resource with the minimum load. The last is the Hybrid algorithm, which combines the previous two approaches. The performance of the proposed algorithms is evaluated with respect to waiting time, success ratio, and maximum load ratio. The metrics are monitored in simulations to find the optimal scheme for different loads. Two different simulations are performed in the study, one time-based and the other lambda-based. The results indicate that the Minimum Load algorithm is generally the best in all metrics, whereas the Minimum Distance algorithm is the worst in all cases and all metrics. The leading position in performance switches between the Minimum Distance and the Hybrid algorithms as lambda values change.
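
The three dispatching rules can be sketched in a few lines of Python; the resource locations, load counts, and field names below are illustrative assumptions, not the DMP data model.

```python
import math

# Hypothetical resources: location and number of requests already assigned.
resources = [
    {"name": "team_A", "x": 0.0, "y": 0.0, "load": 3},
    {"name": "team_B", "x": 5.0, "y": 1.0, "load": 1},
    {"name": "team_C", "x": 2.0, "y": 4.0, "load": 2},
]

def distance(res, request):
    return math.hypot(res["x"] - request["x"], res["y"] - request["y"])

def minimum_distance(request):
    # Assign the request to the closest resource.
    return min(resources, key=lambda r: distance(r, request))

def minimum_load(request):
    # Assign the request to the least loaded resource.
    return min(resources, key=lambda r: r["load"])

def hybrid(request, w=0.5):
    # Weighted combination of normalized distance and normalized load.
    max_d = max(distance(r, request) for r in resources) or 1.0
    max_l = max(r["load"] for r in resources) or 1
    return min(
        resources,
        key=lambda r: w * distance(r, request) / max_d + (1 - w) * r["load"] / max_l,
    )

request = {"x": 4.0, "y": 0.5}
for rule in (minimum_distance, minimum_load, hybrid):
    chosen = rule(request)
    print(rule.__name__, "->", chosen["name"])
    chosen["load"] += 1  # update the load after assignment
```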

Keywords: emergency and disaster response, resource management algorithm, disaster situations, disaster management platform

Procedia PDF Downloads 313
1966 Effect of Personality Traits on Classification of Political Orientation

Authors: Vesile Evrim, Aliyu Awwal

Abstract:

Today, as in other domains, there is an enormous number of political transcripts available on the Web waiting to be mined and used for various purposes such as statistics and recommendations. Therefore, automatically determining the political orientation of these transcripts becomes crucial. The methodologies used by machine learning algorithms for this automatic classification are based on different features, such as linguistic ones. Considering the ideological differences between Liberals and Conservatives, this paper studies the effect of personality traits on political orientation classification. This is done by considering the correlation between LIWC features and the Big Five personality traits. Several experiments are conducted on the Convote U.S. Congressional-Speech dataset with seven benchmark classification algorithms. The different methodologies are applied to select feature sets of 8 to 64 features. Neuroticism is found to be the most discriminating personality trait for the classification of political polarity, and when its top 10 representative features are combined with several classification algorithms, the results outperform those presented in previous research.
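
The following sketch illustrates the general procedure under stated assumptions: rank LIWC-style features by their correlation with a personality-trait score, keep the top ten, and evaluate a benchmark classifier on them. The synthetic arrays stand in for the Convote features and Big Five scores, which are not reproduced here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_docs, n_features = 400, 64

# Hypothetical LIWC-style feature matrix, a stand-in Neuroticism score,
# and binary political-orientation labels.
X = rng.standard_normal((n_docs, n_features))
trait_score = X[:, :5].sum(axis=1) + 0.3 * rng.standard_normal(n_docs)
y = (trait_score + 0.5 * rng.standard_normal(n_docs) > 0).astype(int)

# Rank features by absolute Pearson correlation with the trait score.
corr = np.array([abs(np.corrcoef(X[:, j], trait_score)[0, 1]) for j in range(n_features)])
top10 = np.argsort(corr)[::-1][:10]

# Evaluate a benchmark classifier on the selected feature subset.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X[:, top10], y, cv=5)
print("selected features:", top10)
print("mean CV accuracy:", scores.mean().round(3))
```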

Keywords: politics, personality traits, LIWC, machine learning

Procedia PDF Downloads 465
1965 A Speeded up Robust Scale-Invariant Feature Transform Currency Recognition Algorithm

Authors: Daliyah S. Aljutaili, Redna A. Almutlaq, Suha A. Alharbi, Dina M. Ibrahim

Abstract:

All currencies around the world look very different from each other; for instance, the size, color, and pattern of the paper differ. With the development of modern banking services, automatic methods for paper currency recognition have become important in many applications, such as vending machines. One of the phases of a currency recognition architecture is feature detection and description. Many algorithms are used for this phase, but they still have some disadvantages. This paper proposes a feature detection algorithm that merges the advantages of the current SIFT and SURF algorithms, which we call the Speeded-up Robust Scale-Invariant Feature Transform (SR-SIFT) algorithm. Our proposed SR-SIFT algorithm overcomes the problems of both the SIFT and SURF algorithms. The proposed algorithm aims to speed up the SIFT feature detection algorithm while keeping it robust. Simulation results demonstrate that the proposed SR-SIFT algorithm decreases the average response time, especially for small and minimum numbers of best key points, and increases the distribution of the best key points over the surface of the currency. Furthermore, compared with the other two algorithms, the proposed algorithm increases the accuracy of the true best-point distribution inside the currency edge.

Keywords: currency recognition, feature detection and description, SIFT algorithm, SURF algorithm, speeded up and robust features

Procedia PDF Downloads 208
1964 Patient-Specific Modeling Algorithm for Medical Data Based on AUC

Authors: Guilherme Ribeiro, Alexandre Oliveira, Antonio Ferreira, Shyam Visweswaran, Gregory Cooper

Abstract:

Patient-specific models are instance-based learning algorithms that take advantage of the particular features of the patient case at hand to predict an outcome. We introduce two patient-specific algorithms based on the decision tree paradigm that use AUC as the metric to select an attribute. We apply the patient-specific algorithms to predict outcomes in several datasets, including medical datasets. Compared to the entropy-based patient-specific decision path (PSDP) and CART methods, the AUC-based patient-specific decision path models performed equivalently on area under the ROC curve (AUC). Our results provide support for patient-specific methods being a promising approach for making clinical predictions.
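
A minimal sketch of the attribute-selection idea, under illustrative assumptions: score each attribute by the AUC obtained when its value is used directly as a predictor, pick the best one, and restrict the training set to cases that match the patient at hand, as one step of a decision path. This is not the authors' PSDP implementation.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, d = 300, 8

# Hypothetical binary attributes and outcome.
X = rng.integers(0, 2, size=(n, d))
y = (X[:, 2] | (X[:, 5] & X[:, 0])).astype(int)

def best_attribute_by_auc(X, y):
    # Score each attribute by the AUC of using its value directly as a predictor.
    aucs = []
    for j in range(X.shape[1]):
        auc = roc_auc_score(y, X[:, j])
        aucs.append(max(auc, 1.0 - auc))  # direction-agnostic
    return int(np.argmax(aucs)), max(aucs)

# Grow one step of a patient-specific decision path for a given patient case:
# choose the attribute, then keep only the training cases that match the patient.
patient = X[0]
attr, auc = best_attribute_by_auc(X, y)
subset = X[:, attr] == patient[attr]
print(f"selected attribute {attr} (AUC={auc:.3f}), matching cases: {subset.sum()}")
```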

Keywords: instance-based approach, area under the ROC curve, patient-specific decision path, clinical predictions

Procedia PDF Downloads 448
1967 An Investigation into Enhancing E-Voting Application Performance

Authors: Aditya Verma

Abstract:

E-voting using blockchain provides a distributed system in which data is present on each node of the network and is reliable and secure due to the blockchain's immutability. This work compares various blockchain consensus algorithms previously used for e-voting applications, based on performance and node scalability, chooses the optimal one, and improves on a previous implementation by proposing solutions for the loopholes of the optimally working blockchain consensus algorithm in our chosen application, e-voting.

Keywords: blockchain, parallel BFT, consensus algorithms, performance

Procedia PDF Downloads 124
1962 Improve Closed Loop Performance and Control Signal Using Evolutionary Algorithms Based PID Controller

Authors: Mehdi Shahbazian, Alireza Aarabi, Mohsen Hadiyan

Abstract:

Proportional-Integral-Derivative (PID) controllers are the most widely used controllers in industry because of their simplicity and robustness. Different values of the PID parameters produce different step responses, so an increasing amount of literature is devoted to the proper tuning of PID controllers. The problem merits further investigation because traditional tuning methods can produce large control signals that may damage the system, whereas evolutionary-algorithm-based tuning methods improve both the control signal and the closed-loop performance. In this paper, three tuning methods for PID controllers are studied: Ziegler-Nichols, which is a traditional tuning method, and two evolutionary-algorithm-based tuning methods, namely the genetic algorithm and particle swarm optimization. To examine the validity of the PSO and GA tuning methods, a comparative analysis on a DC motor plant is carried out. Simulation results reveal that the evolutionary-algorithm-based tuning methods improve the control signal amplitude and the quality factors of the closed-loop system, such as rise time, integral absolute error (IAE), and maximum overshoot.
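
A rough sketch of evolutionary PID tuning is shown below: a bare-bones PSO searches for (Kp, Ki, Kd) gains that minimize the IAE of a unit step response. The first-order plant standing in for the DC motor, the gain bounds, and the PSO coefficients are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, T = 0.01, 5.0
steps = int(T / dt)

def iae_of_pid(gains, K=1.0, tau=0.5):
    # Simulate the unit step response of a first-order plant K/(tau*s + 1),
    # a rough stand-in for a DC motor speed loop, under a discrete PID law.
    kp, ki, kd = gains
    y, integ, prev_err, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(steps):
        err = 1.0 - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        prev_err = err
        y += dt * (-y + K * u) / tau          # explicit Euler step of the plant
        cost += abs(err) * dt                 # integral absolute error (IAE)
        if not np.isfinite(y):                # penalize unstable gain sets
            return np.inf
    return cost

# A bare-bones PSO over (Kp, Ki, Kd).
n_particles, iters = 20, 60
pos = rng.uniform(0.0, 10.0, size=(n_particles, 3))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([iae_of_pid(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 3))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.0, 10.0)
    cost = np.array([iae_of_pid(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("tuned (Kp, Ki, Kd):", gbest.round(2), "IAE:", round(pbest_cost.min(), 4))
```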

Keywords: evolutionary algorithm, genetic algorithm, particle swarm optimization, PID controller

Procedia PDF Downloads 455
1961 An Assessment of Different Blade Tip Timing (BTT) Algorithms Using an Experimentally Validated Finite Element Model Simulator

Authors: Mohamed Mohamed, Philip Bonello, Peter Russhard

Abstract:

Blade Tip Timing (BTT) is a technology concerned with the estimation of both the frequency and amplitude of rotating blades. A BTT system comprises two main parts: (a) the arrival time measurement system, and (b) the analysis algorithms. Simulators play an important role in the development of the analysis algorithms, since they generate blade tip displacement data from simulated blade vibration under controlled conditions. This enables an assessment of the performance of the different algorithms with respect to their ability to accurately reproduce the original simulated vibration. Such an assessment is usually not possible with real engine data, since there is no practical alternative to BTT for blade vibration measurement. Most simulators used in the literature are based on a simple spring-mass-damper model to determine the vibration. In this work, a more realistic, experimentally validated simulator based on the Finite Element (FE) model of a bladed disc (blisk) is first presented. It is then used to generate the necessary data for the assessment of different BTT algorithms. The FE model is validated using a hammer test and two FireWire cameras for the mode shapes. A number of autoregressive methods, fitting methods, and state-of-the-art inverse methods (i.e., the Russhard method) are compared. All methods are compared with respect to both synchronous and asynchronous excitations, with both single and simultaneous frequencies. The study assesses the applicability of each method for different conditions of vibration, amounts of sampling data, and testing facilities, according to its performance and efficiency under these conditions.
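
To illustrate how a BTT simulator turns blade vibration into measurement data, the sketch below uses a simple spring-mass-damper-style response as mentioned above (not the paper's FE blisk model): a sinusoidal tip response is converted into arrival-time deviations at a few probes. All numerical values are assumptions.

```python
import numpy as np

# Rotor and probe setup (illustrative values, not the paper's rig).
rpm = 3000.0
omega = rpm * 2 * np.pi / 60.0            # shaft speed [rad/s]
radius = 0.2                              # blade tip radius [m]
probe_angles = np.deg2rad([0.0, 45.0, 120.0, 250.0])
n_revs = 50

# Steady-state blade vibration from a spring-mass-damper model:
# synchronous response at engine order EO with amplitude amp and phase phi.
EO, amp, phi = 4, 0.5e-3, 0.3             # [-], [m], [rad]

def tip_displacement(t):
    return amp * np.sin(EO * omega * t + phi)

arrival_times = []
for rev in range(n_revs):
    for theta in probe_angles:
        t_nominal = (rev * 2 * np.pi + theta) / omega
        # The vibrating tip arrives early or late in proportion to its
        # tangential displacement at the nominal passing instant.
        dt = tip_displacement(t_nominal) / (radius * omega)
        arrival_times.append(t_nominal + dt)

arrival_times = np.array(arrival_times)
nominal = np.array(
    [(rev * 2 * np.pi + theta) / omega for rev in range(n_revs) for theta in probe_angles]
)
deviation = arrival_times - nominal
# These time deviations (times radius*omega) are the displacement samples a
# BTT analysis algorithm would try to fit for frequency and amplitude.
print(f"peak measured tip displacement [mm]: {1e3 * (deviation * radius * omega).max():.4f}")
```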

Keywords: blade tip timing, blisk, finite element, vibration measurement

Procedia PDF Downloads 284
1960 Investigation of Different Machine Learning Algorithms in Large-Scale Land Cover Mapping within the Google Earth Engine

Authors: Amin Naboureh, Ainong Li, Jinhu Bian, Guangbin Lei, Hamid Ebrahimy

Abstract:

Large-scale land cover mapping has become a new challenge in the land change and remote sensing fields because it involves a large volume of data. Moreover, selecting the right classification method, especially when there are different types of landscape in the study area, is quite difficult. This paper compares the performance of different machine learning (ML) algorithms for generating a land cover map of the China-Central Asia-West Asia Corridor, which is considered one of the main parts of the Belt and Road Initiative (BRI). The cloud-based Google Earth Engine (GEE) platform was used to generate a land cover map of the study area from Landsat-8 images (2017) by applying three frequently used ML algorithms: random forest (RF), support vector machine (SVM), and artificial neural network (ANN). The selected ML algorithms (RF, SVM, and ANN) were trained and tested using reference data obtained from the MODIS yearly land cover product and very high-resolution satellite images. The findings of the study illustrate that, among the three frequently used ML algorithms, RF, with 91% overall accuracy, produced the best land cover map for the China-Central Asia-West Asia Corridor, whereas ANN showed the worst result, with 85% overall accuracy. The strong performance of GEE in applying different ML algorithms and handling the huge volume of remotely sensed data in the present study shows that it could also help researchers generate reliable long-term land cover change maps. The findings of this research are of great importance for decision-makers and the BRI's authorities in strategic land use planning.
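
The paper runs RF, SVM, and ANN inside Google Earth Engine; since that workflow is not reproduced here, the sketch below shows the same three-way comparison with scikit-learn on synthetic spectral features, purely as an illustration of the overall-accuracy comparison.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical training data: Landsat-8-like spectral features (7 bands)
# and land-cover class labels taken from a reference product.
n_samples, n_bands, n_classes = 3000, 7, 6
X = rng.random((n_samples, n_bands))
y = (X[:, :3].sum(axis=1) * n_classes / 3).astype(int).clip(0, n_classes - 1)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

classifiers = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf", C=10.0),
    "ANN": MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0),
}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    oa = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: overall accuracy = {oa:.3f}")
```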

Keywords: land cover, google earth engine, machine learning, remote sensing

Procedia PDF Downloads 91
1959 Automatic Queuing Model Applications

Authors: Fahad Suleiman

Abstract:

Queuing, in a medical system, is the process of moving patients in a specific sequence to a specific service according to the nature of their illness. The term scheduling stands for the process of computing a schedule, which may be done by a queuing-based scheduler. This paper focuses on the medical consultancy system, the different queuing algorithms that are used in healthcare systems to serve patients, and the average waiting time. The aim of this paper is to build an automatic queuing system for organizing the medical queue that can analyse the queue status and decide which patient to serve. The new queuing architecture model can switch between different scheduling algorithms according to the testing results and the average waiting time. The main innovation of this work is that the average waiting time is modeled and taken into account in processing, together with the process of switching to the scheduling algorithm that gives the best average waiting time.
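
A minimal sketch of the switching idea, under illustrative assumptions: two candidate scheduling rules (FIFO and shortest-service-first, used here only as examples) are evaluated on the current queue, and the one with the lowest average waiting time is selected.

```python
import random

random.seed(0)

# Hypothetical patient queue: (arrival_order, expected_service_minutes).
patients = [(i, random.randint(5, 40)) for i in range(30)]

def average_waiting_time(order):
    # The waiting time of each patient is the sum of service times before them.
    waits, elapsed = [], 0
    for _, service in order:
        waits.append(elapsed)
        elapsed += service
    return sum(waits) / len(waits)

def fifo(queue):
    return list(queue)

def shortest_service_first(queue):
    return sorted(queue, key=lambda p: p[1])

# The scheduler evaluates the candidate algorithms on the current queue and
# switches to the one giving the lowest average waiting time.
candidates = {"FIFO": fifo, "SSF": shortest_service_first}
results = {name: average_waiting_time(rule(patients)) for name, rule in candidates.items()}
chosen = min(results, key=results.get)
print(results, "-> selected:", chosen)
```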

Keywords: queuing systems, queuing system models, scheduling algorithms, patients

Procedia PDF Downloads 323
1958 Improved Multi–Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation

Authors: Shonak Bansal, Prince Jain, Arun Kumar Singh, Neena Gupta

Abstract:

Recently, nature-inspired algorithms have come into widespread use for tough and time-consuming multi-objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One of the major applications of OGRs is as an unequally spaced channel-allocation algorithm in optical wavelength division multiplexing (WDM) systems, in order to minimize the adverse four-wave mixing (FWM) crosstalk effect. The simulation results show that the proposed optimization algorithm has superior performance compared to existing conventional computing and nature-inspired optimization algorithms for finding OGRs, in terms of ruler length, total optical channel bandwidth, and computation time.
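
For context, the sketch below shows the core OGR ingredients, checking that all pairwise mark differences are distinct and minimizing the ruler length, using a crude random search in place of the improved firefly moves; it is illustrative only and does not reflect the proposed algorithm's performance.

```python
import itertools
import random

random.seed(0)

def is_golomb(marks):
    # A ruler is Golomb if all pairwise differences between marks are distinct.
    diffs = [b - a for a, b in itertools.combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

def ruler_length(marks):
    return max(marks) - min(marks)

def random_ruler(order, span):
    # Sample candidate mark sets until a valid Golomb ruler of the given order appears.
    while True:
        marks = sorted(random.sample(range(span + 1), order))
        if is_golomb(marks):
            return marks

# Crude random search standing in for the firefly moves: keep the shortest
# valid ruler found.  (The known optimal length for order 5 is 11.)
best = None
for _ in range(5000):
    candidate = random_ruler(5, 20)
    if best is None or ruler_length(candidate) < ruler_length(best):
        best = candidate
print("best ruler found:", best, "length:", ruler_length(best))
```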

Keywords: channel allocation, conventional computing, four–wave mixing, nature–inspired algorithm, optimal Golomb ruler, lévy flight distribution, optimization, improved multi–objective firefly algorithms, Pareto optimal

Procedia PDF Downloads 285
1957 Analytical Comparison of Conventional Algorithms with Vedic Algorithm for Digital Multiplier

Authors: Akhilesh G. Naik, Dipankar Pal

Abstract:

In today's scenario, the complexity of digital signal processing (DSP) applications and of various microcontroller architectures has increased to such an extent that the traditional approaches to multiplier design in most processors are becoming outdated for being comparatively slow. Modern processing applications require suitable pipelined approaches, and therefore algorithms that are friendlier to pipelined architectures. Traditional algorithms such as the Wallace Tree, Radix-4 Booth, Radix-8 Booth, and Dadda architectures have proven to be comparatively slow for pipelined architectures. These architectures therefore need to be optimized, or combined with other architectures, to enhance their performance and make them suitable for pipelined hardware. Recently, the Vedic algorithm has proven to be mathematically efficient, being less complex and requiring fewer steps to establish its output, and has assumed renewed importance. This paper describes and shows how the Vedic algorithm can be better suited to pipelined architectures and can also be combined with traditional architectures and algorithms to enhance its ability even further. We also establish that for complex applications on DSP and other microcontroller architectures, using the Vedic approach for multiplication proves to be the best available and most efficient option.
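
A plain-Python sketch of the "vertically and crosswise" (Urdhva Tiryagbhyam) pattern that underlies Vedic multipliers is given below; the column structure is what maps naturally onto parallel or pipelined hardware stages, although this software version only illustrates the digit flow, not the paper's hardware design.

```python
def urdhva_tiryagbhyam(x, y):
    """Multiply two non-negative integers digit by digit using the Vedic
    'vertically and crosswise' (Urdhva Tiryagbhyam) pattern."""
    a = [int(d) for d in str(x)][::-1]   # least-significant digit first
    b = [int(d) for d in str(y)][::-1]
    # Column k collects all cross products a[i]*b[j] with i + j == k;
    # in hardware each column maps to one stage of the multiplier array.
    columns = [0] * (len(a) + len(b) - 1)
    for i, da in enumerate(a):
        for j, db in enumerate(b):
            columns[i + j] += da * db
    # Propagate carries from the least significant column upwards.
    result, carry = [], 0
    for col in columns:
        total = col + carry
        result.append(total % 10)
        carry = total // 10
    while carry:
        result.append(carry % 10)
        carry //= 10
    return int("".join(str(d) for d in reversed(result)))

assert urdhva_tiryagbhyam(4321, 867) == 4321 * 867
print(urdhva_tiryagbhyam(4321, 867))
```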

Keywords: Wallace Tree, Radix-4 Booth, Radix-8 Booth, Dadda, Vedic, Single-Stage Karatsuba (SSK), Looped Karatsuba (LK)

Procedia PDF Downloads 141
1956 Grey Wolf Optimization Technique for Predictive Analysis of Products in E-Commerce: An Adaptive Approach

Authors: Shital Suresh Borse, Vijayalaxmi Kadroli

Abstract:

E-commerce industries nowadays implement the latest AI and ML techniques to improve their performance and prediction accuracy, which helps them gain large profits from the online market. Ant colony optimization, genetic algorithms, particle swarm optimization, neural networks, and grey wolf optimization (GWO) help many e-commerce industries upgrade their predictive performance. These algorithms provide optimal results in various applications, such as stock price prediction, prediction of drug-target interactions, and prediction of user ratings of similar products on e-commerce sites. In this study, customer reviews play an important role in the prediction analysis: people show much interest in buying services and products suggested by other customers, which ultimately increases net profit. In this work, a convolutional neural network (CNN) is proposed, which is further used to optimize the prediction accuracy of an e-commerce website. In this method, the CNN is used to optimize the hyperparameters of the GWO algorithm using an appropriate coding scheme. The model results are verified by comparing them to PSO results whose hyperparameters have been optimized by the CNN on Amazon's customer review dataset. The experimental outcome shows that the proposed system using the GWO algorithm achieves superior performance in terms of accuracy, precision, recall, etc. in prediction analysis compared to existing systems.

Keywords: prediction analysis, e-commerce, machine learning, grey wolf optimization, particle swarm optimization, CNN

Procedia PDF Downloads 83
1955 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem

Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee

Abstract:

Weapon-target assignment (WTA) is a problem of assigning available launchers to appropriate targets in order to defend assets. Various algorithms for WTA have been developed over the past years for both static and dynamic environments (denoted SWTA and DWTA, respectively). Because the problem must be solved within a relevant computational time, WTA has suffered from poor solution efficiency; as a result, SWTA and DWTA problems have been solved only for limited battlefield situations. In this paper, the general situation under continuous time is considered through the Time-based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy. Although the TWTA optimization model works inefficiently for large problem sizes, the decomposed opt-opt algorithm, based on linearization and decomposition, extracts efficient solutions in a reasonable computation time. Because the computation time of the scheduling part is too long for the optimization model, several greedy-based algorithms are proposed; these show lower performance values than the decomposed opt-opt algorithm but require very little computation time. Hence, this paper proposes an improved method by applying decomposition to TWTA, so that more practical and effective methods can be developed for using TWTA on the battlefield.
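
A minimal sketch of the greedy heuristic, under illustrative data: each weapon is assigned in turn to the target whose expected surviving value it reduces the most. The target values and kill probabilities are made up for the example, and the time dimension of TWTA is omitted.

```python
# Greedy assignment for a static WTA instance: each weapon is assigned, one at
# a time, to the target whose expected surviving value it reduces the most.
values = [10.0, 6.0, 8.0]                        # value of each target
p_kill = [                                       # p_kill[w][t]
    [0.7, 0.2, 0.4],
    [0.3, 0.8, 0.1],
    [0.5, 0.5, 0.5],
    [0.2, 0.6, 0.7],
]

survival = values[:]                             # expected surviving value per target
assignment = []
for w, probs in enumerate(p_kill):
    # Marginal gain of firing weapon w at target t.
    gains = [survival[t] * probs[t] for t in range(len(values))]
    t_best = max(range(len(values)), key=lambda t: gains[t])
    assignment.append((w, t_best))
    survival[t_best] *= (1.0 - probs[t_best])

print("assignment (weapon, target):", assignment)
print("total expected surviving value:", round(sum(survival), 3))
```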

Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research

Procedia PDF Downloads 310
1954 Analysis of Veterinary Drug Residues and Pesticide Residues in Beehive Products

Authors: Alba Luna Jimenez, Maria Dolores Hernando

Abstract:

The administration of veterinary treatments at doses higher than those recommended for Varroa mite control has the potential to generate residues in the honeybee colony matrices and in the derived products for consumption. Honeybee colonies can also be indirectly exposed to residues of plant protection products when foraging in crops, in wildflowers near the crops, or in urban gardens just after spraying. The study evaluates the presence of both types of residues, from veterinary treatments and from pesticides, in beeswax, bee bread, and honey. The study was carried out in apiaries located in agricultural zones and forest areas in Andalusia, Spain. Up to nineteen residues were identified above the LOQ using gas chromatography-triple quadrupole-mass spectrometry analysis (GC-MS/MS). Samples were extracted by a modified QuEChERS method. Chlorfenvinphos was detected in beeswax and bee bread although its use is not authorized for Varroa mite control. Residues of fluvalinate-tau, authorized as a veterinary treatment, were detected in most of the beeswax and bee bread samples, presumably due to overdosing or to its potential for accumulation associated with its marked liposolubility. Residues of plant protection products were also detected in beeswax and bee bread samples. Pesticide residues were detected above the LOQ, which was established at 5 µg.kg⁻¹, the minimum concentration that can be quantified with acceptable accuracy and precision, as described in the European guidelines for pesticide residue analysis SANTE/11945/2015. No residues of phytosanitary treatments used in agriculture were detected in honey.

Keywords: honeybee colony, mass spectrometry analysis, pesticide residues, Varroa destructor, veterinary treatment

Procedia PDF Downloads 125
1953 Image Segmentation Techniques: Review

Authors: Lindani Mbatha, Suvendi Rimer, Mpho Gololo

Abstract:

Image segmentation is the process of dividing an image into several sections, such as the background and the foreground objects. It is a critical technique in both image processing tasks and computer vision. Most image segmentation algorithms have been developed for gray-scale images, and comparatively little research and few algorithms have addressed color images. Most image segmentation algorithms or techniques vary based on the input data and the application, and nearly all of them are unsuitable for noisy environments. Much of the existing work uses the Markov Random Field (MRF), which is computationally demanding but said to be robust to noise. In recent years, image segmentation has been applied to problems such as easing the processing of an image, interpreting the contents of an image, and simplifying image analysis. This article reviews and summarizes some of the image segmentation techniques and algorithms that have been developed in past years. The techniques include convolutional neural networks (CNNs), edge-based techniques, region growing, clustering, and thresholding, among others. The advantages and disadvantages of medical ultrasound image segmentation techniques are also discussed. The article also addresses the applications and potential future developments around image segmentation. This review concludes that no technique is perfectly suitable for segmenting all the different types of images, but the use of hybrid techniques yields more accurate and efficient results.

Keywords: clustering-based, convolution-network, edge-based, region-growing

Procedia PDF Downloads 55
1952 THz Phase Extraction Algorithms for a THz Modulating Interferometric Doppler Radar

Authors: Shaolin Allen Liao, Hual-Te Chien

Abstract:

Various THz phase extraction algorithms have been developed for a novel THz Modulating Interferometric Doppler Radar (THz-MIDR) recently developed by the author. The THz-MIDR differs from the well-known FTIR technique in that it introduces a continuously modulating reference branch, compared to the time-consuming discrete stepping reference branch of FTIR. This change allows real-time tracking of a moving object and capturing of its Doppler signature. The working principle of the THz-MIDR is similar to that of FTIR: the incoming THz emission from the scene is split by a beam splitter/combiner; one of the beams is continuously modulated by a vibrating mirror or phase modulator, and the other split beam is reflected by a mirror; finally, both the modulated reference beam and the reflected beam are combined by the same beam splitter/combiner and detected by a THz intensity detector (for example, a pyroelectric detector). In order to extract the THz phase from the single intensity measurement signal, we have derived rigorous mathematical formulas for three Frequency Banded (FB) signals: 1) the DC Low-Frequency Banded (LFB) signal; 2) the Fundamental Frequency Banded (FFB) signal; and 3) the Harmonic Frequency Banded (HFB) signal. THz phase extraction algorithms are then developed based on combinations of two or all three of these FB signals, together with efficient methods such as the Levenberg-Marquardt nonlinear fitting algorithm. Numerical simulations have also been performed in MATLAB with simulated THz-MIDR interferometric signals at various signal-to-noise ratios (SNR) to verify the algorithms.
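
As an illustration of the fitting step, the sketch below recovers a phase from a simulated intensity signal with a sinusoidally modulated reference using SciPy's curve_fit (which defaults to Levenberg-Marquardt for unbounded problems); the signal model and parameter values are simplified assumptions, not the paper's derived FB formulas.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Illustrative interferometer model (not the paper's exact formulas):
# detected intensity with a sinusoidally modulated reference phase.
f_mod, depth = 50.0, 2.0                         # modulation frequency [Hz], depth [rad]
t = np.linspace(0.0, 0.2, 2000)

def intensity(t, offset, contrast, thz_phase):
    return offset + contrast * np.cos(thz_phase + depth * np.sin(2 * np.pi * f_mod * t))

true_phase = 0.8                                  # [rad] the quantity to recover
signal = intensity(t, 1.0, 0.4, true_phase) + 0.02 * rng.standard_normal(t.size)

# curve_fit uses the Levenberg-Marquardt algorithm for unbounded problems.
popt, _ = curve_fit(intensity, t, signal, p0=[0.9, 0.3, 0.2])
print(f"recovered THz phase: {popt[2]:.3f} rad (true: {true_phase} rad)")
```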

Keywords: algorithm, modulation, THz phase, THz interferometry doppler radar

Procedia PDF Downloads 300
1951 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data

Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan

Abstract:

Clinical data analysis and forecasting have made great contributions to disease control, prevention, and detection. However, such data usually suffer from highly unbalanced class distributions. In this paper, we target binary imbalanced datasets, in which the positive samples form only a minority. We investigate two different meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine both of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole; the other is to split up the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former method do not scale to larger data sizes, the latter, which we call the Adaptive Swarm Balancing Algorithm, leads to significant efficiency and effectiveness improvements on large datasets. We also find it more consistent with the practice of typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and shortening the running time compared with the brute-force method.
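
A minimal sketch of the SMOTE step is shown below using imbalanced-learn; the two key parameters the paper tunes with swarm algorithms (the oversampling amount and the number of nearest neighbours) are swept over a small grid here instead of being optimized by PSO or the bat algorithm, and the dataset is synthetic.

```python
from collections import Counter

from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Hypothetical imbalanced clinical dataset: roughly 5% positive cases.
X, y = make_classification(n_samples=4000, n_features=20, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
print("class counts before SMOTE:", Counter(y_tr))

# The two SMOTE parameters the swarm algorithms tune are the amount of
# oversampling and the number of nearest neighbours; here we simply try a few
# values instead of running PSO or the bat algorithm.
best = None
for ratio in (0.3, 0.5, 1.0):
    for k in (3, 5, 7):
        smote = SMOTE(sampling_strategy=ratio, k_neighbors=k, random_state=0)
        X_res, y_res = smote.fit_resample(X_tr, y_tr)
        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_res, y_res)
        score = f1_score(y_te, clf.predict(X_te))
        if best is None or score > best[0]:
            best = (score, ratio, k)

print("best (F1, ratio, k_neighbors):", best)
```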

Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data

Procedia PDF Downloads 413
1950 Approach Based on Fuzzy C-Means for Band Selection in Hyperspectral Images

Authors: Diego Saqui, José H. Saito, José R. Campos, Lúcio A. de C. Jorge

Abstract:

Hyperspectral images and remote sensing are important for many applications. A problem in the use of these images is the high volume of data to be processed, stored, and transferred. Dimensionality reduction techniques can be used to reduce this volume of data. In this paper, an approach to band selection based on clustering algorithms is presented; this approach makes it possible to reduce the volume of data. The proposed structure is based on the Fuzzy C-Means (or K-Means) and NWHFC algorithms. New attributes relative to other studies in the literature, such as kurtosis and low correlation, are also considered. A comparison of the results of the approach using Fuzzy C-Means and K-Means with different attributes is performed. Both algorithms show similarly good results, particularly when the variance and kurtosis attributes are used in the clustering process, and are applicable to hyperspectral images.
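
A hedged sketch of the band-selection idea follows: per-band attributes (variance, kurtosis, mean absolute correlation) are clustered with K-Means, and one representative band is kept per cluster. The hyperspectral cube is synthetic, and the NWHFC step is omitted.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical hyperspectral cube: rows x cols x bands.
rows, cols, n_bands = 50, 50, 60
cube = rng.random((rows, cols, n_bands))
pixels = cube.reshape(-1, n_bands)                    # pixels x bands

# Per-band attributes: variance, kurtosis, and mean absolute correlation
# with the other bands (low correlation -> more distinctive band).
corr = np.abs(np.corrcoef(pixels, rowvar=False))
attributes = np.column_stack([
    pixels.var(axis=0),
    kurtosis(pixels, axis=0),
    (corr.sum(axis=0) - 1.0) / (n_bands - 1),
])

# Cluster the bands in attribute space and keep one representative per cluster
# (the band closest to its cluster centre), reducing the data volume.
n_selected = 10
scaled = StandardScaler().fit_transform(attributes)
km = KMeans(n_clusters=n_selected, n_init=10, random_state=0).fit(scaled)
selected = []
for c in range(n_selected):
    dists = np.linalg.norm(scaled - km.cluster_centers_[c], axis=1)
    selected.append(int(np.argmin(np.where(km.labels_ == c, dists, np.inf))))
print("selected bands:", sorted(selected))
```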

Keywords: band selection, fuzzy c-means, k-means, hyperspectral image

Procedia PDF Downloads 369
1949 A Hybrid Distributed Algorithm for Multi-Objective Dynamic Flexible Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a hybrid distributed algorithm is suggested for the multi-objective dynamic flexible job shop scheduling problem. The proposed algorithm is high-level, in that several algorithms search the space on different machines simultaneously; it is also a hybrid algorithm that takes advantage of artificial intelligence, evolutionary, and optimization methods. Distribution is done at different levels, and new approaches are used in the design of the algorithm. The Apache Spark and Hadoop frameworks are used for the distribution of the algorithm. The Pareto optimality approach is used for solving the multi-objective benchmarks. The suggested algorithm, which is able to solve large-size problems in short times, is compared with successful algorithms from the literature. The results demonstrate the high speed and efficiency of the algorithm.

Keywords: distributed algorithms, Apache Spark, Hadoop, flexible dynamic job shop scheduling, multi-objective optimization

Procedia PDF Downloads 316
1948 A Practical Approach Towards Disinfection Challenges in Sterile Manufacturing Area

Authors: Doris Lacej, Eni Bushi

Abstract:

Cleaning and disinfection procedures are essential for maintaining the cleanliness of the pharmaceutical manufacturing environment, particularly of the cleanrooms and the sterile unit area. The Good Manufacturing Practice (GMP) Annex 1 recommendation strictly requires the implementation of standard and validated cleaning and disinfection protocols. However, environmental monitoring has shown that even a validated cleaning method with certified agents may result in the presence of atypical microorganism colonies that exceed the GMP limits for a specific cleanroom area. In response to this issue, this case study aims to identify the root cause of the microbial contamination observed in the sterile production environment of the Profarma pharmaceutical company in Albania by applying a practical problem-solving approach that ensures the appropriate sterility grade. The guidelines and literature emphasize the importance of several factors in the prevention of possible microbial contamination occurring in grade A and C cleanrooms. These factors are integrated into a practical framework to identify the root cause of the presence of an Aspergillus niger colony in the sterile production environment. In addition, the application of a semi-automatic disinfecting system such as H2O2 FOG in sterile grade A and grade C cleanrooms has been an effective solution for eliminating the atypical Aspergillus niger colony. Selecting the appropriate detergents and disinfectants at the right concentration, frequency, and combination; the presence of updated and standardized guidelines for cleaning and disinfection; and continuous training of operators on these practices in accordance with the updated GMP guidelines are some of the identified factors that influence the success of achieving the required sterility grade. However, to ensure environmental sustainability, it is important to be prepared to identify the source of contamination and make the appropriate decision. The proposed case-based practical approach may help pharmaceutical companies achieve sterile production and environmental cleanliness sustainability in challenging situations. Apart from integrating valid agents and standardized cleaning and disinfection protocols according to GMP Annex 1, pharmaceutical companies must carefully investigate the source and all the steps that can influence the outcome of an abnormal situation. Subsequently, apart from identifying the root cause, it is important to solve the problem with a successful alternative approach.

Keywords: cleanrooms, disinfectants, environmental monitoring, GMP Annex 1

Procedia PDF Downloads 184
1947 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm

Authors: Anuradha Chug, Sunali Gandhi

Abstract:

Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Due to insufficient resources to re-execute all test cases in a time-constrained environment, efforts are ongoing to generate test data automatically without human effort. Many search-based techniques have been proposed to generate efficient, effective, and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural food-searching behavior of bats, the current study employs a meta-heuristic, search-based bat algorithm for optimizing the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied that can effectively filter out redundant test data. As many as 50 Java programs are used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when its results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing Algorithm (HC), and Ant Colony Optimization (ACO). The output of this study will be useful to testers, as they can achieve 100% path coverage for testing with a minimum number of test cases.
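
A bare-bones version of the bat algorithm applied to a toy test-data-generation objective is sketched below; the branch-distance-style fitness, parameter ranges, and fixed loudness/pulse-rate schedules are illustrative assumptions rather than the study's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Placeholder fitness: a branch-distance-style function whose minimum (0)
    # corresponds to test inputs that drive execution down a wanted path.
    return abs(x[0] - 37.0) + abs(x[1] * x[0] - 111.0)

# Basic bat algorithm with fixed loudness and pulse-rate schedules.
n_bats, dims, iters = 25, 2, 300
f_min, f_max = 0.0, 2.0
loudness, pulse_rate = 0.5, 0.5

pos = rng.uniform(-100, 100, size=(n_bats, dims))
vel = np.zeros_like(pos)
fitness = np.array([objective(x) for x in pos])
best = pos[fitness.argmin()].copy()

for _ in range(iters):
    for i in range(n_bats):
        freq = f_min + (f_max - f_min) * rng.random()
        vel[i] += (pos[i] - best) * freq
        candidate = pos[i] + vel[i]
        if rng.random() > pulse_rate:
            # Local random walk around the current best solution.
            candidate = best + 0.01 * rng.standard_normal(dims)
        cand_fit = objective(candidate)
        if cand_fit < fitness[i] and rng.random() < loudness:
            pos[i], fitness[i] = candidate, cand_fit
        if cand_fit < objective(best):
            best = candidate.copy()

print("best test inputs:", best.round(3), "fitness:", round(float(objective(best)), 4))
```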

Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm

Procedia PDF Downloads 336
1946 Performance Analysis and Multi-Objective Optimization of a Kalina Cycle for Low-Temperature Applications

Authors: Sadegh Sadeghi, Negar Shabani

Abstract:

From a thermal point of view, zeotropic mixtures are likely to be more efficient than azeotropic fluids in low-temperature thermodynamic cycles due to their suitable boiling characteristics. In this study, the performance of a low-temperature Kalina cycle with an R717/water working fluid, as used in several existing power plants, is mathematically investigated. To analyze the behavior of the cycle, mass conservation, energy conservation, and exergy balance equations are presented. Given the similarity of the molar masses of R717 (17.03 g/mol) and water (18.01 g/mol), there is no need to alter the size of Kalina system components such as the turbine and pump. To optimize the cycle's energy and exergy efficiencies simultaneously, a constrained multi-objective optimization is carried out using an Artificial Bee Colony algorithm. The main motivation behind using this algorithm lies in its robustness, reliability, remarkable precision, and high-speed convergence rate in dealing with complicated constrained multi-objective problems. Convergence rates of the algorithm for calculating the optimal energy and exergy efficiencies are presented. Subsequently, due to the importance of the exergy concept in Kalina cycles, the exergy destruction occurring in the components is computed. Finally, the impacts of pressure, temperature, mass fraction, and mass flow rate on the energy and exergy efficiencies are studied in detail.

Keywords: artificial bee colony algorithm, binary zeotropic mixture, constrained multi-objective optimization, energy efficiency, exergy efficiency, Kalina cycle

Procedia PDF Downloads 120
1945 Opportunities for Precision Feed in Apiculture

Authors: John Michael Russo

Abstract:

Honeybees are important to our food system and continue to suffer from high rates of colony loss. Precision feed has brought many benefits to livestock cultivation and these should transfer to apiculture. However, apiculture has unique challenges. The objective of this research is to understand how principles of precision agriculture, applied to apiculture and feed specifically, might effectively improve state-of-the-art cultivation. The methodology surveys apicultural practice to build a model for assessment. First, a review of apicultural motivators is made. Feed method is then evaluated. Finally, precision feed methods are examined as accelerants with potential to advance the effectiveness of feed practice. Six important motivators emerge: colony loss, disease, climate change, site variance, operational costs, and competition. Feed practice itself is used to compensate for environmental variables. The research finds that the current state-of-the-art in apiculture feed focuses on critical challenges in the management of feed schedules which satisfy requirements of the bees, preserve potency, optimize environmental variables, and manage costs. Many of the challenges are most acute when feed is used to dispense medication. Technology such as RNA treatments have even more rigorous demands. Precision feed solutions focus on strategies which accommodate specific needs of individual livestock. A major component is data; they integrate precise data with methods that respond to individual needs. There is enormous opportunity for precision feed to improve apiculture through the integration of precision data with policies to translate data into optimized action in the apiary, particularly through automation.

Keywords: precision agriculture, precision feed, apiculture, honeybees

Procedia PDF Downloads 49
1944 Enhancing Precision Agriculture through Object Detection Algorithms: A Study of YOLOv5 and YOLOv8 in Detecting Armillaria spp.

Authors: Christos Chaschatzis, Chrysoula Karaiskou, Pantelis Angelidis, Sotirios K. Goudos, Igor Kotsiuba, Panagiotis Sarigiannidis

Abstract:

Over the past few decades, the rapid growth of the global population has led to the need to increase agricultural production and improve the quality of agricultural goods. There is a growing focus on environmentally friendly solutions, sustainable production, and biologically minimally fertilized products in contemporary society. Precision agriculture has the potential to incorporate a wide range of innovative solutions with the development of machine learning algorithms. YOLOv5 and YOLOv8 are two of the most advanced object detection algorithms, capable of accurately recognizing objects in real time. Detecting tree diseases is crucial for improving food production and ensuring sustainability. This research aims to evaluate the efficacy of YOLOv5 and YOLOv8 in detecting the symptoms of Armillaria spp. in sweet cherry trees and determining their health status, with the goal of enhancing the robustness of precision agriculture. Additionally, this study explores Computer Vision (CV) techniques together with machine learning algorithms to improve the efficiency of the detection process.

Keywords: Armillaria spp., machine learning, precision agriculture, smart farming, sweet cherry trees, YOLOv5, YOLOv8

Procedia PDF Downloads 80
1943 Discriminant Analysis as a Function of Predictive Learning to Select Evolutionary Algorithms in Intelligent Transportation System

Authors: Jorge A. Ruiz-Vanoye, Ocotlán Díaz-Parra, Alejandro Fuentes-Penna, Daniel Vélez-Díaz, Edith Olaco García

Abstract:

In this paper, we present the use of discriminant analysis to select the evolutionary algorithms that best solve instances of the vehicle routing problem with time windows. We use instance indicators as independent variables to obtain the classification criteria, and the best algorithm among the generic genetic algorithm (GA), random search (RS), steady-state genetic algorithm (SSGA), and sexual genetic algorithm (SXGA) as the dependent variable for the classification. The discriminant classifier was trained with classic instances of the vehicle routing problem with time windows obtained from the Solomon benchmark. We obtained a discriminant-analysis classification accuracy of 66.7%.
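
A minimal sketch of the selection step with scikit-learn's LinearDiscriminantAnalysis is given below; the instance indicators and best-algorithm labels are synthetic stand-ins for the Solomon-benchmark data used in the paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical instance indicators for vehicle-routing-with-time-windows
# instances (e.g. number of customers, time-window tightness, demand spread)
# and the label of the algorithm (GA, RS, SSGA, SXGA) that solved each best.
n_instances, n_indicators = 168, 6
X = rng.standard_normal((n_instances, n_indicators))
labels = np.array(["GA", "RS", "SSGA", "SXGA"])
y = labels[(X[:, 0] > 0).astype(int) + 2 * (X[:, 1] > 0).astype(int)]

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=5).mean()
print(f"discriminant-analysis classification accuracy: {accuracy:.1%}")

# For a new instance, predict which evolutionary algorithm to run.
lda.fit(X, y)
new_instance = rng.standard_normal((1, n_indicators))
print("recommended algorithm:", lda.predict(new_instance)[0])
```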

Keywords: Intelligent Transportation Systems, data-mining techniques, evolutionary algorithms, discriminant analysis, machine learning

Procedia PDF Downloads 434
1942 Diffusion Adaptation Strategies for Distributed Estimation Based on the Family of Affine Projection Algorithms

Authors: Mohammad Shams Esfand Abadi, Mohammad Ranjbar, Reza Ebrahimpour

Abstract:

This work presents a solution to the distributed estimation problem in a diffusion network based on the adapt-then-combine (ATC) and combine-then-adapt (CTA) selective partial update normalized least mean squares (SPU-NLMS) algorithms. We also extend this approach to the dynamic selection affine projection algorithm (DS-APA), establishing ATC-DS-APA and CTA-DS-APA. The purpose of the ATC-SPU-NLMS and CTA-SPU-NLMS algorithms is to reduce the computational complexity by updating only selected blocks of the weight coefficients at every iteration. In CTA-DS-APA and ATC-DS-APA, the number of input vectors is selected dynamically. Diffusion cooperation strategies based on these algorithms have been shown to provide good performance. The good performance of the introduced algorithms is illustrated with various experimental results.
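
For orientation, the sketch below implements plain ATC diffusion NLMS over a small ring network; the selective-partial-update and dynamic-selection refinements described in the paper are omitted, and the network size, step size, and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Network of N nodes estimating a common parameter vector w_o of length M.
N, M, iters, mu, eps = 6, 8, 2000, 0.5, 1e-6
w_o = rng.standard_normal(M)

# Ring topology with self-loops; uniform combination weights over each
# node's neighbourhood.
A = np.zeros((N, N))
for k in range(N):
    for n in (k - 1, k, (k + 1) % N):
        A[k, n % N] = 1.0
A /= A.sum(axis=1, keepdims=True)

W = np.zeros((N, M))                 # current estimate at each node
for _ in range(iters):
    psi = np.zeros_like(W)
    # Adapt step: each node runs one NLMS update on its own measurement.
    for k in range(N):
        u = rng.standard_normal(M)                       # regressor at node k
        d = u @ w_o + 0.05 * rng.standard_normal()       # noisy desired signal
        e = d - u @ W[k]
        psi[k] = W[k] + mu * e * u / (eps + u @ u)
    # Combine step (ATC): each node averages its neighbours' intermediate estimates.
    W = A @ psi

msd = np.mean(np.sum((W - w_o) ** 2, axis=1))
print(f"network mean-square deviation after {iters} iterations: {msd:.2e}")
```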

Keywords: selective partial update, affine projection, dynamic selection, diffusion, adaptive distributed networks

Procedia PDF Downloads 671