Search results for: precise time domain expanding algorithm
21254 Tool for Metadata Extraction and Content Packaging as Endorsed in OAIS Framework
Authors: Payal Abichandani, Rishi Prakash, Paras Nath Barwal, B. K. Murthy
Abstract:
Information generated from various computerization processes is a potentially rich source of knowledge for its designated community. Passing this information from generation to generation without modifying its meaning is a challenging activity. To preserve and archive data for future generations, it is essential to prove the authenticity of the data. This can be achieved by extracting metadata from the data, which can prove authenticity and create trust in the archived data. A subsequent challenge is technology obsolescence. Metadata extraction and standardization can be used effectively to resolve and tackle this problem. Metadata can broadly be categorized at two levels, i.e., technical and domain level. Technical metadata provides information that can be used to understand and interpret the data record, but this level of metadata alone is not sufficient to create trustworthiness. We have developed a tool which extracts and standardizes technical as well as domain-level metadata. This paper describes the different features of the tool and how we have developed it.
Keywords: digital preservation, metadata, OAIS, PDI, XML
Procedia PDF Downloads 393
21253 Modification of RK Equation of State for Liquid and Vapor of Ammonia by Genetic Algorithm
Authors: S. Mousavian, F. Mousavian, V. Nikkhah Rashidabad
Abstract:
Cubic equations of state like the Redlich–Kwong (RK) EOS have proved to be very reliable tools for the prediction of phase behavior. Despite their good performance in compositional calculations, they usually suffer from weaknesses in the prediction of saturated liquid density. In this research, the RK equation was modified by means of a genetic algorithm. The results of this study show that the modified equation is in good agreement with experimental data.
Keywords: equation of state, modification, ammonia, genetic algorithm
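The abstract does not give the modified form of the equation. Purely as orientation, the sketch below shows the standard Redlich–Kwong pressure expression and a simple residual objective that a genetic algorithm could minimize against experimental data; the adjustable multipliers and the data points are placeholders, not the authors' modification.

```python
# Sketch only: standard Redlich-Kwong EOS and a fitting objective a GA could
# minimize. Experimental data and the adjustable multipliers (ka, kb) are
# hypothetical placeholders, not the authors' modified equation.
import math

R = 8.314  # J/(mol*K)

def rk_pressure(T, Vm, Tc, Pc, ka=1.0, kb=1.0):
    """Redlich-Kwong pressure; ka, kb scale the classical a, b constants."""
    a = ka * 0.42748 * R**2 * Tc**2.5 / Pc
    b = kb * 0.08664 * R * Tc / Pc
    return R * T / (Vm - b) - a / (math.sqrt(T) * Vm * (Vm + b))

def objective(params, data, Tc, Pc):
    """Sum of squared pressure residuals over (T, Vm, P_exp) points."""
    ka, kb = params
    return sum((rk_pressure(T, Vm, Tc, Pc, ka, kb) - P_exp) ** 2
               for T, Vm, P_exp in data)

# Hypothetical saturated-state points for ammonia: (T [K], Vm [m3/mol], P [Pa])
data = [(300.0, 2.0e-3, 1.06e6), (320.0, 1.1e-3, 1.87e6)]
Tc, Pc = 405.5, 11.35e6  # critical constants of ammonia
print(objective((1.0, 1.0), data, Tc, Pc))
```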
Procedia PDF Downloads 383
21252 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography
Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya
Abstract:
In today’s era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it; steganography, however, hides the data in some cover file so that the presence of communication itself is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques perform better than the individual techniques. The risk of unauthorized access is alleviated up to a certain extent by using these techniques. These techniques could be used in banks, intelligence agencies such as RAW, etc., where highly confidential data is transferred. Finally, comparisons of the two techniques are also given in tabular form.
Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography
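As an illustration of the cryptography-plus-steganography idea (the paper's own implementation is in MATLAB), a minimal Python sketch of hiding an already-encrypted byte string in the least significant bits of a grayscale image is shown below. The encryption step itself (DES or RSA) is assumed to have produced `cipher_bytes`; no specific crypto library or the paper's exact scheme is implied.

```python
# Sketch only: LSB image steganography for ciphertext produced elsewhere
# (e.g., by DES or RSA). The cover image and cipher_bytes are assumed inputs.
import numpy as np

def embed_lsb(cover: np.ndarray, cipher_bytes: bytes) -> np.ndarray:
    """Hide cipher_bytes in the least significant bit of a uint8 image."""
    bits = np.unpackbits(np.frombuffer(cipher_bytes, dtype=np.uint8))
    flat = cover.flatten().copy()
    if bits.size > flat.size:
        raise ValueError("cover image too small for payload")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # overwrite LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes of ciphertext from the stego image's LSBs."""
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
cipher_bytes = b"\x8a\x11\xf3\x42"          # placeholder ciphertext
stego = embed_lsb(cover, cipher_bytes)
assert extract_lsb(stego, len(cipher_bytes)) == cipher_bytes
```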
Procedia PDF Downloads 291
21251 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it is postulated that more than one billion bases will be produced per year by 2020. The growth rate of the produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, for example in storing the data, searching for information, and finding hidden information. It is necessary to develop an analysis platform for genomics big data. Recently developed cloud computing enables us to deal with big data more efficiently. Hadoop is one of the distributed computing frameworks and lies at the core of Big Data as a Service (BDaaS). Although many services, e.g. Amazon, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g. sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g. de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
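The abstract does not detail the MapReduce/fuzzy-logic hybrid. The toy sketch below only illustrates the pattern: a map phase emits k-mers from reads, each weighted by a fuzzy membership derived from a hypothetical base-quality score, and a reduce phase aggregates the weights. A real deployment would run on Hadoop/BDaaS rather than in-process.

```python
# Toy sketch of a MapReduce-style k-mer count with a fuzzy quality weight.
# Reads, qualities, and the membership function are illustrative assumptions.
from collections import defaultdict

def fuzzy_membership(mean_quality: float) -> float:
    """Simple piecewise-linear membership: 0 below Q10, 1 above Q30."""
    return min(1.0, max(0.0, (mean_quality - 10.0) / 20.0))

def map_phase(read: str, quality: float, k: int = 4):
    w = fuzzy_membership(quality)
    for i in range(len(read) - k + 1):
        yield read[i:i + k], w          # emit (key, weighted count)

def reduce_phase(pairs):
    totals = defaultdict(float)
    for kmer, w in pairs:
        totals[kmer] += w
    return totals

reads = [("ACGTACGTGG", 32.0), ("ACGTTTTTGG", 18.0)]  # (sequence, mean quality)
pairs = (p for read, q in reads for p in map_phase(read, q))
print(dict(reduce_phase(pairs)))
```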
Procedia PDF Downloads 300
21250 Induction Motor Eccentricity Fault Recognition Using Rotor Slot Harmonic with Stator Current Technique
Authors: Nouredine Benouzza, Ahmed Hamida Boudinar, Azeddine Bendiabdellah
Abstract:
An algorithm for Eccentricity Fault Detection (EFD) applied to a squirrel cage induction machine is proposed in this paper. This algorithm employs the behavior of the stator current spectral analysis and the localization of the Rotor Slot Harmonic (RSH) frequency to detect eccentricity faults in a three-phase induction machine. The RSH frequency, once obtained, is used as a key parameter in a simple developed expression to directly compute the eccentricity fault frequencies in the induction machine. Experimental tests performed for both a healthy motor and a faulty motor with different eccentricity fault severities illustrate the effectiveness and merits of the proposed EFD algorithm.
Keywords: squirrel cage motor, diagnosis, eccentricity faults, current spectral analysis, rotor slot harmonic
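The paper's developed expression is not reproduced in the abstract. Purely as orientation, the sketch below computes the textbook principal rotor-slot-harmonic frequency and the mixed-eccentricity sidebands commonly searched for around it, under assumed machine parameters; it is not the authors' exact formula.

```python
# Sketch with textbook expressions (not the authors' developed formula):
# principal rotor slot harmonic and mixed-eccentricity sidebands around it.
# Machine parameters below are assumed for illustration.
fs = 50.0      # supply frequency [Hz]
R = 28         # number of rotor slots
p = 2          # pole pairs
s = 0.03       # slip

fr = fs * (1 - s) / p                      # rotor mechanical frequency
f_rsh = fs * (R * (1 - s) / p + 1)         # principal slot harmonic (nu = +1)

# Mixed-eccentricity components: sidebands spaced by fr around fs and f_rsh.
sidebands_supply = [fs + k * fr for k in (-2, -1, 1, 2)]
sidebands_rsh = [f_rsh + k * fr for k in (-2, -1, 1, 2)]

print(f"rotor frequency     : {fr:.2f} Hz")
print(f"rotor slot harmonic : {f_rsh:.2f} Hz")
print("eccentricity sidebands near fs :", [round(f, 2) for f in sidebands_supply])
print("eccentricity sidebands near RSH:", [round(f, 2) for f in sidebands_rsh])
```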
Procedia PDF Downloads 490
21249 Sliding Mode Control of a Bus Suspension System
Authors: Mujde Turkkan, Nurkan Yagiz
Abstract:
The vibrations caused by the irregularities of the road surface are to be suppressed via suspension systems. In this paper, sliding mode control for a half-bus model with an air suspension system is presented. The bus is modelled as a five-degrees-of-freedom (DoF) system. The mathematical model of the half bus is developed using Lagrange equations. For time domain analysis, the bus model is assumed to travel at a certain speed over a bump road. The numerical results of the analysis indicate that sliding mode controllers can be effectively used to suppress the vibrations and to improve the ride comfort of buses.
Keywords: active suspension system, air suspension, bus model, sliding mode control
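As a pointer to the control law involved, a minimal single-DoF sliding mode controller sketch is given below (sliding surface s = c·e + ė, switching law u = −K·sat(s/φ), with saturation limiting chattering). The five-DoF half-bus model and its Lagrange-derived matrices are not reproduced; all numerical values are assumptions.

```python
# Minimal sliding mode control sketch for one suspension DoF (not the full
# five-DoF half-bus model). Plant parameters and gains are illustrative assumptions.
import numpy as np

c, K, phi = 8.0, 400.0, 0.05        # surface slope, switching gain, boundary layer
m, k, b = 300.0, 16000.0, 1000.0    # sprung mass, spring, damper (assumed)

def smc_force(e, e_dot):
    """u = -K * sat(s/phi), with sliding surface s = c*e + e_dot."""
    s = c * e + e_dot
    sat = np.clip(s / phi, -1.0, 1.0)   # saturation instead of sign() to reduce chattering
    return -K * sat

# One-DoF simulation over a half-sine bump in the road profile.
dt, T = 1e-3, 2.0
x, x_dot = 0.0, 0.0
for i in range(int(T / dt)):
    t = i * dt
    road = 0.05 * np.sin(np.pi * t / 0.5) if t < 0.5 else 0.0
    e, e_dot = x - road, x_dot
    u = smc_force(e, e_dot)
    x_ddot = (-k * e - b * e_dot + u) / m
    x_dot += x_ddot * dt
    x += x_dot * dt
print(f"final displacement: {x:.4f} m")
```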
Procedia PDF Downloads 388
21248 A Non-Destructive TeraHertz System and Method for Capsule and Liquid Medicine Identification
Authors: Ke Lin, Steve Wu Qing Yang, Zhang Nan
Abstract:
Medicines and drugs have in the past been manufactured into final products and then subjected to laboratory analysis to verify their quality. However, the industry crucially needs a monitoring technique for final batch-to-batch quality checks. The introduction of process analytical technology (PAT) provides an incentive to obtain real-time information about drugs on the production line, with the following optical techniques being considered: near-infrared (NIR) spectroscopy, Raman spectroscopy and imaging, and mid-infrared spectroscopy, with the use of chemometric techniques to quantify the final product. However, these present problems in that the spectra obtained consist of many combination and overtone bands of the fundamental vibrations observed, making analysis difficult. In this work, we describe a non-destructive system and method for capsule and liquid medicine identification, more particularly using terahertz time-domain spectroscopy and/or a purpose-designed portable terahertz system for identifying different types of medicine in capsule packages or in liquid medicine bottles. The target medicine can be detected directly, non-destructively and non-invasively.
Keywords: terahertz, non-destructive, non-invasive, chemical identification
Procedia PDF Downloads 132
21247 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images that may have irregular, digit, or character shapes. Objects and internal objects are quite difficult to identify and extract when the structure of the image contains a bulk of clusters. Estimation results are easily obtained by identifying the sub-regional objects using the SASK algorithm. The main focus is to recognize the number of internal objects existing in a given image in a shadow-free and error-free manner. Hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and finally application of the hull detection system. Detecting the sub-regional hulls can increase machine learning capability in the detection of characters, and the approach can also be extended to hull recognition even for irregularly shaped objects, such as black holes in space exploration, together with their intensities. Layered hulls are those having structured layers inside; this is useful in military services and traffic applications to identify the number of vehicles or persons. The proposed SASK algorithm is helpful in identifying such regions and can be useful in the subsequent decision process (to clear the traffic, or to identify the number of opposing persons in a war).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
Procedia PDF Downloads 350
21246 Impact of Population Size on Symmetric Travelling Salesman Problem Efficiency
Authors: Wafa' Alsharafat, Suhila Farhan Abu-Owida
Abstract:
A genetic algorithm (GA) is a powerful evolutionary searching technique that is used successfully to solve and optimize problems in different research areas. The genetic algorithm is considered one of the optimization methods used to solve the Travelling Salesman Problem (TSP). The feasibility of a GA in finding a TSP solution depends, in general, on the GA operators: encoding method, population size, and termination criteria. In particular, the crossover and its probability play a significant role in finding possible solutions for the Symmetric TSP (STSP). In addition, the crossover should be determined and enhanced in terms of reaching an optimal, or at least near-optimal, solution. In this paper, we shed light on using a modified crossover method called modified sequential constructive crossover and its impact on reaching the optimal solution. To justify the relevance of a parameter value in solving the TSP, a comparative analysis was conducted on different crossover methods and values.
Keywords: genetic algorithm, crossover, mutation, TSP
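The modified sequential constructive crossover itself is not specified in the abstract. The sketch below shows the surrounding GA loop for a symmetric TSP with a standard order crossover (OX) standing in for it, so the role the crossover operator plays is visible; the cities and GA settings are arbitrary.

```python
# GA skeleton for a symmetric TSP. Order crossover (OX) is used as a stand-in
# for the paper's modified sequential constructive crossover; settings are arbitrary.
import random, math

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(20)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def order_crossover(p1, p2):
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]                       # keep a slice of parent 1
    fill = [c for c in p2 if c not in child]   # fill the rest in parent-2 order
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(tour, rate=0.1):
    if random.random() < rate:                 # simple swap mutation
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tour

pop = [random.sample(range(len(cities)), len(cities)) for _ in range(100)]
for _ in range(300):
    pop.sort(key=tour_length)
    elite = pop[:20]
    pop = elite + [mutate(order_crossover(*random.sample(elite, 2)))
                   for _ in range(80)]
print("best tour length:", round(tour_length(min(pop, key=tour_length)), 3))
```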
Procedia PDF Downloads 229
21245 Comparison of Machine Learning and Deep Learning Algorithms for Automatic Classification of 80 Different Pollen Species
Authors: Endrick Barnacin, Jean-Luc Henry, Jimmy Nagau, Jack Molinie
Abstract:
Palynology is a field of interest in many disciplines due to its multiple applications: chronological dating, climatology, allergy treatment, and honey characterization. Unfortunately, the analysis of a pollen slide is a complicated and time-consuming task that requires the intervention of experts in the field, who are becoming increasingly rare due to economic and social conditions. That is why the need for automation of this task is urgent. A lot of studies have investigated the subject using different standard image processing descriptors and sometimes hand-crafted ones. In this work, we make a comparative study between classical feature extraction methods (shape, GLCM, LBP, and others) and deep learning (CNN, autoencoders, transfer learning) to perform a recognition task over 80 regional pollen species. It has been found that the use of transfer learning seems to be more precise than the other approaches.
Keywords: pollens identification, features extraction, pollens classification, automated palynology
Procedia PDF Downloads 137
21244 Vehicle Detection and Tracking Using Deep Learning Techniques in Surveillance Image
Authors: Abe D. Desta
Abstract:
This study suggests a deep learning-based method for identifying and following moving objects in surveillance video. The proposed method first detects vehicles using a fast regional convolutional neural network (F-RCNN) trained on a substantial dataset of vehicle images. A Kalman filter and a data association technique based on the Hungarian algorithm are then used to monitor the observed vehicles over time. F-RCNN algorithms have generally been shown to be effective in achieving high detection accuracy and robustness; in this research study, the vehicle detection and tracking system was able to achieve an accuracy of 97.4%. The F-RCNN algorithm was also compared to other popular object detection algorithms and was found to outperform them in terms of both detection accuracy and speed. The presented system, which has application potential in actual surveillance systems, shows the usefulness of deep learning approaches in vehicle detection and tracking.
Keywords: artificial intelligence, computer vision, deep learning, fast-regional convolutional neural networks, feature extraction, vehicle tracking
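The data-association step described (Hungarian algorithm over detection–track costs) can be sketched as below with an IoU-based cost; the detector outputs and the Kalman-predicted track boxes are assumed inputs, and scipy's linear_sum_assignment performs the Hungarian matching.

```python
# Sketch of Hungarian data association between Kalman-predicted track boxes
# and detector (e.g., F-RCNN) outputs. Boxes below are assumed [x1, y1, x2, y2].
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

tracks = np.array([[10, 10, 50, 50], [100, 100, 160, 180]], dtype=float)      # predicted
detections = np.array([[102, 98, 158, 182], [12, 8, 49, 52]], dtype=float)    # detected

cost = np.array([[1.0 - iou(t, d) for d in detections] for t in tracks])
row, col = linear_sum_assignment(cost)                       # Hungarian matching
matches = [(r, c) for r, c in zip(row, col) if cost[r, c] < 0.7]  # keep IoU > 0.3
print("track -> detection matches:", matches)
```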
Procedia PDF Downloads 129
21243 An Algorithm for Estimating the Stable Operation Conditions of the Synchronous Motor of the Ore Mill Electric Drive
Authors: M. Baghdasaryan, A. Sukiasyan
Abstract:
An algorithm for estimating the stable operation conditions of the synchronous motor of the ore mill electric drive is proposed. The stable operation conditions of the synchronous motor are revealed, taking into account the estimation of the change of the angle q and the technological factors. The stability condition obtained makes it possible to ensure stable operation of the motor in synchronous mode, taking into account the nonlinear character of the mill loading. The developed algorithm gives an opportunity to present the undesirable phenomena arising in the electric drive system. The obtained stability condition can be successfully applied for the optimal control of the electromechanical system of the mill.
Keywords: electric drive, synchronous motor, ore mill, stability, technological factors
Procedia PDF Downloads 425
21242 Use of Galileo Advanced Features in Maritime Domain
Authors: Olivier Chaigneau, Damianos Oikonomidis, Marie-Cecile Delmas
Abstract:
GAMBAS (Galileo Advanced features for the Maritime domain: Breakthrough Applications for Safety and security) is a project funded by the European Union Agency for the Space Programme (EUSPA) aiming at identifying the search-and-rescue and ship security alert system needs of maritime users (including operators and fishing stakeholders) and developing operational concepts to answer these needs. The general objective of the GAMBAS project is to support the deployment of Galileo exclusive features in the maritime domain in order to improve safety and security at sea, detection of illegal activities and the associated surveillance means, and resilience to natural and human-induced emergency situations, and to develop, integrate, demonstrate, standardize and disseminate these new associated capabilities. The project aims to demonstrate: improvement of the SAR (Search And Rescue) and SSAS (Ship Security Alert System) detection and response to maritime distress through the integration of new features into the SSAS beacon, in terms of cost optimization, user-friendly aspects, integration of Galileo and OS NMA (Open Service Navigation Message Authentication) reception for improved authenticated localization performance and reliability, and at-sea triggering capabilities; optimization of the responsiveness of RCCs (Rescue Co-ordination Centres) towards distress situations affecting vessels; and the adaptation of the MCCs (Mission Control Centers) and MEOLUT (Medium Earth Orbit Local User Terminal) to the data distribution of SSAS alerts.
Keywords: Galileo new advanced features, maritime, safety, security
Procedia PDF Downloads 93
21241 Prioritizing Roads Safety Based on the Quasi-Induced Exposure Method and Utilization of the Analytical Hierarchy Process
Authors: Hamed Nafar, Sajad Rezaei, Hamid Behbahani
Abstract:
Safety analysis of roads through accident rates, which is one of the widely used tools, has resulted from the direct exposure method, which is based on the ratio of vehicle-kilometers traveled and vehicle-travel time. However, due to some fundamental flaws in its theories, the difficulties in gaining access to the required data such as traffic volume and the distance and duration of trips, and various problems in determining the exposure for specific time, place, and individual categories, there is a need for an algorithm for prioritizing road safety so that, with a new exposure method, the problems of the previous approaches are resolved. In this way, an efficient application may lead to more realistic comparisons, and the new method would be applicable to a wider range of time, place, and individual categories. Therefore, an algorithm was introduced to prioritize the safety of roads using the quasi-induced exposure method and utilizing the analytical hierarchy process. For this research, 11 provinces of Iran were chosen as case study locations. A rural accidents database was created for these provinces, the validity of the quasi-induced exposure method for Iran’s accidents database was explored, and the involvement ratio for different characteristics of the drivers and the vehicles was measured. Results showed that the quasi-induced exposure method was valid in determining the real exposure in the provinces under study. Results also showed a significant difference between the prioritizations based on the new and traditional approaches. This difference mostly stems from the perspective of the quasi-induced exposure method in determining the exposure, the opinion of experts, and the quantity of accident data. Overall, the results of this research showed that prioritization based on the new approach is more comprehensive and reliable compared to prioritization under the traditional approach, which is dependent on various parameters including the driver-vehicle characteristics.
Keywords: road safety, prioritizing, quasi-induced exposure, analytical hierarchy process
Procedia PDF Downloads 340
21240 Short Life Cycle Time Series Forecasting
Authors: Shalaka Kadam, Dinesh Apte, Sagar Mainkar
Abstract:
The life cycle of products is becoming shorter and shorter due to increased competition in the market, shorter product development time, and increased product diversity. Short life cycles are normal in the retail industry, the style business, entertainment media, and the telecom and semiconductor industries. The subject of accurate demand forecasting for short-lifecycle products is of special interest to many researchers and organizations. Due to the short life cycle of products, the amount of historical data that is available for forecasting is very minimal or even absent when new or modified products are launched in the market. The companies dealing with such products want to increase the accuracy of demand forecasting so that they can utilize the full potential of the market while not oversupplying. This presents the challenge of developing a forecasting model that can forecast accurately while handling large variations in data and considering the complex relationships between various parameters of the data. Many statistical models have been proposed in the literature for forecasting time series data. Traditional time series forecasting models do not work well for short life cycles due to the lack of historical data. Artificial neural network (ANN) models are also very time-consuming for forecasting. We have studied the existing models that are used for forecasting and their limitations. This work proposes an effective and powerful approach for short life cycle time series forecasting. We propose an approach which takes into consideration different scenarios related to data availability for short-lifecycle products. We then suggest a methodology which combines statistical analysis with structured judgement. The defined approach can also be applied across domains. We then describe the method of creating a profile from analogous products; this profile can then be used for forecasting products with the historical data of analogous products. We have designed an application which combines data, analytics and domain knowledge using point-and-click technology. The forecasting results generated are compared using MAPE, MSE and RMSE error scores. Conclusion: based on the results, it is observed that no one approach is sufficient for short life-cycle forecasting, and we need to combine two or more approaches to achieve the desired accuracy.
Keywords: forecast, short life cycle product, structured judgement, time series
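One way to read the "profile from analogous products" step: normalize an analogous product's lifecycle curve and rescale it to the new product's expected total volume, then score the forecast with MAPE/RMSE once actuals arrive. The sketch below assumes that reading; the numbers are invented and the structured-judgement blending is reduced to a single expected-total estimate.

```python
# Sketch of forecasting a new short-lifecycle product from an analogous
# product's profile. Interpretation and numbers are assumptions, not the
# paper's exact methodology (which also blends in structured judgement).
import numpy as np

analog_sales = np.array([12, 40, 75, 60, 30, 10], dtype=float)  # analogous product, by period
profile = analog_sales / analog_sales.sum()                      # normalized lifecycle shape

expected_total = 5000.0                     # judgement-based estimate for the new product
forecast = profile * expected_total

actual = np.array([250, 900, 1900, 1400, 700, 250], dtype=float)  # hypothetical actuals
mape = np.mean(np.abs((actual - forecast) / actual)) * 100
rmse = np.sqrt(np.mean((actual - forecast) ** 2))
print("forecast:", np.round(forecast))
print(f"MAPE = {mape:.1f}%, RMSE = {rmse:.1f}")
```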
Procedia PDF Downloads 360
21239 Particle Size Distribution Estimation of a Mixture of Regular and Irregular Sized Particles Using Acoustic Emissions
Authors: Ejay Nsugbe, Andrew Starr, Ian Jennions, Cristobal Ruiz-Carcel
Abstract:
This work investigates the possibility of using Acoustic Emissions (AE) to estimate the Particle Size Distribution (PSD) of a mixture that comprises particles of different densities and geometries. The experiments carried out involved a mixture of glass and polyethylene particles ranging from 150-212 microns and 150-250 microns respectively, and an experimental rig that allowed the free fall of a continuous stream of particles onto a target plate on which the AE sensor was placed. By using a time-domain-based multiple-threshold method, it was observed that the PSD of the particles in the mixture could be estimated.
Keywords: acoustic emissions, particle sizing, process monitoring, signal processing
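A time-domain multiple-threshold reading of an AE signal can be sketched as counting, for each threshold level, the bursts whose peaks exceed it; the relative counts across levels then serve as a crude proxy for the size distribution. The synthetic signal and thresholds below are illustrative only, not the paper's experimental data.

```python
# Sketch of a time-domain multiple-threshold count on an AE-like signal.
# The synthetic bursts and threshold levels are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.02, 20000)           # background noise
for centre, amp in [(3000, 0.4), (8000, 0.9), (15000, 0.6)]:  # simulated impacts
    signal[centre:centre + 50] += amp * np.hanning(50)

thresholds = [0.1, 0.3, 0.5, 0.8]

def count_bursts(x, thr):
    above = np.abs(x) > thr
    # rising edges = number of separate excursions above the threshold
    return int(np.sum(above[1:] & ~above[:-1]))

counts = {thr: count_bursts(signal, thr) for thr in thresholds}
print("bursts per threshold:", counts)   # larger impacts clear higher thresholds
```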
Procedia PDF Downloads 353
21238 Comparison Study of Machine Learning Classifiers for Speech Emotion Recognition
Authors: Aishwarya Ravindra Fursule, Shruti Kshirsagar
Abstract:
In the intersection of artificial intelligence and human-centered computing, this paper delves into speech emotion recognition (SER). It presents a comparative analysis of machine learning models such as K-Nearest Neighbors (KNN), logistic regression, support vector machines (SVM), decision trees, ensemble classifiers, and random forests, applied to SER. The research employs four datasets: Crema D, SAVEE, TESS, and RAVDESS. It focuses on extracting salient audio signal features like Zero Crossing Rate (ZCR), Chroma_stft, Mel Frequency Cepstral Coefficients (MFCC), root mean square (RMS) value, and MelSpectrogram. These features are used to train and evaluate the models’ ability to recognize eight types of emotions from speech: happy, sad, neutral, angry, calm, disgust, fear, and surprise. Among the models, the Random Forest algorithm demonstrated superior performance, achieving approximately 79% accuracy. This suggests its suitability for SER within the parameters of this study. The research contributes to SER by showcasing the effectiveness of various machine learning algorithms and feature extraction techniques. The findings hold promise for the development of more precise emotion recognition systems in the future. This abstract provides a succinct overview of the paper’s content, methods, and results.
Keywords: comparison, ML classifiers, KNN, decision tree, SVM, random forest, logistic regression, ensemble classifiers
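The feature-extraction and classification pipeline described maps naturally onto librosa and scikit-learn. The sketch below assumes a list of labelled WAV files and averages each feature over time, which is one common simplification rather than the paper's exact setup.

```python
# Sketch of the SER pipeline: ZCR, chroma, MFCC, RMS and mel-spectrogram
# features (time-averaged) feeding a Random Forest. File paths are placeholders.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def extract_features(path):
    y, sr = librosa.load(path, sr=None)
    feats = [
        librosa.feature.zero_crossing_rate(y),
        librosa.feature.chroma_stft(y=y, sr=sr),
        librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13),
        librosa.feature.rms(y=y),
        librosa.feature.melspectrogram(y=y, sr=sr),
    ]
    return np.concatenate([f.mean(axis=1) for f in feats])  # average over time

files, labels = ["happy_01.wav", "sad_01.wav"], ["happy", "sad"]  # placeholder dataset
X = np.vstack([extract_features(f) for f in files])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.5, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```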
Procedia PDF Downloads 45
21237 Heuristic Methods for the Capacitated Location-Allocation Problem with Stochastic Demand
Authors: Salinee Thumronglaohapun
Abstract:
The proper number and appropriate locations of service centers can save cost, raise revenue, and gain more satisfaction from customers. Establishing service centers is costly, and they are difficult to relocate. Over long-term planning periods, several factors may affect the service; one of the most critical factors is the uncertain demand of customers. The opened service centers need to be capable of serving customers and making a profit although the demand in each period changes. In this work, the capacitated location-allocation problem with stochastic demand is considered. A mathematical model is formulated to determine suitable locations of service centers and their allocation so as to maximize total profit over multiple planning periods. Two heuristic methods, a local search and a genetic algorithm, are used to solve this problem. For the local search, five different chances of choosing each type of move are applied. For the genetic algorithm, three different replacement strategies are considered. The results of applying each method to solve numerical examples are compared. Both methods reach the same best found solution in most examples, but the genetic algorithm provides better solutions in some cases.
Keywords: location-allocation problem, stochastic demand, local search, genetic algorithm
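A minimal version of the local-search idea, with open/close moves on a set of capacitated centers evaluated against expected profit over demand scenarios, is sketched below. Costs, capacities, and demand scenarios are invented, and the paper's five move-selection chances are collapsed into a single random flip move.

```python
# Sketch of a local search for a capacitated location-allocation problem with
# scenario-based (stochastic) demand. All data below are invented.
import random

random.seed(1)
n_sites, n_customers, n_scenarios = 5, 8, 3
open_cost = [40, 55, 35, 60, 45]
capacity = [120, 150, 100, 160, 130]
revenue_per_unit, assign_cost = 2.0, 0.3
demand = [[random.randint(10, 40) for _ in range(n_customers)] for _ in range(n_scenarios)]

def expected_profit(open_sites):
    if not open_sites:
        return -1e9
    total = -sum(open_cost[i] for i in open_sites)
    for scen in demand:
        cap = {i: capacity[i] for i in open_sites}
        for d in scen:                         # greedy allocation to the emptiest open site
            site = max(cap, key=cap.get)
            served = min(d, cap[site])
            cap[site] -= served
            total += served * (revenue_per_unit - assign_cost) / n_scenarios
    return total

current = {0, 2}
for _ in range(200):                           # single move type: open/close one site
    candidate = set(current)
    candidate ^= {random.randrange(n_sites)}
    if expected_profit(candidate) > expected_profit(current):
        current = candidate
print("open sites:", sorted(current), "expected profit:", round(expected_profit(current), 1))
```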
Procedia PDF Downloads 125
21236 A Support Vector Machine Learning Prediction Model of Evapotranspiration Using Real-Time Sensor Node Data
Authors: Waqas Ahmed Khan Afridi, Subhas Chandra Mukhopadhyay, Bandita Mainali
Abstract:
This research paper presents a unique approach to evapotranspiration (ET) prediction using a Support Vector Machine (SVM) learning algorithm. The study leverages real-time sensor node data to develop an accurate and adaptable prediction model, addressing the inherent challenges of traditional ET estimation methods. The integration of the SVM algorithm with real-time sensor node data offers great potential to improve the spatial and temporal resolution of ET predictions. In the model development, key input features are measured and computed using mathematical equations such as Penman-Monteith (FAO56) and the soil water balance (SWB), which include soil-environmental parameters such as solar radiation (Rs), air temperature (T), atmospheric pressure (P), relative humidity (RH), wind speed (u2), rain (R), deep percolation (DP), soil temperature (ST), and change in soil moisture (∆SM). The one-year field data are split into three proportions, i.e., training, test, and validation sets, in several combinations. Kernel functions with tuned hyperparameters have been used to train and improve the accuracy of the prediction model over multiple iterations. This paper also outlines the existing methods and machine learning techniques used to determine evapotranspiration, data collection and preprocessing, model construction, and evaluation metrics, highlighting the significance of SVM in advancing the field of ET prediction. The results demonstrate the robustness and high predictability of the developed model on the basis of performance evaluation metrics (R2, RMSE, MAE). The effectiveness of the proposed model in capturing complex relationships within soil and environmental parameters provides insights into its potential applications for water resource management and hydrological ecosystems.
Keywords: evapotranspiration, FAO56, KNIME, machine learning, RStudio, SVM, sensors
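The SVM regression step with a kernel function and hyperparameter tuning can be sketched with scikit-learn as below; the feature matrix of soil-environmental parameters (Rs, T, RH, u2, ...) and the FAO56-derived ET target are assumed to be available as arrays, and random placeholders stand in for the field data.

```python
# Sketch of the SVM (here SVR) step for ET prediction. X holds the measured
# soil-environmental features (Rs, T, RH, u2, ...); y holds FAO56-computed ET.
# Data below are random placeholders for illustration.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(365, 8))                           # daily feature vectors (placeholder)
y = X @ rng.normal(size=8) + rng.normal(0, 0.1, 365)    # placeholder ET target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = GridSearchCV(model, {"svr__C": [1, 10, 100], "svr__gamma": ["scale", 0.1]}, cv=5)
grid.fit(X_tr, y_tr)

pred = grid.predict(X_te)
print("R2  :", round(r2_score(y_te, pred), 3))
print("RMSE:", round(mean_squared_error(y_te, pred) ** 0.5, 3))
print("MAE :", round(mean_absolute_error(y_te, pred), 3))
```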
Procedia PDF Downloads 70
21235 The Conceptual Design Model of an Automated Supermarket
Authors: V. Sathya Narayanan, P. Sidharth, V. R. Sanal Kumar
Abstract:
The success of any retail business is predicated on its swift response and its knack for understanding the constraints and requirements of customers. In this paper, a conceptual design model of an automated customer-friendly supermarket is proposed. In this model, 10-sided, space-efficient, regular-polygon-shaped gravity shelves have been designed for goods storage, and effective customer-specific algorithms have been built in for quick automatic delivery of the randomly listed goods. The algorithm is developed with two main objectives, viz., delivery time and priority. To meet these objectives, the randomly listed items are reorganized according to the critical path of the robotic arm specific to the identified shop and its layout, and the items are categorized according to the demand, shape, size, similarity, and nature of the product for an efficient pick-up, packing, and delivery process. We conjecture that the proposed automated supermarket model reduces business operating costs while providing high customer satisfaction, warranting a win-win situation.
Keywords: automated supermarket, electronic shopping, polygon-shaped rack, shortest path algorithm for shopping
Procedia PDF Downloads 406
21234 Two Efficient Heuristic Algorithms for the Integrated Production Planning and Warehouse Layout Problem
Authors: Mohammad Pourmohammadi Fallah, Maziar Salahi
Abstract:
In the literature, a mixed-integer linear programming model for the integrated production planning and warehouse layout problem has been proposed. To solve the model, its authors proposed a Lagrangian relax-and-fix heuristic that takes a significant amount of time to stop, with gaps above 5% for large-scale instances. Here, we present two heuristic algorithms to solve the problem. In the first one, we use a greedy approach that allocates the warehouse locations with lower reservation costs, and lower transportation costs from the production area to the locations and from the locations to the output point, to the items with higher demands; then a smaller model is solved. In the second heuristic, we first sort the items in descending order according to the ratio of the sum of the demands for that item over the time horizon plus the maximum demand for that item in the time horizon to the sum of all its demands in the time horizon. Then we categorize the sorted items into groups of 3, 4, or 5 and solve a small-scale optimization problem for each group, hoping to improve the solution of the first heuristic. Our preliminary numerical results show the effectiveness of the proposed heuristics.
Keywords: capacitated lot-sizing, warehouse layout, mixed-integer linear programming, heuristics algorithm
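The first (greedy) heuristic can be read as: rank items by demand, rank locations by combined reservation-plus-transportation cost, and pair them off before solving the reduced model. A toy sketch under that reading is given below; all costs and demands are invented and the subsequent lot-sizing model is not shown.

```python
# Toy sketch of the first heuristic's greedy pairing: the cheapest locations
# (reservation + transport in/out) go to the highest-demand items.
# All numbers are invented for illustration.
items = {"A": 120, "B": 300, "C": 80, "D": 210}                # total demand over horizon
locations = {                                                  # loc: (reserve, in, out) costs
    "L1": (10, 2.0, 1.5), "L2": (8, 3.5, 2.0),
    "L3": (12, 1.0, 1.0), "L4": (9, 2.5, 3.0),
}

def location_cost(costs):
    reserve, c_in, c_out = costs
    return reserve + c_in + c_out

items_by_demand = sorted(items, key=items.get, reverse=True)
locs_by_cost = sorted(locations, key=lambda l: location_cost(locations[l]))

assignment = dict(zip(items_by_demand, locs_by_cost))
print(assignment)   # a smaller lot-sizing model would then be solved with this layout fixed
```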
Procedia PDF Downloads 197
21233 Multimedia Firearms Training System
Authors: Aleksander Nawrat, Karol Jędrasiak, Artur Ryt, Dawid Sobel
Abstract:
The goal of this article is to present a novel Multimedia Firearms Training System. The system was developed in order to compensate for major problems of existing shooting training systems. The designed and implemented solution can be characterized by five major advantages: an algorithm for automatic geometric calibration, an algorithm for photometric recalibration, firearm hit-point detection using a thermal imaging camera, an IR laser spot tracking algorithm for after-action review analysis, and the implementation of ballistics equations. The combination of the abovementioned advantages in a single multimedia firearms training system creates a comprehensive solution for detecting and tracking the target point, usable for shooting training systems and for improving the intervention tactics of uniformed services. The introduced algorithms for geometric and photometric recalibration allow the use of economically viable, commercially available projectors for systems that require long and intensive use, without most of the negative impacts on color mapping seen in existing multi-projector multimedia shooting range systems. The article presents the results of the developed algorithms and their application in real training systems.
Keywords: firearms shot detection, geometric recalibration, photometric recalibration, IR tracking algorithm, thermography, ballistics
Procedia PDF Downloads 224
21232 Model-Based Software Regression Test Suite Reduction
Authors: Shiwei Deng, Yang Bao
Abstract:
In this paper, we present a model-based regression test suite reduction approach that uses EFSM model dependence analysis and a probability-driven greedy algorithm to reduce software regression test suites. The approach automatically identifies the difference between the original model and the modified model as a set of elementary model modifications. The EFSM dependence analysis is performed for each elementary modification to reduce the regression test suite, and then the probability-driven greedy algorithm is adopted to select the minimum set of test cases from the reduced regression test suite that covers all interaction patterns. Our initial experience shows that the approach may significantly reduce the size of regression test suites.
Keywords: dependence analysis, EFSM model, greedy algorithm, regression test
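The probability-driven greedy selection over interaction patterns is, at its core, a weighted greedy set cover. The sketch below shows that core with made-up test cases, covered patterns, and selection probabilities (higher probability breaks ties toward tests judged more likely to expose the modification); it is not the paper's exact algorithm.

```python
# Sketch of the greedy core of the reduction step: pick test cases until all
# interaction patterns are covered, preferring higher coverage and, on ties,
# higher fault-exposing probability. Tests, patterns, and probabilities invented.
tests = {
    "t1": ({"p1", "p2"}, 0.6),
    "t2": ({"p2", "p3", "p4"}, 0.4),
    "t3": ({"p1", "p4"}, 0.8),
    "t4": ({"p3"}, 0.9),
}
required = {"p1", "p2", "p3", "p4"}

selected, uncovered = [], set(required)
while uncovered:
    best = max(tests, key=lambda t: (len(tests[t][0] & uncovered), tests[t][1]))
    covered_now = tests[best][0] & uncovered
    if not covered_now:
        break                      # remaining patterns are not coverable
    selected.append(best)
    uncovered -= covered_now
    tests = {t: v for t, v in tests.items() if t != best}
print("reduced suite:", selected)
```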
Procedia PDF Downloads 429
21231 Biosorption of Heavy Metals from Aqueous Solutions by Plant Biomass
Authors: Yamina Zouambia, Khadidja Youcef Ettoumi, Mohamed Krea, Nadji Moulai Mostefa
Abstract:
Environmental pollution through various wastes (particularly heavy metals) is a major problem arising from industrialization and the development of various human activities. Considerable attention has been focused in recent years on the field of biosorption, which represents a biotechnological innovation as well as an excellent tool for the removal of metal ions from aqueous effluents. The purpose of this study is therefore to valorize by-products, namely orange peels and an extract of these peels (pectin, a heteropolysaccharide), in the treatment of water containing heavy metals. All biosorption experiments were carried out at room temperature, at an indicated pH, with a precise amount of biosorbent, and under continuous stirring. Biosorption kinetics were determined by evaluating the residual concentration of the metal ion at different time intervals using UV spectroscopy. The results obtained show that orange peels and pectin are interesting biosorbents, with a maximum biosorption capacity of up to 140 mg/g.
Keywords: orange peels, pectin, heavy metals, biosorption
Procedia PDF Downloads 333
21230 Advanced Hybrid Particle Swarm Optimization for Congestion and Power Loss Reduction in Distribution Networks with High Distributed Generation Penetration through Network Reconfiguration
Authors: C. Iraklis, G. Evmiridis, A. Iraklis
Abstract:
Renewable energy sources and distributed power generation units already have an important role in electrical power generation. A mixture of different technologies penetrating the electrical grid adds complexity to the management of distribution networks. High penetration of distributed power generation units creates node over-voltages, huge power losses, unreliable power management, reverse power flow, and congestion. This paper presents an optimization algorithm capable of reducing congestion and power losses, both described through a weighted-sum function. Two factors that describe congestion are proposed. An upgraded selective particle swarm optimization algorithm (SPSO) is used as the solution tool, focusing on the technique of network reconfiguration. The upgraded SPSO algorithm is achieved with the addition of a heuristic algorithm specializing in the reduction of power losses, with several scenarios being tested. Results show significant improvement in the minimization of losses and congestion while achieving very small calculation times.
Keywords: congestion, distribution networks, loss reduction, particle swarm optimization, smart grid
Procedia PDF Downloads 447
21229 Abdominal Organ Segmentation in CT Images Based On Watershed Transform and Mosaic Image
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
Accurate liver, spleen, and kidney segmentation in abdominal CT images is one of the most important steps for computer-aided diagnosis of abdominal organ pathology. In this paper, we propose a new semi-automatic algorithm for liver, spleen, and kidney area extraction in abdominal CT images. Our proposed method is based on hierarchical segmentation and the watershed algorithm. In our approach, a powerful technique has been designed to suppress over-segmentation based on the mosaic image and on the computation of the watershed transform. The algorithm runs in two parts. In the first, we seek to improve the quality of the gradient-mosaic image; in this step, we propose a method for improving the gradient-mosaic image by applying the anisotropic diffusion filter followed by morphological filters. Thereafter, we proceed to the hierarchical segmentation of the liver, spleen, and kidneys. To validate the proposed segmentation technique, we have tested it on several images. Our segmentation approach is evaluated by comparing our results with the manual segmentation performed by an expert. The experimental results are described in the last part of this work.
Keywords: anisotropic diffusion filter, CT images, morphological filter, mosaic image, multi-abdominal organ segmentation, watershed algorithm
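A compact scikit-image version of the marker-controlled watershed idea (smoothing standing in for anisotropic diffusion, a gradient image, labelled markers, then the watershed transform) is sketched below on a synthetic image. It illustrates the technique only and is not the paper's mosaic-image pipeline.

```python
# Sketch of marker-controlled watershed segmentation on a synthetic image.
# Gaussian smoothing stands in for the paper's anisotropic diffusion and
# mosaic-image construction; this is not the authors' full pipeline.
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gaussian, sobel
from skimage.segmentation import watershed

# Synthetic "organs": two bright blobs on a dark background.
img = np.zeros((128, 128))
img[20:60, 20:60] = 1.0
img[70:110, 70:120] = 0.8
img += np.random.default_rng(0).normal(0, 0.05, img.shape)

smoothed = gaussian(img, sigma=2)          # stand-in for anisotropic diffusion
gradient = sobel(smoothed)                 # gradient image for the watershed

markers = np.zeros_like(img, dtype=int)    # markers from simple thresholds
markers[smoothed < 0.2] = 1                # background seed
markers[smoothed > 0.6] = 2                # foreground seeds
labels = watershed(gradient, ndi.label(markers)[0])
print("regions found:", len(np.unique(labels)))
```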
Procedia PDF Downloads 499
21228 Encryption and Decryption of Nucleic Acid Using Deoxyribonucleic Acid Algorithm
Authors: Iftikhar A. Tayubi, Aabdulrahman Alsubhi, Abdullah Althrwi
Abstract:
The deoxyribonucleic acid (DNA) text provides a single source of high-quality cryptography for DNA sequences for structural biologists. We provide an intuitive, well-organized and user-friendly web interface that allows users to encrypt and decrypt DNA sequence text. It includes a complex securing algorithm to encrypt and decrypt DNA sequences. The utility of this DNA sequence text is that it provides a user-friendly interface for users to encrypt, decrypt, and store information about DNA sequences. The interfaces created in this project will satisfy the demands of the scientific community by providing full encryption of DNA sequences through this website. We have adopted a methodology using C# and ASP.NET for programming, which is smart and secure. DNA sequence text is a wonderful piece of equipment for encrypting large quantities of data efficiently. Users can thus navigate from one encoding to another and store the text, depending on the field of the user's interest. Algorithm classification allows a user to protect the DNA sequence from change, whether an alteration or error occurred during the DNA sequence data transfer. It will check the integrity of the DNA sequence data during access.
Keywords: algorithm, ASP.NET, DNA, encrypt, decrypt
Procedia PDF Downloads 235
21227 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi Agent Approach
Authors: M. Taheri Tehrani, H. Ajorloo
Abstract:
In this paper, an intelligent multi-agent framework is developed for each router, in which agents have two vital functionalities, traffic shaping and buffer allocation, and are positioned at the ports of the routers. With the traffic shaping functionality, agents shape the forwarded traffic by dynamic, real-time allocation of the token generation rate in a Token Bucket algorithm; with the buffer allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework gives some ports the opportunity to work better under burst and busier conditions. The agents work intelligently based on a Reinforcement Learning (RL) algorithm and consider the effective parameters in their decision process. As RL is limited in the number of parameters it can consider in its decision process due to the volume of calculations, we utilize our novel method, which applies Principal Component Analysis (PCA) to the RL and gives the algorithm the high-dimensional ability to consider as many parameters as needed in its decision process. When this implementation is compared to our previous work, where traffic shaping was done without any sharing or dynamic allocation of buffer size for each port, lower packet drop in the whole network, specifically in the source routers, can be seen. These methods are implemented in our previously proposed intelligent simulation environment to better compare the performance metrics. The results obtained from this simulation environment show an efficient and dynamic utilization of resources in terms of bandwidth and the buffer capacities pre-allocated to each port.
Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems
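The traffic-shaping side is built on the token bucket. A minimal sketch is given below, where the RL/PCA agent of the paper is abstracted to a single hook that resets the token generation rate (here just a stub); rates, bucket depth, and packet sizes are illustrative assumptions.

```python
# Minimal token bucket shaper sketch. set_rate() is where the RL/PCA agent of
# the paper would adjust the token generation rate; here it is only a stub.
# All rates, depths, and packet sizes are illustrative assumptions.
class TokenBucket:
    def __init__(self, rate_bps: float, depth_bits: float):
        self.rate = rate_bps          # token generation rate (bits/s)
        self.depth = depth_bits       # bucket capacity (bits)
        self.tokens = depth_bits
        self.last = 0.0

    def set_rate(self, rate_bps: float):
        """Hook for the agent's dynamic, real-time rate allocation."""
        self.rate = rate_bps

    def allow(self, packet_bits: int, now: float) -> bool:
        self.tokens = min(self.depth, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if packet_bits <= self.tokens:
            self.tokens -= packet_bits
            return True
        return False                  # packet delayed or dropped by the shaper

bucket = TokenBucket(rate_bps=1e6, depth_bits=8e3)
for t, size in [(0.001, 12000), (0.002, 4000), (0.010, 12000)]:
    print(t, size, "forwarded" if bucket.allow(size, t) else "held")
```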
Procedia PDF Downloads 519
21226 Optimal Type and Installation Time of Wind Farm in a Power System, Considering Service Providers
Authors: M. H. Abedi, A. Jalilvand
Abstract:
The economic development benefits of wind energy may be the most tangible basis for local and state officials’ interest. In addition to the direct salaries associated with building and operating wind projects, the wind energy industry provides indirect jobs and benefits. The optimal planning of a wind farm is one of the most important topics in renewable energy technology. Many methods have been implemented to optimize the cost and output benefit of wind farms, but the contribution of this paper is to consider different types of service providers and also the installation time of wind turbines during the planning horizon years. A genetic algorithm (GA) is used to solve the optimization problem. It is observed that an appropriate wind farm layout can minimize the different types of cost.
Keywords: renewable energy, wind farm, optimization, planning
Procedia PDF Downloads 525
21225 A Fast Community Detection Algorithm
Authors: Chung-Yuan Huang, Yu-Hsiang Fu, Chuen-Tsai Sun
Abstract:
Community detection represents an important data-mining tool for analyzing and understanding real-world complex network structures and functions. We believe that at least four criteria determine the appropriateness of a community detection algorithm: (a) it produces usable normalized mutual information (NMI) and modularity results for social networks, (b) it overcomes the resolution limitation problems associated with synthetic networks, (c) it produces good NMI results and performance efficiency for Lancichinetti-Fortunato-Radicchi (LFR) benchmark networks, and (d) it produces good modularity and performance efficiency for large-scale real-world complex networks. To our knowledge, no existing community detection algorithm meets all four criteria. In this paper, we describe a simple hierarchical arc-merging (HAM) algorithm that uses network topologies and rule-based arc-merging strategies to identify community structures that satisfy the criteria. We used five well-studied social network datasets and eight sets of LFR benchmark networks to validate the ground-truth community correctness of HAM, eight large-scale real-world complex networks to measure its performance efficiency, and two synthetic networks to determine its susceptibility to resolution limitation problems. Our results indicate that the proposed HAM algorithm is capable of providing satisfactory performance efficiency and that HAM-identified communities were close to the ground-truth communities in social and LFR benchmark networks while overcoming resolution limitation problems.
Keywords: complex network, social network, community detection, network hierarchy
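Criterion (a), NMI against ground truth together with modularity, can be checked with standard tooling. The sketch below evaluates a partition of Zachary's karate club network produced by a standard baseline detector; it is purely an illustration of the metrics HAM is judged by, not of HAM itself.

```python
# Sketch of the evaluation metrics used for community detection (criterion a):
# NMI against ground-truth labels and modularity of the detected partition.
# The detected partition here comes from a standard baseline, not from HAM.
import networkx as nx
from networkx.algorithms import community
from sklearn.metrics import normalized_mutual_info_score

G = nx.karate_club_graph()
truth = [G.nodes[n]["club"] for n in G.nodes]          # ground-truth communities

detected = community.greedy_modularity_communities(G)  # baseline detector
labels = {n: i for i, comm in enumerate(detected) for n in comm}
pred = [labels[n] for n in G.nodes]

print("NMI       :", round(normalized_mutual_info_score(truth, pred), 3))
print("modularity:", round(community.modularity(G, detected), 3))
```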
Procedia PDF Downloads 229