Search results for: modified simplex algorithm
5055 Earthquake Relocations and Constraints on the Lateral Velocity Variations along the Gulf of Suez, Using the Modified Joint Hypocenter Method Determination
Authors: Abu Bakr Ahmed Shater
Abstract:
Hypocenters of 250 earthquakes recorded by more than 5 stations of the Egyptian seismic network around the Gulf of Suez were relocated, and the station corrections for the P-wave were estimated, using the modified joint hypocenter determination method. Five stations, TR1, SHR, GRB, ZAF and ZET, have negative station P-wave travel time corrections, with values of -0.235, -0.366, -0.288, -0.366 and -0.058, respectively; it is therefore reasonable to assume that the underground model in their vicinity is characterized by a high-velocity structure. The other stations, TR2, RDS, SUZ, HRG and ZNM, have positive corrections, with values of 0.024, 0.187, 0.314, 0.645 and 0.145, respectively, suggesting that the underground model in their vicinity is characterized by a low-velocity structure. The hypocentral locations determined by the modified joint hypocenter method are more precise than those determined by routine location programs. This method solves simultaneously for the earthquake locations and the station corrections. The station corrections reflect not only the different crustal conditions in the vicinity of the stations but also the difference between the actual and modeled seismic velocities along each earthquake-station ray path. The station corrections obtained correlate with the major surface geological features in the study area. As a result of the relocation, low-velocity areas appear on the northeastern and southwestern sides of the Gulf of Suez, while the southeastern and northwestern parts are high-velocity areas.
Keywords: Gulf of Suez, seismicity, relocation of hypocenter, joint hypocenter determination
Procedia PDF Downloads 359
5054 Reducing Total Harmonic Content of 9-Level Inverter by Use of Cuckoo Algorithm
Authors: Mahmoud Enayati, Sirous Mohammadi
Abstract:
In this paper, a novel procedure to find the firing angles of multilevel inverters and, consequently, to reduce the total harmonic distortion (THD) of the supply voltage is presented. To eliminate more harmonics in a multilevel inverter, either the number of levels can be increased or a pulse width modulation waveform, in which more than one switching occurs per level, can be used. Both options complicate the resulting non-algebraic equations, whose solution cannot be obtained by conventional methods for the numerical solution of nonlinear equations, such as the Newton-Raphson method. In this paper, the Cuckoo algorithm is used to compute the optimal firing angles of the pulse width modulation voltage waveform in the multilevel inverter. These angles should be calculated so that the required voltage amplitude at the fundamental frequency is generated while the total harmonic distortion of the output voltage remains small. The simulation and theoretical results for the 9-level inverter demonstrate the applicability of the proposed algorithm for identifying firing angles that suppress the low-order harmonics and generate a waveform whose total harmonic distortion is very small, i.e., almost sinusoidal.
Keywords: evolutionary algorithms, multilevel inverters, total harmonic content, Cuckoo algorithm
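To illustrate the quantity being minimized, the following is a minimal Python sketch of the THD of a staircase multilevel waveform with quarter-wave symmetry, where the n-th odd harmonic amplitude is proportional to (1/n) times the sum of cos(n*theta_i) over the firing angles. The truncation at the 49th harmonic and the function names are illustrative assumptions, not taken from the paper:

```python
import math

def harmonic(thetas, n):
    # n-th odd harmonic amplitude of a quarter-wave-symmetric
    # staircase waveform, in units of 4*Vdc/pi
    return sum(math.cos(n * t) for t in thetas) / n

def thd(thetas, max_harmonic=49):
    # total harmonic distortion over odd harmonics 3..max_harmonic
    fundamental = harmonic(thetas, 1)
    distortion = math.sqrt(sum(harmonic(thetas, n) ** 2
                               for n in range(3, max_harmonic + 1, 2)))
    return distortion / abs(fundamental)
```

A single firing angle of zero reproduces the square wave, whose THD is roughly 48%; a metaheuristic such as the Cuckoo algorithm would search for the angle vector minimizing `thd` subject to the fundamental-amplitude constraint.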
Procedia PDF Downloads 534
5053 A Case Study of Bee Algorithm for Ready Mixed Concrete Problem
Authors: Wuthichai Wongthatsanekorn, Nuntana Matheekrieangkrai
Abstract:
This research proposes the Bee Algorithm (BA) to optimize the Ready Mixed Concrete (RMC) truck scheduling problem from a single batch plant to multiple construction sites. This problem is an NP-hard constrained combinatorial optimization problem. The paper provides the details of the RMC dispatching process and its related constraints. BA was developed to minimize the total waiting time of RMC trucks while satisfying all constraints. The performance of BA is evaluated on two benchmark problems (3 and 5 construction sites) from previous research. The simulation results of BA are compared in terms of efficiency and accuracy with the Genetic Algorithm (GA), and for all problems the BA approach outperforms GA in obtaining the optimal solution. Hence, the BA approach could be practically implemented to obtain the best schedule.
Keywords: bee colony optimization, ready mixed concrete problem, truck scheduling, multiple construction sites
Procedia PDF Downloads 385
5052 A Hybrid Classical-Quantum Algorithm for Boundary Integral Equations of Scattering Theory
Authors: Damir Latypov
Abstract:
A hybrid classical-quantum algorithm to solve boundary integral equations (BIE) arising in problems of electromagnetic and acoustic scattering is proposed. The quantum speed-up is due to a Quantum Linear System Algorithm (QLSA). The original QLSA of Harrow et al. provides an exponential speed-up over the best-known classical algorithms, but only for sparse systems. Due to the non-local nature of integral operators, however, the matrices arising from the discretization of BIEs are dense. A QLSA for dense matrices was introduced in 2017; its runtime as a function of the system size N is bounded by O(√N polylog(N)), whereas the runtime of the best-known classical algorithm for an arbitrary dense matrix scales as O(N^2.373). Instead of the exponential speed-up available for sparse matrices, here we have only a polynomial one; nevertheless, the sufficiently high power of this polynomial, about 4.7, should make the QLSA an appealing alternative. Unfortunately for the QLSA, the asymptotic separability of the Green's function leads to high compressibility of the BIE matrices. Classical fast algorithms such as the Multilevel Fast Multipole Method (MLFMM) take advantage of this fact and reduce the runtime to O(N log(N)), i.e., the QLSA is only quadratically faster than the MLFMM. To be truly impactful for computational electromagnetics and acoustics engineers, the QLSA must provide a more substantial advantage than that. We propose a computational scheme which combines elements of the classical fast algorithms with the QLSA to achieve the required performance.
Keywords: quantum linear system algorithm, boundary integral equations, dense matrices, electromagnetic scattering theory
Procedia PDF Downloads 156
5051 Isoflavone and Mineral Content in Conventional Commercial Soybean Cultivars and Transgenic Soybean Planted in Minas Gerais, Brazil
Authors: Renata Adriana Labanca, Gabriela Rezende Costa, Nilton de Oliveira Couto e Silva, José Marcos Gontijo Mandarino, Rodrigo Santos Leite, Nilson César Castanheira Guimarães, Roberto Gonçalves Junqueira
Abstract:
The objective of this study was to evaluate the differences in composition between six brands of conventional soybean and six genetically modified (GM) cultivars, all from Minas Gerais State, Brazil, focusing on the isoflavone profile and mineral content and questioning the substantial equivalence between conventional and GM organisms. The compliance statement on the labels of the conventional grains was verified by testing for genetically modified sequences using real-time polymerase chain reaction (PCR). We did not detect the 35S promoter in the commercial samples, indicating the absence of transgene insertion. Mineral analysis was carried out by inductively coupled plasma optical emission spectrometry (ICP-OES), and isoflavone quantification was performed by high performance liquid chromatography (HPLC). The results showed no statistical difference between the conventional and transgenic soybean groups with respect to isoflavone content and mineral composition. The concentration of potassium, the main mineral component of soy, was highest in conventional soybeans, while the GM samples presented the highest concentrations of iron.
Keywords: Glycine max, genetically modified organism, bioactive compounds, ICP-OES, HPLC
Procedia PDF Downloads 458
5050 Comparison of Crossover Types to Obtain Optimal Queries Using Adaptive Genetic Algorithm
Authors: Wafa’ Alma'Aitah, Khaled Almakadmeh
Abstract:
This study presents an information retrieval system that uses a genetic algorithm to increase retrieval efficiency. Using the vector space model, information retrieval is based on the similarity measurement between a query and the documents: documents with high similarity to the query are judged more relevant and should be retrieved first. Using genetic algorithms, each query is represented by a chromosome; these chromosomes are fed through the genetic operators of selection, crossover, and mutation until an optimized query chromosome is obtained for document retrieval. Results show that information retrieval with adaptive crossover probability, single-point crossover, and roulette wheel selection gives the highest recall. The proposed approach is verified using 242 proceedings abstracts collected from the Saudi Arabian national conference.
Keywords: genetic algorithm, information retrieval, optimal queries, crossover
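The genetic operators named in the abstract, roulette wheel selection and single-point crossover, can be sketched as follows in Python; representing a query chromosome as a plain list and passing an explicit random generator are illustrative assumptions:

```python
import random

def roulette_select(population, fitnesses, rng):
    # pick a chromosome with probability proportional to its fitness
    pick = rng.uniform(0, sum(fitnesses))
    acc = 0.0
    for chromosome, fitness in zip(population, fitnesses):
        acc += fitness
        if acc >= pick:
            return chromosome
    return population[-1]

def single_point_crossover(parent_a, parent_b, rng):
    # swap the tails of the two parents at a random cut point
    point = rng.randint(1, len(parent_a) - 1)
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])
```

In an adaptive variant, the crossover probability itself would be adjusted between generations rather than held fixed.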
Procedia PDF Downloads 294
5049 Evaluation of Real-Time Background Subtraction Technique for Moving Object Detection Using Fast-Independent Component Analysis
Authors: Naoum Abderrahmane, Boumehed Meriem, Alshaqaqi Belal
Abstract:
Background subtraction is a widely used technique for detecting moving objects in video surveillance by extracting the foreground objects from a reference background image. A good background subtraction algorithm must cope with many challenges, such as changes in illumination, dynamic backgrounds (swinging leaves, rain, snow), and changes in the background itself, for example, vehicles moving and stopping. In this paper, we propose an efficient and accurate background subtraction method for moving object detection in video surveillance. The main idea is to use a developed fast independent component analysis (ICA) algorithm to separate the background, noise, and foreground masks from an image sequence in practical environments. The fast-ICA algorithm is adapted and adjusted with a matrix calculation and a search for an optimum non-quadratic function to be faster and more robust. Moreover, in order to estimate the de-mixing matrix and the denoising de-mixing matrix parameters, we propose to convert all images to the YCrCb color space, where the luma component Y (the brightness of the color) gives suitable results. The proposed technique has been verified on the publicly available datasets CDnet 2012 and CDnet 2014, and experimental results show that our algorithm can detect moving objects competently and accurately under challenging conditions compared to other methods in the literature, in terms of both quantitative and qualitative evaluations, at a real-time frame rate.
Keywords: background subtraction, moving object detection, fast-ICA, de-mixing matrix
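The YCrCb conversion mentioned above can be written directly from the ITU-R BT.601 coefficients; this per-pixel sketch follows the common OpenCV-style convention with an offset of 128 for the chroma channels, an assumption, since the abstract does not state the exact constants used:

```python
def rgb_to_ycrcb(r, g, b):
    # ITU-R BT.601 luma plus offset chroma channels (0-255 range)
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = (r - y) * 0.713 + 128.0
    cb = (b - y) * 0.564 + 128.0
    return y, cr, cb
```

The luma channel `y` is the component the authors report as giving suitable results for estimating the de-mixing matrices.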
Procedia PDF Downloads 97
5048 Metaheuristic Bat Algorithm in Training of Feed-Forward Neural Network for Stock Price Prediction
Authors: Marjan Golmaryami, Marzieh Behzadi
Abstract:
Recent developments in stock exchanges highlight the need for efficient and accurate methods that help stockholders make better decisions. Since stock markets fluctuate heavily over time and depend on many parameters, it is difficult to make good decisions. The purpose of this study is to employ an artificial neural network (ANN), which can deal with time series data and nonlinear relations among variables, to forecast the next day's stock price. Unlike other evolutionary algorithms previously utilized in stock exchange prediction, we trained our proposed neural network with the metaheuristic bat algorithm, which has fast and powerful convergence, and applied it to stock price prediction for the first time. To demonstrate the performance of the proposed method, this research selected a 7-year dataset of Parsian Bank stocks and, after data preprocessing, used three types of ANN (back propagation-ANN, particle swarm optimization-ANN and bat-ANN) to predict the closing price of the stocks. MATLAB was used to simulate the three types of ANN, with the mean absolute percentage error (MAPE) as the scoring target. The results may be adapted to other companies' stocks as well.
Keywords: artificial neural network (ANN), bat algorithm, particle swarm optimization algorithm (PSO), stock exchange
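The scoring target, mean absolute percentage error, is simple to state; a Python sketch follows (the paper's experiments are in MATLAB, so this is an illustrative translation):

```python
def mape(actual, predicted):
    # mean absolute percentage error, in percent;
    # assumes no actual value is zero
    return 100.0 / len(actual) * sum(
        abs((a - p) / a) for a, p in zip(actual, predicted))
```

A lower MAPE indicates a better fit; the study compares the three trained networks on this score.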
Procedia PDF Downloads 549
5047 Development of Soil Test Kits to Determine Organic Matter, Available Phosphorus and Exchangeable Potassium in Thailand
Authors: Charirat Kusonwiriyawong, Supha Photichan, Wannarut Chutibutr
Abstract:
Soil test kits for rapid analysis of organic matter, available phosphorus and exchangeable potassium were developed to deliver a low-cost field testing kit to farmers, with the objective of providing a decision tool for improving soil fertility. One aspect of the development was ease of use, i.e., the time required to complete the organic matter, available phosphorus and exchangeable potassium tests on one soil sample. The kit requires only two extractions and no filtration, taking approximately 15 minutes per sample. Organic matter was determined principally by oxidizing carbon with KMnO₄ and reading the result against a standard color chart. In addition, a modified single extractant (Mehlich I) was applied to extract available phosphorus and exchangeable potassium. The molybdenum blue method and a turbidimetric method with a standard color chart were adapted to analyze available phosphorus and exchangeable potassium, respectively. Results using the modified single extractant in the soil test kits matched the analytical laboratory results highly significantly (r = 0.959** and 0.945** for available phosphorus and exchangeable potassium, respectively). Linear regressions were calculated between the modified single extractant and the standard laboratory analysis (y = 0.9581x - 12.973 for available phosphorus and y = 0.5372x + 15.283 for exchangeable potassium). These equations were calibrated to formulate fertilizer rate recommendations for specific crops. To validate quality, the soil test kits were distributed to farmers and extension workers. We found that the accuracy of the kits was 71.0%, 63.9% and 65.5% for organic matter, available phosphorus, and exchangeable potassium, respectively. A quantitative survey was also conducted to assess user satisfaction; more than 85% of respondents said these kits were more convenient, economical and reliable than other commercial soil test kits. Based upon the findings of this study, soil test kits can be an alternative for providing soil analysis and fertility recommendations when a soil testing laboratory is not available.
Keywords: available phosphorus, exchangeable potassium, modified single extractant, organic matter, soil test kits
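The reported regressions can be applied directly to convert a kit reading into an estimated laboratory value; interpreting x as the kit reading and y as the laboratory estimate is an assumption about the direction of the calibration:

```python
def lab_phosphorus(kit_reading):
    # regression reported in the study: y = 0.9581x - 12.973
    return 0.9581 * kit_reading - 12.973

def lab_potassium(kit_reading):
    # regression reported in the study: y = 0.5372x + 15.283
    return 0.5372 * kit_reading + 15.283
```

The calibrated values would then be looked up against crop-specific fertilizer rate tables.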
Procedia PDF Downloads 148
5046 The Effect of Particle Porosity in Mixed Matrix Membrane Permeation Models
Authors: Z. Sadeghi, M. R. Omidkhah, M. E. Masoomi
Abstract:
The purpose of this paper is to examine the gas transport behavior of mixed matrix membranes (MMMs) containing porous particles. The main existing models fall into two groups: two-phase (ideal contact) and three-phase (non-ideal contact). A new coefficient, J, was obtained to express equations estimating the effect of particle porosity in the two-phase and three-phase models. The modified models were evaluated against the existing models and experimental data using MATLAB. A comparison of the gas permeability predicted by the proposed modified models with that of the existing models for different MMMs shows a better prediction of gas permeability in MMMs.
Keywords: mixed matrix membrane, permeation models, porous particles, porosity
Procedia PDF Downloads 385
5045 Classification Rule Discovery by Using Parallel Ant Colony Optimization
Authors: Waseem Shahzad, Ayesha Tahir Khan, Hamid Hussain Awan
Abstract:
The Ant-Miner algorithm, which belongs to the family of ACO algorithms, is used to extract knowledge from data in the form of rules. A variant of Ant-Miner named cAnt-MinerPB generates a list of rules using the Pittsburgh approach in order to maintain the interaction among the generated rules. In this paper, we propose a parallel cAnt-MinerPB in which the ant colony optimization algorithm runs in parallel. In this technique, a data set is divided vertically (i.e., by attributes) into different subsets, which are created based on the correlation among attributes as measured by mutual information (MI). Rules are generated in a parallel manner and then merged to form a final list of rules. The results have shown that the proposed technique achieves higher accuracy than the original cAnt-MinerPB and also reduces the execution time.
Keywords: ant colony optimization, parallel Ant-MinerPB, vertical partitioning, classification rule discovery
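The attribute grouping is driven by mutual information; a minimal sketch of MI between two discrete attributes follows (the paper does not give its estimator, so the plug-in frequency estimate below is an assumption):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    # plug-in estimate of I(X;Y) in bits from paired samples
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())
```

Attributes with high mutual information would be placed in the same vertical partition so that correlated attributes are mined together.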
Procedia PDF Downloads 296
5044 Optimization Analysis of a Concentric Tube Heat Exchanger with Field Synergy Principle
Abstract:
This paper investigates the optimization of heat exchanger design, mainly with the response surface method and a genetic algorithm, to explore the relationship between the optimal fluid flow velocity and the temperature of the heat exchanger using the field synergy principle. First, the finite volume method is used to calculate the flow temperature and flow rate distributions for numerical analysis. We identify the most suitable simulation equations by response surface methodology. Furthermore, a genetic algorithm approach is applied to optimize the relationship between the fluid flow velocity and the flow temperature of the heat exchanger. The results show that the field synergy angle plays a vital role in the performance of a real heat exchanger.
Keywords: optimization analysis, field synergy, heat exchanger, genetic algorithm
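The field synergy angle itself is the angle between the local velocity vector and the temperature gradient; a small Python sketch follows (the vector inputs and the degree convention are illustrative choices):

```python
import math

def synergy_angle_deg(velocity, grad_t):
    # angle between the velocity vector and the temperature gradient
    dot = sum(u * g for u, g in zip(velocity, grad_t))
    norm_u = math.sqrt(sum(u * u for u in velocity))
    norm_g = math.sqrt(sum(g * g for g in grad_t))
    return math.degrees(math.acos(dot / (norm_u * norm_g)))
```

In the field synergy principle, a smaller angle between the two fields indicates better synergy and hence enhanced convective heat transfer.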
Procedia PDF Downloads 308
5043 A Novel Breast Cancer Detection Algorithm Using Point Region Growing Segmentation and Pseudo-Zernike Moments
Authors: Aileen F. Wang
Abstract:
Mammography has been one of the most reliable methods for early detection and diagnosis of breast cancer. However, mammography misses about 17% and up to 30% of breast cancers due to the subtle and unstable appearance of breast cancer in its early stages. Recent computer-aided diagnosis (CADx) technology using Zernike moments has improved detection accuracy, but it has several drawbacks: it uses manual segmentation, Zernike moments are not robust, and it still has a relatively high false negative rate (FNR) of 17.6%. This project focuses on the development of a novel breast cancer detection algorithm that automatically segments the breast mass and further reduces the FNR. The algorithm consists of automatic segmentation of a single breast mass using point region growing segmentation, reconstruction of the segmented breast mass using pseudo-Zernike moments, and classification of the breast mass using the root mean square (RMS). A comparative study of the various algorithms for segmentation and reconstruction of breast masses was performed on randomly selected mammographic images. The results demonstrate that the newly developed algorithm is the best in terms of accuracy and cost effectiveness. More importantly, the new RMS classifier has the lowest FNR, 6%.
Keywords: computer aided diagnosis, mammography, point region growing segmentation, pseudo-Zernike moments, root mean square
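The RMS classifier compares the original mass with its pseudo-Zernike reconstruction; as a sketch, the root mean square difference over pixel intensities (flattened lists here for simplicity, an illustrative representation) would be:

```python
import math

def rms_difference(original, reconstructed):
    # root mean square pixel difference between the original mass
    # and its pseudo-Zernike reconstruction
    n = len(original)
    return math.sqrt(sum((o - r) ** 2
                         for o, r in zip(original, reconstructed)) / n)
```

A low RMS means the moment reconstruction captures the mass well; the classification threshold itself would be tuned on labeled data.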
Procedia PDF Downloads 453
5042 Faulty Sensors Detection in Planar Array Antenna Using Pelican Optimization Algorithm
Authors: Shafqat Ullah Khan, Ammar Nasir
Abstract:
The use of planar antenna arrays (PAA) in radar, broadcasting, satellite antennas, and sonar for target detection provides instant beam pattern control. High flexibility and adaptability are achieved by multiple beam steering with a planar array, which is particularly needed in real-life scenarios that require several high-directivity beams. Faulty sensors in planar arrays generate asymmetry, which leads to service degradation, radiation pattern distortion, and increased sidelobe levels. The pelican optimization algorithm (POA), a nature-inspired optimization algorithm, accurately determines faulty sensors within an array, enhancing the reliability and performance of planar array antennas, as shown through extensive simulations and experiments. The analysis was done for different types of faults in 7 x 7 and 8 x 8 planar arrays in MATLAB.
Keywords: planar antenna array, pelican optimization algorithm, faulty sensor, antenna arrays
Procedia PDF Downloads 84
5041 A Problem with IFOC and a New PWM Based 180 Degree Conduction Mode
Authors: Usman Nasir, Minxiao Han, S. M. R. Kazmi
Abstract:
Three-phase inverters in use today are based on field orientation control (FOC) and sine wave PWM (SPWM) techniques, because the 120-degree and 180-degree conduction methods produce high values of THD (total harmonic distortion) in the power system. The indirect field orientation control (IFOC) method is difficult to implement in real systems due to speed sensor accuracy issues. This paper discusses the problem with IFOC and presents a PWM-based 180-degree conduction mode for the three-phase inverter. The modified control method improves THD, and the paper also compares the results obtained using the modified control method with the conventional 180-degree conduction mode.
Keywords: three phase inverters, IFOC, THD, sine wave PWM (SPWM)
Procedia PDF Downloads 428
5040 Sustainable Dyeing of Cotton and Polyester Blend Fabric without Reduction Clearing
Authors: Mohammad Tofayel Ahmed, Seung Kook An
Abstract:
In contemporary research, the focus is increasingly on sustainable products and innovative processes. The global textile industry is putting tremendous effort into achieving economic development and ecological protection concurrently, and the conservation of water sources and the environment has become an immensely significant issue in textile dyeing production. Accordingly, this study develops a process to dye polyester blend cotton without a reduction clearing step or any extra wash-off chemical, through a simple modification aimed at cost reduction and sustainability. A widely used combination, 60/40 cotton/polyester (c/p) single jersey knitted fabric of 30's yarn, 180 g/m², was considered for the study. Traditionally, c/p blend dyeing involves pretreatment followed by polyester part dyeing, reduction clearing, and cotton part dyeing. In this study, however, the polyester part is dyed right away, followed by the pretreatment process and cotton part dyeing, skipping the reduction clearing step entirely. The dyed samples from both the traditional and modified processes were scrutinized through various color fastness tests and dyeing parameters, and through their consumption of water, steam and power, process time, and total batch cost. The modified process showed no necessity for reduction clearing in polyester blend cotton dyeing; the key factor allowing reduction clearing to be avoided after polyester part dyeing is the multifunctional effect of NaOH and H₂O₂ during the cotton pretreatment that follows. The results also revealed that the modified process could reduce the consumption of water, steam, power, time, and cost remarkably. A bulk trial of the modified process demonstrated that it can dye polyester blend cotton substrates while ensuring all fastness and dyeing properties, regardless of dye category, blend ratio, color, and shade percentage, making the process sustainable, eco-friendly and economical. Furthermore, the proposed method could be applicable to any cellulosic blend with polyester.
Keywords: cotton, dyeing, economical, polyester
Procedia PDF Downloads 191
5039 Using LTE-Sim in New Hanover Decision Algorithm for 2-Tier Macrocell-Femtocell LTE Network
Authors: Umar D. M., Aminu A. M., Izaddeen K. Y.
Abstract:
Deployments of mini macrocell base stations, also referred to as femtocells, improve the quality of service of indoor and outdoor users. Nevertheless, mobility management remains a key issue in their deployment. This paper addresses this issue, with an in-depth focus on the most important aspect of mobility management: handover. In handover management, making the handover decision in an LTE two-tier macrocell-femtocell network is a crucial research area. Decision algorithms in this research are classified and comparatively analyzed according to received signal strength, user equipment speed, cost function, and interference. It was observed, however, that most of the discussed decision algorithms fail to consider cell selection with a hybrid access policy in a single macrocell, multiple femtocell scenario, and that a majority of these algorithms do not incorporate a user equipment residence parameter; omitting this parameter boosts the number of unnecessary handovers. To deal with these issues, a sophisticated handover decision algorithm is proposed. The proposed algorithm considers the user's velocity, received signal strength, residence time, and the femtocell base station's access policy. Simulation results have shown that the proposed algorithm reduces the number of unnecessary handovers when compared to a conventional received signal strength-based handover decision algorithm.
Keywords: user equipment, radio signal service, long term evolution, mobility management, handoff
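The decision criteria listed above (access policy, received signal strength with hysteresis, user velocity, and expected residence time) can be sketched as a simple guard chain; all parameter names and thresholds below are illustrative assumptions, not the paper's actual algorithm:

```python
def should_handover(rss_serving, rss_target, hysteresis_db,
                    user_speed, max_femto_speed,
                    expected_residence, min_residence,
                    target_is_accessible):
    # hand over only if the target femtocell is accessible under its
    # access policy, sufficiently stronger than the serving cell, and
    # the user is slow enough to stay for a worthwhile residence time
    if not target_is_accessible:
        return False
    if rss_target < rss_serving + hysteresis_db:
        return False
    if user_speed > max_femto_speed:
        return False
    return expected_residence >= min_residence
```

Each guard prunes a class of unnecessary handovers: the residence check, in particular, is the parameter the survey found missing from most existing algorithms.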
Procedia PDF Downloads 125
5038 Generation of Photo-Mosaic Images through Block Matching and Color Adjustment
Authors: Hae-Yeoun Lee
Abstract:
Mosaic refers to a technique that creates an image by assembling many small pieces in various colours. This paper presents an automatic algorithm that makes a photomosaic image using photos. The algorithm is composed of four steps: partition and feature extraction, block matching, redundancy removal, and colour adjustment. The input image is partitioned into small blocks to extract features. Each block is matched against a database to find a similar photo by computing the Euclidean difference between blocks. The intensity of each block is adjusted to enhance the similarity of the image by replacing its light and dark values with those of the relevant block. Further, the quality of the image is improved by minimizing the redundancy of tiles in adjacent blocks. Experimental results show that the proposed algorithm performs well in both quantitative and qualitative analysis.
Keywords: photomosaic, Euclidean distance, block matching, intensity adjustment
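The block matching step reduces to a nearest-neighbour search under Euclidean distance; a sketch over flattened feature vectors follows (the exact feature representation is an assumption):

```python
import math

def block_distance(block_a, block_b):
    # Euclidean distance between two block feature vectors
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(block_a, block_b)))

def best_match(block, database):
    # index of the database photo most similar to the query block
    return min(range(len(database)),
               key=lambda i: block_distance(block, database[i]))
```

The redundancy removal step would then reject a candidate index if the same photo was already chosen for an adjacent block.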
Procedia PDF Downloads 280
5037 Simple Fabrication of Au (111)-Like Electrode and Its Applications to Electrochemical Determination of Dopamine and Ascorbic Acid
Authors: Zahrah Thamer Althagafi, Mohamed I. Awad
Abstract:
A simple method for the fabrication of an Au (111)-like electrode via controlled reductive desorption of a pre-adsorbed cysteine monolayer on a polycrystalline gold (poly-Au) electrode is introduced. The voltammetric behaviour of dopamine (DA) and ascorbic acid (AA) on the modified electrode is then investigated. Electrochemical characterization of the modified electrode is achieved using cyclic voltammetry and square wave voltammetry. For a binary mixture of DA and AA, the results show that the Au (111)-like electrode exhibits excellent electrocatalytic activity towards the oxidation of DA and AA, allowing highly selective and simultaneous determination of the two. The effect of various experimental parameters on the voltammetric responses of DA and AA was investigated. The enrichment of the Au (111) facet of the poly-Au electrode is thought to be behind the electrocatalytic activity.
Keywords: gold electrode, electroanalysis, electrocatalysis, monolayers, self-assembly, cysteine, dopamine, ascorbic acid
Procedia PDF Downloads 196
5036 Converting Scheduling Time into Calendar Date Considering Non-Interruptible Construction Tasks
Authors: Salman Ali Nisar, Suzuki Koji
Abstract:
In this paper, we develop a new algorithm to convert project scheduling time into calendar dates while handling non-interruptible activities that must not be split by non-working days (such as weekends and holidays). In a construction project, some activities must not be interrupted even on non-working days, or must be finished on the last business day of the week. For example, concrete placing work might be required to be completed by the end of the weekdays, i.e., Friday, with curing over the weekend. This research provides an algorithm that imposes time constraints on the start and finish times of non-interruptible activities. The algorithm converts working days, obtained by the Critical Path Method (CPM), to calendar dates with consideration of the start date of the project. If an activity would be interrupted by non-working days, its start time is postponed, provided there is enough total float; otherwise, its duration is shortened by hiring additional resource capacity and/or using overtime work. Time constraints are then imposed on the start and finish times of the activity. The algorithm is implemented in an Excel spreadsheet, so a feasible, calendared construction schedule can easily be obtained for a construction project with non-interruptible activities.
Keywords: project management, scheduling, critical path method, time constraint, non-interruptible tasks
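The core of the conversion, finding a start date such that a non-interruptible activity is not split by non-working days, can be sketched as a forward search over the calendar; holidays are omitted here for brevity, and the function name is an illustrative assumption:

```python
import datetime

def earliest_uninterrupted_start(earliest, duration_days):
    # first start date on or after `earliest` whose whole duration
    # falls on weekdays (Mon-Fri), so the task is never split
    day = earliest
    while True:
        span = [day + datetime.timedelta(days=d)
                for d in range(duration_days)]
        if all(d.weekday() < 5 for d in span):
            return day
        day += datetime.timedelta(days=1)
```

In the full algorithm, the difference between this date and the CPM start would be charged against the activity's total float before resorting to extra resources or overtime.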
Procedia PDF Downloads 502
5035 Improved Multi-Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation
Authors: Shonak Bansal, Prince Jain, Arun Kumar Singh, Neena Gupta
Abstract:
Nature-inspired algorithms have recently seen widespread use on tough and time-consuming multi-objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One of the major applications of OGRs is as an unequally spaced channel-allocation algorithm in optical wavelength division multiplexing (WDM) systems, in order to minimize the adverse four-wave mixing (FWM) crosstalk effect. The simulation results conclude that the proposed optimization algorithm has superior performance compared to existing conventional computing and nature-inspired optimization algorithms for finding OGRs in terms of ruler length, total optical channel bandwidth and computation time.
Keywords: channel allocation, conventional computing, four-wave mixing, nature-inspired algorithm, optimal Golomb ruler, Lévy flight distribution, optimization, improved multi-objective firefly algorithms, Pareto optimal
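For reference, a set of marks forms a Golomb ruler exactly when all pairwise differences between marks are distinct; the feasibility check that any OGR search must apply is only a few lines:

```python
from itertools import combinations

def is_golomb_ruler(marks):
    # all pairwise differences between marks must be distinct
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))
```

For example, [0, 1, 4, 6] is an optimal 4-mark ruler; an optimizer such as the extended firefly algorithm searches for the shortest ruler of a given order passing this check.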
Procedia PDF Downloads 322
5034 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography
Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya
Abstract:
In today's era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it, whereas steganography hides the data in a cover file so that the presence of communication itself is concealed. This paper presents the implementation of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques performed better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques, which could be used in banks, intelligence agencies such as RAW, etc., where highly confidential data is transferred. Finally, a comparison of the two techniques is also given in tabular form.
Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography
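As a sketch of the steganographic half of the scheme (the paper's MATLAB code is not reproduced here), least-significant-bit embedding in a list of byte values illustrates how ciphertext bits from DES or RSA could be hidden in a cover file:

```python
def embed_lsb(cover_bytes, bits):
    # hide 0/1 bits in the least significant bit of each cover byte
    stego = list(cover_bytes)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit
    return stego

def extract_lsb(stego_bytes, n_bits):
    # recover the hidden bits from the first n_bits bytes
    return [b & 1 for b in stego_bytes[:n_bits]]
```

Each cover byte changes by at most one intensity level, which is why LSB embedding is visually and audibly imperceptible in typical image and audio samples.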
Procedia PDF Downloads 291
5033 Vibration Frequency Analysis of Sandwich Nano-Plate on Visco Pasternak Foundation by Using Modified Couple Stress Theory
Authors: Hamed Khani Arani, Mohammad Shariyat, Armaghan Mohammadian
Abstract:
In this research, the free vibration of a rectangular sandwich nano-plate (SNP) made of three smart layers on a visco-Pasternak foundation is studied. The core of the sandwich is a piezomagnetic nano-plate integrated with two layers of piezoelectric material. First-order shear deformation plate theory is used to derive the equations of motion via Hamilton's principle, piezoelectricity, and modified couple stress theory. The elastic medium is modeled as a visco-Pasternak foundation, and the effect of the damping coefficient on the stability of the sandwich nano-plate is investigated. The equations are solved by the differential quadrature method (DQM) under different boundary conditions. Results indicate the effect of various parameters such as aspect ratio, thickness ratio, shear correction factor, damping coefficient, and boundary conditions on the dimensionless frequency of the sandwich nano-plate. The results are also compared with those available in the literature. These findings can be used in the automotive industry, communications equipment, and active noise, stability, and vibration cancellation systems, and can be utilized for designing magnetostrictive actuators, motors, transducers, and sensors in nano- and micro-scale smart structures.
Keywords: free vibration, modified couple stress theory, sandwich nano-plate, visco-Pasternak foundation
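The DQM step approximates spatial derivatives as weighted sums of function values at grid points. A minimal sketch of the first-order weighting matrix using the standard Lagrange-polynomial formula (an assumption for illustration; the paper's grid and full formulation are not given in the abstract):

```python
def dq_weights(x):
    """First-order differential quadrature weighting matrix on grid x,
    A[i][j] = P(x_i) / ((x_i - x_j) * P(x_j)) for i != j, where
    P(x_i) is the product of (x_i - x_k) over k != i."""
    n = len(x)
    P = [1.0] * n
    for i in range(n):
        for k in range(n):
            if k != i:
                P[i] *= x[i] - x[k]
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i][j] = P[i] / ((x[i] - x[j]) * P[j])
        A[i][i] = -sum(A[i][j] for j in range(n) if j != i)  # row sums to 0
    return A

# Differentiating f(x) = x^2 on a 3-point grid recovers f'(x) = 2x exactly.
x = [0.0, 0.5, 1.0]
f = [v * v for v in x]
df = [sum(a * fj for a, fj in zip(row, f)) for row in dq_weights(x)]
print([round(v, 6) for v in df])   # → [0.0, 1.0, 2.0]
```

Applying such matrices to the motion equations converts them into an algebraic eigenvalue problem for the dimensionless frequencies.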
Procedia PDF Downloads 139
5032 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it is postulated that more than one billion bases will be produced per year by 2020, a growth rate much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data — storing it, searching for information, and finding hidden information — so an analysis platform for genomics big data is required. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one such distributed computing framework and forms the core of Big Data as a Service (BDaaS). Although many services, e.g., Amazon's, have adopted this technology, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
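The MapReduce-with-fuzzy-logic step can be sketched in miniature: a map phase emits fuzzy membership degrees per read, and a reduce phase aggregates them per label. This is a conceptual stand-in, not the authors' Hadoop code; the GC-content membership functions are illustrative assumptions:

```python
from collections import defaultdict

def fuzzy_gc(seq):
    """Fuzzy membership of a read's GC-content in {low, medium, high}
    (triangular membership functions chosen for illustration)."""
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    low = max(0.0, 1 - gc / 0.4)
    medium = max(0.0, 1 - abs(gc - 0.5) / 0.2)
    high = max(0.0, (gc - 0.6) / 0.4)
    return {"low": low, "medium": medium, "high": high}

def map_phase(reads):
    # Each read emits (label, degree) key-value pairs, as a mapper would.
    for seq in reads:
        for label, degree in fuzzy_gc(seq).items():
            yield label, degree

def reduce_phase(pairs):
    # Sum membership degrees per label, as a reducer would.
    totals = defaultdict(float)
    for label, degree in pairs:
        totals[label] += degree
    return dict(totals)

reads = ["ATATAT", "GCGCGC", "ATGCGC"]
print(reduce_phase(map_phase(reads)))
```

In a real Hadoop/BDaaS deployment the map and reduce functions run on distributed workers; the shuffle between them replaces the in-process generator here.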
Procedia PDF Downloads 300
5031 Iterative Method for Lung Tumor Localization in 4D CT
Authors: Sarah K. Hagi, Majdi Alnowaimi
Abstract:
In the last decade, there have been immense advancements in medical imaging modalities, which can now scan the whole lung volume as high-resolution images within a short time. Owing to this performance, physicians can clearly identify the complicated anatomical and pathological structures of the lung. These advancements therefore open large opportunities for improving all available types of lung cancer treatment and will increase the survival rate. However, lung cancer is still one of the major causes of death, accounting for around 19% of all cancer patients. Several factors may affect the survival rate; one serious effect is the breathing process, which can reduce the accuracy of diagnosis and of the lung tumor treatment plan. We have therefore developed a semi-automated algorithm to localize 3D lung tumor positions across all respiratory data during respiratory motion. The algorithm can be divided into two stages. First, the lung tumor is segmented in the first phase of the four-dimensional computed tomography (4D CT) scan using an active contours method. Then, the tumor's 3D position is localized across all subsequent phases using a 12-degree-of-freedom affine transformation. Two data sets were used in this study: a simulated 4D CT based on the extended cardiac-torso (XCAT) phantom, and clinical 4D CT data sets. The error is reported as the root mean square error (RMSE); the average error over the data sets is 0.94 mm ± 0.36. Finally, an evaluation and quantitative comparison of the results with a state-of-the-art registration algorithm is presented. The results obtained from the proposed localization algorithm show promise for localizing a lung tumor in 4D CT data.
Keywords: automated algorithm, computed tomography, lung tumor, tumor localization
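The reported error metric can be written out directly; a minimal RMSE over paired 3D tumor positions (the coordinates below are hypothetical, for illustration only):

```python
import math

def rmse_3d(predicted, ground_truth):
    """Root mean square error over paired 3D tumor positions (mm):
    sqrt(mean of squared Euclidean distances between pairs)."""
    sq = [sum((p - g) ** 2 for p, g in zip(pt, gt))
          for pt, gt in zip(predicted, ground_truth)]
    return math.sqrt(sum(sq) / len(sq))

pred = [(10.2, 5.1, 3.0), (11.0, 5.4, 2.8)]   # localized positions (mm)
truth = [(10.0, 5.0, 3.1), (11.3, 5.2, 3.0)]  # ground-truth positions (mm)
print(round(rmse_3d(pred, truth), 3))   # → 0.339
```

The paper's figure of 0.94 mm ± 0.36 would be this quantity averaged over the phases of each 4D CT data set.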
Procedia PDF Downloads 605
5030 Induction Motor Eccentricity Fault Recognition Using Rotor Slot Harmonic with Stator Current Technique
Authors: Nouredine Benouzza, Ahmed Hamida Boudinar, Azeddine Bendiabdellah
Abstract:
An algorithm for Eccentricity Fault Detection (EFD) applied to a squirrel cage induction machine is proposed in this paper. This algorithm employs spectral analysis of the stator current and localization of the Rotor Slot Harmonic (RSH) frequency to detect eccentricity faults in a three-phase induction machine. Once obtained, the RSH frequency is used as a key parameter in a simple derived expression to directly compute the eccentricity fault frequencies in the induction machine. Experimental tests performed on both a healthy motor and a faulty motor with different eccentricity fault severities illustrate the effectiveness and merits of the proposed EFD algorithm.
Keywords: squirrel cage motor, diagnosis, eccentricity faults, current spectral analysis, rotor slot harmonic
Procedia PDF Downloads 490
5029 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images, whether of irregular, digit, or character shape. Objects and internal objects are quite difficult to extract when the image structure contains a bulk of clusters. The estimation results are easily obtained by identifying the sub-regional objects with the SASK algorithm. The main focus is to recognize the number of internal objects in a given image, in a shadow-free and error-free manner. Hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and finally, application of the hull detection system. Detecting sub-regional hulls can increase machine learning capability in character detection, and the approach can be extended to hull recognition even for irregularly shaped objects, such as black holes in space exploration with their intensities. Layered hulls are those having structured layers inside, which is useful in military services and traffic applications to identify the number of vehicles or persons. The proposed SASK algorithm is thus helpful for identifying such regions and can feed subsequent decision processes (e.g., clearing traffic, or counting opposing persons in a conflict).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
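The final hull-detection step can be illustrated with a standard convex-hull routine (Andrew's monotone chain), used here only as a stand-in, since the SASK internals are not spelled out in the abstract:

```python
def convex_hull(points):
    """Andrew's monotone chain convex hull of 2D points, returned in
    counter-clockwise order starting from the lowest-leftmost point."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means a non-left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h

    lower, upper = half(pts), half(reversed(pts))
    return lower[:-1] + upper[:-1]   # drop duplicated endpoints

# The interior point (1, 1) is excluded from the hull of this square.
print(convex_hull([(0, 0), (2, 0), (1, 1), (2, 2), (0, 2)]))
# → [(0, 0), (2, 0), (2, 2), (0, 2)]
```

After boundary extraction yields candidate contour points per sub-region, a hull routine of this kind delimits each sub-regional object so the objects can be counted.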
Procedia PDF Downloads 350
5028 Adsorptive Performance of Surface Modified Montmorillonite in Vanadium Removal from Real Mine Water
Authors: Opeyemi Atiba-Oyewo, Taile Y. Leswfi, Maurice S. Onyango, Christian Wolkersdorfer
Abstract:
This paper describes the preparation of surface-modified montmorillonite using hexadecyltrimethylammonium bromide (HDTMA-Br) for the removal of vanadium from mine water. The adsorbent, before and after adsorption, was characterized by Fourier transform infrared spectroscopy (FT-IR), X-ray diffraction (XRD), and scanning electron microscopy (SEM), while the amount of vanadium adsorbed was determined by ICP-OES. The batch adsorption method was employed, using vanadium solution concentrations ranging from 50 to 320 mg/L as well as seepage water from vanadium tailings at a South African mine. Solution pH, temperature, and sorbent mass were also varied. Results show that the adsorption capacity was affected by solution pH, temperature, sorbent mass, and the initial concentration. The electrical conductivity of the mine water before and after adsorption was measured to estimate its total dissolved solids. Equilibrium isotherm results revealed that vanadium sorption follows the Freundlich isotherm, indicating that the sorbent surface is heterogeneous. The pseudo-second-order kinetic model gave the best fit to the kinetic data, compared with the first-order and Elovich models. The results of this study may be used to predict the uptake efficiency of South African montmorillonite in view of its application to the removal of vanadium from mine water. The choice of this adsorbent for the uptake of vanadium or other contaminants will, however, depend on the composition of the effluent to be treated.
Keywords: adsorption, vanadium, modified montmorillonite, equilibrium, kinetics, mine water
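The Freundlich fit mentioned above is commonly done via the linearized form ln q = ln K_F + (1/n) ln C; a minimal least-squares sketch (the data below are synthetic, and the paper's fitted parameters are not reproduced):

```python
import math

def fit_freundlich(C_e, q_e):
    """Fit the linearized Freundlich isotherm ln q = ln K_F + (1/n) ln C
    by ordinary least squares; returns (K_F, n)."""
    x = [math.log(c) for c in C_e]
    y = [math.log(q) for q in q_e]
    m = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (m * sxy - sx * sy) / (m * sxx - sx * sx)   # = 1/n
    intercept = (sy - slope * sx) / m                   # = ln K_F
    return math.exp(intercept), 1.0 / slope

# Synthetic equilibrium data generated with K_F = 2, n = 2 (q = 2*C^0.5),
# using concentrations from the paper's 50-320 mg/L range.
C = [50, 100, 200, 320]
q = [2 * c ** 0.5 for c in C]
K_F, n = fit_freundlich(C, q)
print(round(K_F, 3), round(n, 3))   # → 2.0 2.0
```

The same linearization strategy applies to the pseudo-second-order kinetic model, whose form t/q_t = 1/(k q_e^2) + t/q_e is likewise fit by a straight line in t.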
Procedia PDF Downloads 435
5027 Radial Distribution Network Reliability Improvement by Using Imperialist Competitive Algorithm
Authors: Azim Khodadadi, Sahar Sadaat Vakili, Ebrahim Babaei
Abstract:
This study presents a numerical method to optimize the failure rate and repair time of a typical radial distribution system. Failure rate and repair time are effective parameters in customer- and energy-based reliability indices; decreasing these parameters improves the reliability indices and thus boosts system stability. Penalty functions indirectly reflect the investment cost spent on improving these indices. Constraints on the customer- and energy-based indices, i.e., SAIFI, SAIDI, CAIDI, and AENS, have been considered using a new method that reduces the number of controlling parameters of the optimization algorithm. The Imperialist Competitive Algorithm (ICA) is used as the main optimization technique, and particle swarm optimization (PSO), simulated annealing (SA), and differential evolution (DE) have been applied for further investigation. These algorithms have been implemented on a test system in MATLAB and the results compared with each other. The optimized values of repair time and failure rate are much lower than the current values, which reduces the investment cost; moreover, ICA gives better answers than the other algorithms used.
Keywords: imperialist competitive algorithm, failure rate, repair time, radial distribution network
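The customer- and energy-based indices constrained in the optimization follow standard definitions; a minimal sketch with hypothetical load-point data (the paper's test-system values are not reproduced):

```python
def reliability_indices(load_points):
    """Standard customer- and energy-based indices from per-load-point
    data. Each entry: (failure_rate [1/yr], annual_outage_time [h/yr],
    customers, avg_load [kW]). Illustrative sketch only."""
    N = sum(lp[2] for lp in load_points)                       # customers
    saifi = sum(lp[0] * lp[2] for lp in load_points) / N        # int./cust./yr
    saidi = sum(lp[1] * lp[2] for lp in load_points) / N        # h/cust./yr
    caidi = saidi / saifi                                       # h/interruption
    aens = sum(lp[1] * lp[3] for lp in load_points) / N         # kWh/cust./yr
    return saifi, saidi, caidi, aens

# Two hypothetical load points on a radial feeder.
points = [(0.2, 1.5, 200, 50.0), (0.5, 3.0, 100, 80.0)]
saifi, saidi, caidi, aens = reliability_indices(points)
print(round(saifi, 3), round(saidi, 3), round(caidi, 3), round(aens, 3))
# → 0.3 2.0 6.667 1.05
```

In the optimization described above, the decision variables are the per-load-point failure rates and repair times, and these four indices enter as constraints.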
Procedia PDF Downloads 669
5026 An Algorithm for Estimating the Stable Operation Conditions of the Synchronous Motor of the Ore Mill Electric Drive
Authors: M. Baghdasaryan, A. Sukiasyan
Abstract:
An algorithm for estimating the stable operation conditions of the synchronous motor of an ore mill electric drive is proposed. The stable operation conditions of the synchronous motor are derived, taking into account the estimated change of the q angle and the technological factors. The stability condition obtained ensures stable operation of the motor in synchronous mode, accounting for the nonlinear character of the mill loading. The developed algorithm makes it possible to predict the undesirable phenomena arising in the electric drive system, and the obtained stability condition can be successfully applied to the optimal control of the electromechanical system of the mill.
Keywords: electric drive, synchronous motor, ore mill, stability, technological factors
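A textbook proxy for such a condition is the synchronizing power coefficient dP/dδ of a cylindrical-rotor synchronous machine, which must stay positive for steady-state stability. This is an illustrative stand-in only; the paper's condition additionally accounts for the nonlinear mill loading, and all parameter values below are assumptions:

```python
import math

def synchronizing_power_coefficient(E, V, X, delta_deg):
    """dP/d(delta) for P = E*V/X * sin(delta) (per-unit, cylindrical
    rotor). Steady-state stable while the coefficient is positive,
    i.e. while the power angle delta stays below 90 degrees."""
    return E * V / X * math.cos(math.radians(delta_deg))

# Hypothetical per-unit values: EMF E, terminal voltage V, reactance X.
print(synchronizing_power_coefficient(1.2, 1.0, 0.8, 30) > 0)    # True
print(synchronizing_power_coefficient(1.2, 1.0, 0.8, 110) > 0)   # False
```

Monitoring how the power angle drifts under the varying mill load is then a matter of checking that this margin never collapses to zero.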
Procedia PDF Downloads 425