Search results for: management algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 12901

12331 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data

Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou

Abstract:

In this paper, we analyze maximum precipitation data recorded during a particular period of time at different stations of the Global Historical Climatology Network of the USA. One important point to mention is that some stations are shut down on certain days for one reason or another; hence, the maximum values are recorded by excluding those readings. It is assumed that the number of operating stations follows a zero-truncated Poisson random variable and that the daily precipitation follows a lognormal random variable. We call this model the compound truncated Poisson lognormal model. The proposed model has three unknown parameters and can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm. Approximate maximum likelihood estimators are also derived. The associated confidence intervals can be obtained from the observed Fisher information matrix. Simulation experiments have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When the precipitation data set is analyzed using the proposed model, it provides a better fit than some of the existing models.
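
To make the model structure concrete, here is a minimal sketch (not the authors' code): it simulates the maximum of a zero-truncated Poisson number of lognormal observations and checks the empirical distribution against the closed-form CDF implied by the model. The parameter values are illustrative only.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
lam, mu, sigma = 4.0, 1.0, 0.5   # illustrative parameter values

def sample_maximum(size):
    """Draw the maximum of N lognormal values, with N ~ zero-truncated Poisson(lam)."""
    out = np.empty(size)
    for i in range(size):
        n = 0
        while n == 0:                      # rejection sampling for the zero-truncated Poisson
            n = rng.poisson(lam)
        out[i] = np.exp(mu + sigma * rng.standard_normal(n)).max()
    return out

def cdf_maximum(x):
    """Closed-form CDF: P(M <= x) = (exp(lam*Phi(z)) - 1) / (exp(lam) - 1), z = (ln x - mu)/sigma."""
    z = (np.log(x) - mu) / sigma
    return np.expm1(lam * norm.cdf(z)) / np.expm1(lam)

m = sample_maximum(20000)
x = np.quantile(m, [0.1, 0.5, 0.9])
print("empirical CDF:", [round(float((m <= xi).mean()), 3) for xi in x])
print("model CDF:    ", np.round(cdf_maximum(x), 3))
```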

Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution

Procedia PDF Downloads 108
12330 Recursive Parametric Identification of a Doubly Fed Induction Generator-Based Wind Turbine

Authors: A. El Kachani, E. Chakir, A. Ait Laachir, A. Niaaniaa, J. Zerouaoui

Abstract:

This document presents an adaptive controller based on recursive parametric identification, applied to a wind turbine based on the doubly-fed induction generator (DFIG), in order to compensate for faults and guarantee efficient operation of the DFIG. The proposed adaptive controller relies on the recursive least squares algorithm, which takes the best estimator of the parameter vector to be the vector x minimizing a quadratic criterion. Furthermore, this method can improve the speed and precision of the model-based controller. The proposed controller is validated via simulation on a 5.5 kW DFIG-based wind turbine. The results obtained are satisfactory and show the advantages of an adaptive controller based on the recursive least squares algorithm.
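
For readers unfamiliar with the estimator underlying the controller, the following is a minimal textbook recursive least squares update on a synthetic linear system; the forgetting factor, dimensions and noise level are arbitrary and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
true_theta = np.array([2.0, -1.0, 0.5])        # unknown parameter vector to identify
lam = 0.98                                      # forgetting factor
theta = np.zeros(3)                             # running parameter estimate
P = np.eye(3) * 1e3                             # inverse correlation matrix

for _ in range(500):
    x = rng.standard_normal(3)                                # regressor vector
    y = x @ true_theta + 0.05 * rng.standard_normal()         # noisy measurement
    # Standard RLS recursion: gain, prediction error, parameter and covariance updates
    k = P @ x / (lam + x @ P @ x)
    theta = theta + k * (y - x @ theta)
    P = (P - np.outer(k, x @ P)) / lam

print(np.round(theta, 3))   # should be close to true_theta
```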

Keywords: adaptive controller, recursive least squares algorithm, wind turbine, doubly fed induction generator

Procedia PDF Downloads 288
12329 Space Telemetry Anomaly Detection Based On Statistical PCA Algorithm

Authors: Bassem Nassar, Wessam Hussein, Medhat Mokhtar

Abstract:

The crucial concern of satellite operations is to ensure the health and safety of satellites. The worst case in this respect is probably the loss of a mission, but the more common interruptions of satellite functionality can also compromise mission objectives. All data acquired from the spacecraft are known as Telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e. a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously improved in order to reduce the time required to respond to changes in a satellite's state of health. A fast assessment of the current state of the satellite is thus very important in order to respond to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle this problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, this paper presents an unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft: data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions, normal and faulty. The models were built and tested under these conditions, and the results show that the algorithm successfully differentiates between them. Furthermore, the algorithm provides competent information for prediction as well as adding insight and physical interpretation to the ADCS operation.
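
A minimal sketch of the general idea, not the paper's implementation: fit PCA on telemetry collected under normal operation, then flag samples whose reconstruction error (the SPE/Q statistic) exceeds a control limit derived from the normal data. The channel count, component count and injected fault are all invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
normal = rng.standard_normal((1000, 8))                      # stand-in for healthy ADCS telemetry
faulty = normal[:50] + np.array([0, 0, 3, 0, 0, 0, 0, 0])    # injected offset on one channel

pca = PCA(n_components=3).fit(normal)

def q_statistic(X):
    """Squared reconstruction error (SPE / Q statistic) per sample."""
    recon = pca.inverse_transform(pca.transform(X))
    return ((X - recon) ** 2).sum(axis=1)

threshold = np.percentile(q_statistic(normal), 99)           # simple empirical control limit
print("normal samples flagged:", (q_statistic(normal) > threshold).mean())
print("faulty samples flagged:", (q_statistic(faulty) > threshold).mean())
```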

Keywords: space telemetry monitoring, multivariate analysis, PCA algorithm, space operations

Procedia PDF Downloads 415
12328 A Fast Version of the Generalized Multi-Directional Radon Transform

Authors: Ines Elouedi, Atef Hammouda

Abstract:

This paper presents a new fast version of the generalized Multi-Directional Radon Transform. The new method uses the inverse Fast Fourier Transform to compute the generalized Radon projections faster. We prove in this paper that the fast algorithm yields almost the same results as the original one, but at a considerably lower computational cost. The projection result of the fast method is a parameterized Radon space in which a high-valued pixel indicates the detection of a curve in the original image. The proposed fast inversion algorithm leads to an exact reconstruction of the initial image from the Radon space. We show examples of the impact of this algorithm on the pattern recognition domain.
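
To illustrate the projection/reconstruction pipeline the abstract refers to, the sketch below uses the classical Radon transform and its inverse from scikit-image on a standard phantom; it is not the generalized multi-directional transform of the paper, and the image, scale and angle set are arbitrary.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

image = rescale(shepp_logan_phantom(), 0.25)             # small test image
angles = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)

sinogram = radon(image, theta=angles)                    # forward projections (Radon space)
reconstruction = iradon(sinogram, theta=angles)          # filtered back-projection inverse

print("RMS reconstruction error:", round(float(np.sqrt(np.mean((reconstruction - image) ** 2))), 4))
```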

Keywords: fast generalized multi-directional Radon transform, curve, exact reconstruction, pattern recognition

Procedia PDF Downloads 278
12327 Heuristic Search Algorithm (HSA) for Enhancing the Lifetime of Wireless Sensor Networks

Authors: Tripatjot S. Panag, J. S. Dhillon

Abstract:

The lifetime of a wireless sensor network can be effectively increased by using scheduling operations. Once the sensors are randomly deployed, the task at hand is to find the largest number of disjoint sets of sensors such that every set provides complete coverage of the target area. At any instant, only one of these disjoint sets is switched on, while all others are switched off. This paper proposes a heuristic search method to find the maximum number of disjoint sets that completely cover the region. A population of randomly initialized members is made to explore the solution space. A set of heuristics is applied to guide the members towards a possible solution in their neighborhood, and these heuristics accelerate the convergence of the algorithm. The best solution explored by the population is recorded and continuously updated. The proposed algorithm has been tested for applications that require sensing of multiple target points, referred to as point coverage applications. Results show that the proposed algorithm outperforms the existing algorithms: it always finds the optimum solution, and does so with fewer fitness function evaluations than the existing approaches.
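
The objective can be made concrete with a simple greedy baseline (this is not the authors' heuristic search): repeatedly build a complete cover by picking, among the still-unused sensors, the one covering the most uncovered targets, and count how many disjoint covers can be formed. Deployment area, sensor count and sensing range are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(3)
targets = rng.uniform(0, 100, size=(20, 2))              # point-coverage targets
sensors = rng.uniform(0, 100, size=(200, 2))             # randomly deployed sensors
sensing_range = 25.0

# covers[s] = set of target indices that sensor s can observe
dist = np.linalg.norm(sensors[:, None, :] - targets[None, :, :], axis=2)
covers = [set(np.flatnonzero(dist[s] <= sensing_range)) for s in range(len(sensors))]

unused = set(range(len(sensors)))
disjoint_sets = []
while True:
    uncovered, chosen = set(range(len(targets))), []
    while uncovered:
        # pick the unused sensor covering the most still-uncovered targets
        best = max(unused - set(chosen), key=lambda s: len(covers[s] & uncovered), default=None)
        if best is None or not covers[best] & uncovered:
            break
        chosen.append(best)
        uncovered -= covers[best]
    if uncovered:                     # no further complete cover can be formed
        break
    disjoint_sets.append(chosen)
    unused -= set(chosen)

print("disjoint complete covers found:", len(disjoint_sets))
```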

Keywords: coverage, disjoint sets, heuristic, lifetime, scheduling, wireless sensor networks, WSN

Procedia PDF Downloads 452
12326 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers

Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice

Abstract:

The telecommunications sector is undergoing continuous change and development in the global market. In this sector, churn analysis techniques are commonly used for analysing why some customers terminate their service subscriptions prematurely. Customer churn is of utmost significance in this sector since it causes substantial business loss. Many companies carry out research in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed to obtain the feature reducts for predicting customer churn. The framework is a cost-based optional pre-processing stage that removes redundant features for churn management. This cost-based feature selection algorithm is applied to a telecommunication company in Turkey, and the results obtained with the algorithm are reported.

Keywords: churn prediction, data mining, decision-theoretic rough set, feature selection

Procedia PDF Downloads 446
12325 Reducing Total Harmonic Content of 9-Level Inverter by Use of Cuckoo Algorithm

Authors: Mahmoud Enayati, Sirous Mohammadi

Abstract:

In this paper, a novel procedure to find the firing angles of multilevel inverters and, consequently, to reduce the total harmonic distortion (THD) of the supply voltage is presented. In order to eliminate more harmonics in a multilevel inverter, either its number of levels can be increased or a pulse width modulation waveform, in which more than one switching occurs per level, can be used. Both cases complicate the non-algebraic equations, and their solution cannot be obtained by conventional numerical methods for nonlinear equations such as the Newton-Raphson method. In this paper, the Cuckoo algorithm is used to compute the optimal firing angles of the pulse width modulation voltage waveform in the multilevel inverter. These angles should be calculated in such a way that the required fundamental-frequency voltage amplitude is generated while the total harmonic distortion of the output voltage remains small. The simulation and theoretical results for the 9-level inverter demonstrate the high applicability of the proposed algorithm for identifying suitable firing angles that suppress the low-order harmonics and generate a waveform whose total harmonic distortion is very small, i.e., almost a sinusoidal waveform.
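
The optimization target can be illustrated with the standard staircase-waveform harmonic model for a 9-level (four switching angle) inverter; the sketch below minimizes THD with SciPy's differential evolution as a stand-in for the cuckoo algorithm, and the modulation index target and harmonic range are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

n_angles = 4                                   # 9-level inverter -> 4 switching angles
harmonics = np.arange(3, 50, 2)                # odd harmonics counted in the THD
m_target = 0.8                                 # desired per-unit fundamental (illustrative)

def amplitude(n, theta):
    """n-th harmonic amplitude of a quarter-wave-symmetric staircase waveform (per unit of 4*Vdc/pi)."""
    return np.cos(n * theta).sum() / n

def objective(theta):
    theta = np.sort(theta)                     # keep the angles ordered within the quarter period
    h1 = amplitude(1, theta)
    thd = np.sqrt(sum(amplitude(n, theta) ** 2 for n in harmonics)) / abs(h1)
    penalty = 100.0 * abs(h1 / n_angles - m_target)   # track the requested fundamental amplitude
    return thd + penalty

result = differential_evolution(objective, bounds=[(0.01, np.pi / 2 - 0.01)] * n_angles, seed=4)
print("firing angles (deg):", np.round(np.degrees(np.sort(result.x)), 2))
print("objective value:", round(result.fun, 4))
```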

Keywords: evolutionary algorithms, multilevel inverters, total harmonic content, Cuckoo Algorithm

Procedia PDF Downloads 532
12324 A Case Study of Bee Algorithm for Ready Mixed Concrete Problem

Authors: Wuthichai Wongthatsanekorn, Nuntana Matheekrieangkrai

Abstract:

This research proposes the Bee Algorithm (BA) to optimize the Ready Mixed Concrete (RMC) truck scheduling problem from a single batch plant to multiple construction sites. This problem is considered an NP-hard constrained combinatorial optimization problem. This paper provides the details of the RMC dispatching process and its related constraints. BA was then developed to minimize the total waiting time of RMC trucks while satisfying all constraints. The performance of BA is evaluated on two benchmark problems (3 and 5 construction sites) from previous research. The simulation results of BA are compared in terms of efficiency and accuracy with the Genetic Algorithm (GA), and for all problems the BA approach outperforms GA in obtaining the optimal solution. Hence, the BA approach could be practically implemented to obtain the best schedule.

Keywords: bee colony optimization, ready mixed concrete problem, truck scheduling, multiple construction sites

Procedia PDF Downloads 385
12323 A Hybrid Classical-Quantum Algorithm for Boundary Integral Equations of Scattering Theory

Authors: Damir Latypov

Abstract:

A hybrid classical-quantum algorithm to solve boundary integral equations (BIE) arising in problems of electromagnetic and acoustic scattering is proposed. The quantum speed-up is due to a Quantum Linear System Algorithm (QLSA). The original QLSA of Harrow et al. provides an exponential speed-up over the best-known classical algorithms, but only in the case of sparse systems. Due to the non-local nature of integral operators, however, the matrices arising from the discretization of BIEs are dense. A QLSA for dense matrices was introduced in 2017. Its runtime as a function of the system size N is bounded by O(√N polylog(N)), whereas the runtime of the best-known classical algorithm for an arbitrary dense matrix scales as O(N².³⁷³). Instead of the exponential speed-up obtained for sparse matrices, here we have only a polynomial one. Nevertheless, the sufficiently high power of this polynomial, about 4.7, should make the QLSA an appealing alternative. Unfortunately for the QLSA, the asymptotic separability of the Green's function leads to high compressibility of the BIE matrices. Classical fast algorithms such as the Multilevel Fast Multipole Method (MLFMM) take advantage of this fact and reduce the runtime to O(N log(N)), i.e., the QLSA is only quadratically faster than the MLFMM. To be truly impactful for computational electromagnetics and acoustics engineers, the QLSA must provide a more substantial advantage than that. We propose a computational scheme which combines elements of the classical fast algorithms with the QLSA to achieve the required performance.

Keywords: quantum linear system algorithm, boundary integral equations, dense matrices, electromagnetic scattering theory

Procedia PDF Downloads 154
12322 Comparison of Crossover Types to Obtain Optimal Queries Using Adaptive Genetic Algorithm

Authors: Wafa’ Alma'Aitah, Khaled Almakadmeh

Abstract:

This study presents an information retrieval system that uses a genetic algorithm to increase information retrieval efficiency. Using the vector space model, information retrieval is based on the similarity measurement between a query and the documents. Documents with high similarity to the query are judged more relevant and should be retrieved first. Using genetic algorithms, each query is represented by a chromosome; these chromosomes are fed into the genetic operators of selection, crossover, and mutation until an optimized query chromosome is obtained for document retrieval. Results show that information retrieval with adaptive crossover probability, single-point crossover, and roulette-wheel selection gives the highest recall. The proposed approach is verified using 242 proceedings abstracts collected from the Saudi Arabian national conference.
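
A toy sketch of the GA loop described above (roulette-wheel selection, single-point crossover, mutation) on a synthetic vector space collection; it is not the paper's system or data, and the collection sizes, relevance judgments and fitness definition are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n_terms, n_docs = 30, 200
docs = rng.random((n_docs, n_terms))                     # toy document term-weight vectors
relevant = rng.choice(n_docs, size=10, replace=False)    # documents judged relevant to the query

def fitness(query):
    """Mean cosine similarity between the query chromosome and the relevant documents."""
    sims = docs @ query / (np.linalg.norm(docs, axis=1) * np.linalg.norm(query) + 1e-12)
    return sims[relevant].mean()

pop = rng.random((40, n_terms))                          # initial population of query chromosomes
for generation in range(100):
    scores = np.array([fitness(q) for q in pop])
    probs = scores / scores.sum()                        # roulette-wheel selection
    parents = pop[rng.choice(len(pop), size=len(pop), p=probs)]
    children = parents.copy()
    for i in range(0, len(children) - 1, 2):             # single-point crossover
        cut = rng.integers(1, n_terms)
        children[i, cut:], children[i + 1, cut:] = parents[i + 1, cut:].copy(), parents[i, cut:].copy()
    mutate = rng.random(children.shape) < 0.02           # mutation
    children[mutate] = rng.random(mutate.sum())
    pop = children

best = pop[np.argmax([fitness(q) for q in pop])]
print("best fitness:", round(float(fitness(best)), 4))
```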

Keywords: genetic algorithm, information retrieval, optimal queries, crossover

Procedia PDF Downloads 292
12321 Evaluation of Real-Time Background Subtraction Technique for Moving Object Detection Using Fast-Independent Component Analysis

Authors: Naoum Abderrahmane, Boumehed Meriem, Alshaqaqi Belal

Abstract:

Background subtraction is a widely used technique for detecting moving objects in video surveillance by extracting the foreground objects from a reference background image. A good background subtraction algorithm must cope with many challenges, such as changes in illumination, dynamic backgrounds (e.g., swinging leaves, rain, snow) and changes in the background itself, for example vehicles moving and stopping. In this paper, we propose an efficient and accurate background subtraction method for moving object detection in video surveillance. The main idea is to use a developed fast independent component analysis (ICA) algorithm to separate background, noise, and foreground masks from an image sequence in practical environments. The fast-ICA algorithm is adapted and adjusted with a matrix calculation and a search for an optimum non-quadratic function so as to be faster and more robust. Moreover, in order to estimate the de-mixing matrix and the denoising de-mixing matrix parameters, we propose to convert all images to the YCrCb color space, where the luma component Y (brightness of the color) gives suitable results. The proposed technique has been verified on the publicly available datasets CDnet 2012 and CDnet 2014, and experimental results show that our algorithm detects moving objects competently and accurately in challenging conditions compared to other methods in the literature, in terms of quantitative and qualitative evaluations, at a real-time frame rate.

Keywords: background subtraction, moving object detection, fast-ICA, de-mixing matrix

Procedia PDF Downloads 96
12320 Metaheuristic Bat Algorithm in Training of Feed-Forward Neural Network for Stock Price Prediction

Authors: Marjan Golmaryami, Marzieh Behzadi

Abstract:

Recent developments in stock exchanges highlight the need for an efficient and accurate method that helps stockholders make better decisions. Since stock markets fluctuate over time and are affected by many parameters, it is difficult to make good decisions. The purpose of this study is to employ an artificial neural network (ANN), which can deal with time series data and nonlinear relations among variables, to forecast the next day's stock price. Unlike other evolutionary algorithms utilized in stock exchange prediction, we trained our proposed neural network with the metaheuristic bat algorithm, which has fast and powerful convergence, and applied it to stock price prediction for the first time. In order to prove the performance of the proposed method, this research selected a 7-year dataset of Parsian Bank stocks and, after data preprocessing, used 3 types of ANN (back propagation-ANN, particle swarm optimization-ANN and bat-ANN) to predict the closing price of the stocks. Afterwards, this study used MATLAB to simulate the 3 types of ANN, scoring them by the mean absolute percentage error (MAPE). The results may be adapted to other companies' stocks too.
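
A minimal sketch of a standard bat algorithm loop, hedged: the loudness and pulse-rate adaptation of the full algorithm are omitted, and a simple quadratic loss stands in for the network's training error (in the paper's setting the objective would be the ANN's prediction error, e.g. MAPE). All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

def loss(w):
    """Placeholder objective; in the paper's setting this would be the ANN's training error."""
    return float(np.sum((w - 0.5) ** 2))

dim, n_bats, iters = 10, 30, 200
f_min, f_max = 0.0, 2.0                     # pulse frequency range
loudness, pulse_rate = 0.9, 0.5

x = rng.uniform(-1, 1, size=(n_bats, dim))  # bat positions (candidate weight vectors)
v = np.zeros_like(x)                        # bat velocities
scores = np.array([loss(b) for b in x])
best = x[np.argmin(scores)].copy()

for _ in range(iters):
    for i in range(n_bats):
        freq = f_min + (f_max - f_min) * rng.random()
        v[i] += (x[i] - best) * freq
        candidate = x[i] + v[i]
        if rng.random() > pulse_rate:                          # local random walk around the best bat
            candidate = best + 0.01 * rng.standard_normal(dim)
        new_score = loss(candidate)
        if new_score < scores[i] and rng.random() < loudness:  # accept improving moves probabilistically
            x[i], scores[i] = candidate, new_score
        if new_score < loss(best):
            best = candidate.copy()

print("best loss:", round(loss(best), 6))
```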

Keywords: artificial neural network (ANN), bat algorithm, particle swarm optimization algorithm (PSO), stock exchange

Procedia PDF Downloads 548
12319 Classification Rule Discovery by Using Parallel Ant Colony Optimization

Authors: Waseem Shahzad, Ayesha Tahir Khan, Hamid Hussain Awan

Abstract:

The Ant-Miner algorithm, which belongs to the family of ACO algorithms, is used to extract knowledge from data in the form of rules. A variant of the Ant-Miner algorithm named cAnt-MinerPB generates a list of rules using the Pittsburgh approach in order to maintain the interaction among the rules that are generated. In this paper, we propose a parallel cAnt-MinerPB in which the ant colony optimization algorithm runs in parallel. In this technique, a data set is divided vertically (i.e., by attributes) into different subsets. These subsets are created based on the correlation among attributes, measured using Mutual Information (MI). Rules are generated in a parallel manner and then merged to form a final list of rules. The results show that the proposed technique achieves higher accuracy than the original cAnt-MinerPB, and the execution time is also reduced.

Keywords: ant colony optimization, parallel Ant-MinerPB, vertical partitioning, classification rule discovery

Procedia PDF Downloads 295
12318 Optimization Analysis of a Concentric Tube Heat Exchanger with Field Synergy Principle

Authors: M. C. Lin, C. W. Su

Abstract:

The paper investigates optimization analysis for heat exchanger design, mainly with the response surface method and a genetic algorithm, to explore the relationship between the optimal fluid flow velocity and the temperature of the heat exchanger using the field synergy principle. First, the finite volume method is used to calculate the flow temperature and flow rate distributions for the numerical analysis. We identify the most suitable simulation equations by response surface methodology. Furthermore, a genetic algorithm approach is applied to optimize the relationship between fluid flow velocity and flow temperature of the heat exchanger. The results show that the field synergy angle plays a vital role in the performance of a real heat exchanger.

Keywords: optimization analysis, field synergy, heat exchanger, genetic algorithm

Procedia PDF Downloads 307
12317 A Novel Breast Cancer Detection Algorithm Using Point Region Growing Segmentation and Pseudo-Zernike Moments

Authors: Aileen F. Wang

Abstract:

Mammography has been one of the most reliable methods for early detection and diagnosis of breast cancer. However, mammography misses about 17% and up to 30% of breast cancers due to the subtle and unstable appearance of breast cancer in its early stages. Recent computer-aided diagnosis (CADx) technology using Zernike moments has improved detection accuracy. However, it has several drawbacks: it uses manual segmentation, Zernike moments are not robust, and it still has a relatively high false negative rate (FNR) of 17.6%. This project focuses on the development of a novel breast cancer detection algorithm that automatically segments the breast mass and further reduces the FNR. The algorithm consists of automatic segmentation of a single breast mass using point region growing segmentation, reconstruction of the segmented breast mass using pseudo-Zernike moments, and classification of the breast mass using the root mean square (RMS). A comparative study among the various algorithms for the segmentation and reconstruction of breast masses was performed on randomly selected mammographic images. The results demonstrate that the newly developed algorithm is the best in terms of accuracy and cost effectiveness. More importantly, the new RMS classifier has the lowest FNR, at 6%.
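
A minimal sketch of seeded region growing with an intensity tolerance (not the paper's implementation, which grows from a point within the mass on real mammograms); the synthetic image, seed and tolerance below are illustrative only.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tolerance):
    """Grow a region from `seed`, adding 4-connected pixels whose intensity is within
    `tolerance` of the mean intensity of the region grown so far."""
    mask = np.zeros(image.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    total, count = float(image[seed]), 1
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                    and not mask[nr, nc]
                    and abs(float(image[nr, nc]) - total / count) <= tolerance):
                mask[nr, nc] = True
                total, count = total + float(image[nr, nc]), count + 1
                queue.append((nr, nc))
    return mask

# Synthetic example: a bright square "mass" on a darker background
img = np.full((64, 64), 40.0)
img[20:40, 20:40] = 180.0
print("segmented pixels:", int(region_grow(img, seed=(30, 30), tolerance=25.0).sum()))  # expect 400
```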

Keywords: computer aided diagnosis, mammography, point region growing segmentation, pseudo-zernike moments, root mean square

Procedia PDF Downloads 452
12316 Faulty Sensors Detection in Planar Array Antenna Using Pelican Optimization Algorithm

Authors: Shafqat Ullah Khan, Ammar Nasir

Abstract:

Planar antenna arrays (PAA) used in radars, broadcasting, satellite antennas, and sonar for the detection of targets help provide instant beam pattern control. High flexibility and adaptability are achieved by multiple beam steering with a planar array, which is particularly needed in real-life scenarios where several high-directivity beams are required. Faulty sensors in planar arrays generate asymmetry, which leads to service degradation, radiation pattern distortion, and increased sidelobe levels. The Pelican Optimization Algorithm (POA), a nature-inspired optimization algorithm, accurately determines faulty sensors within an array, enhancing the reliability and performance of planar array antennas, as demonstrated through extensive simulations and experiments. The analysis was done for different types of faults in 7 x 7 and 8 x 8 planar arrays in MATLAB.

Keywords: planar antenna array, pelican optimization algorithm, faulty sensor, antenna arrays

Procedia PDF Downloads 79
12315 Working Capital Management Effectiveness

Authors: Asif Iqbal

Abstract:

Working capital management affects both the liquidity and the profitability of a firm. In this research we selected a sample of 100 respondents whose firms are listed on the Karachi Stock Exchange and studied the effect of different variables of working capital management. We find that organizations throughout the world, as well as in Pakistan, have to give immense recognition to working capital management, as it matters from a long-term perspective, especially for shareholders who need firm confidence in the companies for investment purposes.

Keywords: working capital management, Karachi stock exchange, shareholders, capital management

Procedia PDF Downloads 575
12314 Real-Time Data Stream Partitioning over a Sliding Window in Real-Time Spatial Big Data

Authors: Sana Hamdi, Emna Bouazizi, Sami Faiz

Abstract:

In recent years, real-time spatial applications, like location-aware services and traffic monitoring, have become more and more important. Such applications result in dynamic environments where data as well as queries are continuously moving. As a result, a tremendous amount of real-time spatial data is generated every day. The growth of the data volume seems to outpace the advance of our computing infrastructure. For instance, in real-time spatial Big Data, users expect to receive the results of each query within a short time period regardless of the load on the system. But with a huge amount of real-time spatial data being generated, system performance degrades rapidly, especially in overload situations. To solve this problem, we propose the use of data partitioning as an optimization technique. Traditional horizontal and vertical partitioning can increase the performance of the system and simplify data management, but they remain insufficient for real-time spatial Big Data; they cannot deal with real-time and stream queries efficiently. Thus, in this paper, we propose a novel data partitioning approach for real-time spatial Big Data named VPA-RTSBD (Vertical Partitioning Approach for Real-Time Spatial Big Data). This contribution is an implementation of the Matching algorithm for traditional vertical partitioning. We first find the optimal attribute sequence using the Matching algorithm. Then, we propose a new cost model for database partitioning that keeps the data amount of each partition within a balanced limit and provides parallel execution guarantees for the most frequent queries. VPA-RTSBD aims to obtain a real-time partitioning scheme and deals with stream data. It improves the performance of query execution by maximizing the degree of parallel execution. This leads to QoS (Quality of Service) improvement in real-time spatial Big Data, especially with a huge volume of stream data. The performance of our contribution is evaluated via simulation experiments. The results show that the proposed algorithm is both efficient and scalable, and that it outperforms comparable algorithms.

Keywords: real-time spatial big data, quality of service, vertical partitioning, horizontal partitioning, matching algorithm, hamming distance, stream query

Procedia PDF Downloads 157
12313 Redefining Problems and Challenges of Natural Resource Management in Indonesia

Authors: Amalia Zuhra

Abstract:

Indonesia is very rich in natural resources, and managing them is a challenge for the country. Improper management will exhaust the natural resources, and future generations will not be able to enjoy this natural wealth. A good rule of law and its proper implementation determine the success of the management of a country's natural resources. This paper examines the need to redefine the problems and challenges in the management of natural resources in Indonesia in the context of law. The purpose of this article is to give an overview of the latest issues and challenges in natural resource management and to redefine legal provisions related to environmental management and human rights protection, so that the management of natural resources in the present and future will be more sustainable. This paper finds that sustainable management of natural resources is absolutely essential. The aspects of environmental protection and human rights must be elaborated more deeply so that natural resources can be managed to the fullest extent without harming either people or the environment.

Keywords: international environmental law, human rights law, natural resource management, sustainable development

Procedia PDF Downloads 274
12312 Design of Digital IIR Filter Using Opposition Learning and Artificial Bee Colony Algorithm

Authors: J. S. Dhillon, K. K. Dhaliwal

Abstract:

In almost all digital filtering applications, digital infinite impulse response (IIR) filters are preferred over finite impulse response (FIR) filters because they provide much better performance, lower computational cost and smaller memory requirements for similar magnitude specifications. However, digital IIR filters are generally multimodal with respect to the filter coefficients, and therefore reliable methods that can provide globally optimal solutions are required. The artificial bee colony (ABC) algorithm is one such recently introduced meta-heuristic optimization algorithm, but in some cases it searches the solution space insufficiently, resulting in a weak exchange of information, and hence is not able to return better solutions. To overcome this deficiency, the opposition-based learning strategy is incorporated into ABC, and a modified version called the oppositional artificial bee colony (OABC) algorithm is proposed in this paper. Duplication of members is avoided during the run, which also augments the exploration ability. The developed algorithm is then applied to the design of optimal and stable digital IIR filter structures, where the design of low-pass (LP) and high-pass (HP) filters is carried out. Fuzzy theory is applied to maximize satisfaction of the minimum magnitude error and stability constraints. To check the effectiveness of OABC, the results are compared with some well-established filter design techniques, and it is observed that in most cases OABC returns better or at least comparable results.
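
The opposition-based learning idea can be shown in a few lines: each candidate is paired with its opposite point in the search bounds and only the better of the two is kept. The sketch below illustrates this at the initialization step with a placeholder cost function (not the paper's magnitude-error/stability objective); bounds and sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)
lower, upper = -2.0, 2.0
dim, pop_size = 8, 20

def fitness(x):
    """Placeholder cost; in the paper this would be the magnitude-error / stability criterion
    evaluated on the candidate IIR filter coefficients."""
    return float(np.sum(x ** 2))

population = rng.uniform(lower, upper, size=(pop_size, dim))
opposite = lower + upper - population              # opposition-based counterparts
combined = np.vstack([population, opposite])
scores = np.array([fitness(x) for x in combined])
# keep the best pop_size individuals out of the population and its opposite
population = combined[np.argsort(scores)[:pop_size]]
print("best initial cost:", round(scores.min(), 4))
```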

Keywords: digital infinite impulse response filter, artificial bee colony optimization, opposition based learning, digital filter design, multi-parameter optimization

Procedia PDF Downloads 477
12311 Generation of Photo-Mosaic Images through Block Matching and Color Adjustment

Authors: Hae-Yeoun Lee

Abstract:

Mosaic refers to a technique that creates an image by assembling many small pieces of material in various colours. This paper presents an automatic algorithm that makes a photomosaic image using photos. The algorithm is composed of four steps: partition and feature extraction, block matching, redundancy removal and colour adjustment. The input image is partitioned into small blocks from which features are extracted. Each block is matched to a similar photo in the database by comparing the Euclidean difference between blocks. The intensity of the block is adjusted to enhance the similarity of the image by replacing its light and dark values with those of the relevant block. Further, the quality of the image is improved by minimizing the redundancy of tiles in adjacent blocks. Experimental results show that the proposed algorithm performs well in both quantitative and qualitative analysis.
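
A minimal sketch of the partition / block matching / intensity adjustment steps (the redundancy removal step is omitted); the tile "database" is random data and the block size is an assumption, so this only illustrates the structure of the pipeline, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(8)
block = 8
image = rng.integers(0, 256, size=(64, 64, 3)).astype(float)             # stand-in for the input photo
tiles = rng.integers(0, 256, size=(500, block, block, 3)).astype(float)  # stand-in tile database
tile_means = tiles.reshape(len(tiles), -1, 3).mean(axis=1)               # mean colour per tile

mosaic = np.empty_like(image)
for r in range(0, image.shape[0], block):
    for c in range(0, image.shape[1], block):
        patch = image[r:r + block, c:c + block]
        # block matching: nearest tile by Euclidean distance between mean colours
        idx = np.argmin(np.linalg.norm(tile_means - patch.reshape(-1, 3).mean(axis=0), axis=1))
        chosen = tiles[idx]
        # intensity adjustment: shift the tile's brightness toward the block's brightness
        chosen = chosen + (patch.mean() - chosen.mean())
        mosaic[r:r + block, c:c + block] = np.clip(chosen, 0, 255)

print("mosaic shape:", mosaic.shape)
```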

Keywords: photomosaic, Euclidean distance, block matching, intensity adjustment

Procedia PDF Downloads 278
12310 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters

Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu

Abstract:

Anomaly detection is a first and imperative step needed to respond to unexpected problems and to assure high performance and security in large data center management. This paper presents an online anomaly detection system built on an innovative combination of ensemble machine learning and adaptive differentiation algorithms, applied to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm with production traffic data and compare it with traditional anomaly detection approaches such as static thresholds and other deviation-based detection techniques. The experimental results show that our algorithm correctly identifies unexpected performance variances of any running application with an acceptable false positive rate. This approach has already been deployed in real-time production environments to enhance the efficiency and stability of daily data center operations.

Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning

Procedia PDF Downloads 200
12309 Flood Monitoring in the Vietnamese Mekong Delta Using Sentinel-1 SAR with Global Flood Mapper

Authors: Ahmed S. Afifi, Ahmed Magdy

Abstract:

Satellite monitoring is an essential tool to study, understand, and map large-scale environmental changes that affect humans, climate, and biodiversity. The Sentinel-1 Synthetic Aperture Radar (SAR) instrument provides extensive data collection in all weather conditions, with a short revisit time and high spatial resolution, and can be used effectively in flood management. Floods occur when an overflow of water submerges dry land, and flooded areas need to be distinguished from dry ones. In this study, we use the Global Flood Mapper (GFM), a new Google Earth Engine application that allows users to quickly map floods using Sentinel-1 SAR. The GFM enables users to manually adjust the flood map parameters, e.g., the threshold on the Z-value for the VV and VH bands and the elevation and slope mask thresholds. The composite R:G:B image obtained by coupling the Sentinel-1 bands (VH:VV:VH) reduces false classification to a large extent compared to using a single band (e.g., the VH polarization band). The flood mapping algorithm in the GFM and Otsu thresholding are compared with Sentinel-2 optical data, and the results show that the GFM algorithm can overcome the misclassification of a flooded area in An Giang, Vietnam.
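
To illustrate the Otsu-thresholding baseline mentioned above (not the GFM algorithm itself), here is a minimal flood mask on a synthetic VH backscatter array, under the usual assumption that open water appears as low backscatter; the dB values and scene layout are invented.

```python
import numpy as np
from skimage.filters import threshold_otsu

rng = np.random.default_rng(9)
# Synthetic Sentinel-1 VH backscatter (dB): dry land around -12 dB, open water much lower
vh = rng.normal(-12.0, 1.5, size=(200, 200))
vh[60:140, 60:140] = rng.normal(-22.0, 1.0, size=(80, 80))   # "flooded" patch

threshold = threshold_otsu(vh)                 # separates the two backscatter populations
flood_mask = vh < threshold                    # water = low backscatter
print("Otsu threshold (dB):", round(float(threshold), 2))
print("flooded fraction:", round(float(flood_mask.mean()), 3))
```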

Keywords: SAR backscattering, Sentinel-1, flood mapping, disaster

Procedia PDF Downloads 105
12308 Modification of Rk Equation of State for Liquid and Vapor of Ammonia by Genetic Algorithm

Authors: S. Mousavian, F. Mousavian, V. Nikkhah Rashidabad

Abstract:

Cubic equations of state like the Redlich-Kwong (RK) EOS have proved to be very reliable tools in the prediction of phase behavior. Despite their good performance in compositional calculations, they usually suffer from weaknesses in the prediction of saturated liquid density. In this research, the RK equation was modified using a genetic algorithm. The results of this study show that the modified equation is in good agreement with experimental data.

Keywords: equation of state, modification, ammonia, genetic algorithm

Procedia PDF Downloads 382
12307 Improved Multi-Objective Firefly Algorithms to Find Optimal Golomb Ruler Sequences for Optimal Golomb Ruler Channel Allocation

Authors: Shonak Bansal, Prince Jain, Arun Kumar Singh, Neena Gupta

Abstract:

Recently, nature-inspired algorithms have found widespread use in tough and time-consuming multi-objective scientific and engineering design optimization problems. In this paper, we present extended forms of the firefly algorithm to find optimal Golomb ruler (OGR) sequences. One of the major applications of OGRs is as an unequally spaced channel-allocation algorithm in optical wavelength division multiplexing (WDM) systems, in order to minimize the adverse four-wave mixing (FWM) crosstalk effect. The simulation results show that the proposed optimization algorithm has superior performance compared to existing conventional computing and nature-inspired optimization algorithms for finding OGRs, in terms of ruler length, total optical channel bandwidth and computation time.
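
The defining feasibility test inside any OGR search is easy to state in code: a ruler is a Golomb ruler if all pairwise differences between its marks are distinct. The sketch below is only this check (not the firefly search), with a known optimal 5-mark ruler as a sanity case.

```python
from itertools import combinations

def is_golomb_ruler(marks):
    """A ruler is Golomb if all pairwise differences between its marks are distinct."""
    diffs = [b - a for a, b in combinations(sorted(marks), 2)]
    return len(diffs) == len(set(diffs))

print(is_golomb_ruler([0, 1, 4, 9, 11]))   # True: a known optimal 5-mark ruler (length 11)
print(is_golomb_ruler([0, 1, 2, 5]))       # False: the difference 1 repeats
```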

Keywords: channel allocation, conventional computing, four-wave mixing, nature-inspired algorithm, optimal Golomb ruler, Lévy flight distribution, optimization, improved multi-objective firefly algorithms, Pareto optimal

Procedia PDF Downloads 320
12306 Stakeholder Management for Successful Software Projects

Authors: Kassem Saleh

Abstract:

An alarming number of software projects fail to deliver the required functionalities within the provided budget and timeframe and with the required quality. Some of the main reasons for this problem include bad stakeholder management, poor communications and informal change management; informal processes to identify, engage and control stakeholders underlie these failures. Recently, to emphasize its importance, the Project Management Institute (PMI) updated the Project Management Body of Knowledge (PMBoK) to explicitly include the stakeholder management knowledge area. This knowledge area consists of four processes: identifying stakeholders, planning stakeholder management, and managing and controlling stakeholder engagement. The use of appropriate techniques for stakeholder management in software projects will definitely lead to higher quality and more successful software. In this paper, we describe some of the proven techniques that can be used during the execution of the four stakeholder management processes. The development of collaboration tools for automating these processes is recommended; such tools need to be integrated into available software project management tools.

Keywords: project management, stakeholder management, software development, project management body of knowledge

Procedia PDF Downloads 311
12305 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today's era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble the data so that an intruder will not be able to retrieve it; steganography, however, hides the data in some cover file so that the very presence of communication is concealed. This paper presents the implementation of the Rivest, Shamir and Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that these combined techniques perform better than the individual techniques. The risk of unauthorized access is alleviated up to a certain extent by using these techniques, which could be used in banks, RAW agencies, etc., where highly confidential data is transferred. Finally, a comparison of the two techniques is also given in tabular form.
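
The "cover file" idea can be illustrated with a minimal least-significant-bit (LSB) image steganography embed/extract in NumPy; the encryption step is omitted, and in the paper's pipeline the payload bytes would be the DES- or RSA-encrypted data. This is a generic sketch, not the authors' MATLAB implementation.

```python
import numpy as np

def embed(cover, payload: bytes):
    """Hide the payload bits in the least significant bit of each pixel (capacity permitting)."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    assert bits.size <= flat.size, "payload too large for this cover image"
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract(stego, n_bytes):
    """Recover n_bytes of payload from the pixel LSBs."""
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

rng = np.random.default_rng(10)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
secret = b"ciphertext goes here"                # in the paper's scheme, DES- or RSA-encrypted data
stego = embed(cover, secret)
print(extract(stego, len(secret)))              # b'ciphertext goes here'
print("pixels changed:", int((stego != cover).sum()))
```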

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 290
12304 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic

Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi

Abstract:

In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it is postulated that more than one billion bases will be produced per year by 2020. The growth rate of produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, such as storing the data, searching for information, and finding hidden information, so an analysis platform for genomics big data is required. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one of the distributed computing frameworks and lies at the core of Big Data as a Service (BDaaS). Although many services have adopted this technology, e.g., Amazon, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied for handling the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.
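
To show why the MapReduce structure parallelizes naturally over sequencing reads, here is a toy map/reduce k-mer count using Python's multiprocessing; it is not the Hadoop/BDaaS pipeline or the fuzzy-logic step, and the reads, k value and worker count are illustrative.

```python
from collections import Counter
from multiprocessing import Pool

K = 4

def map_kmers(read):
    """Map step: emit k-mer counts for one read."""
    return Counter(read[i:i + K] for i in range(len(read) - K + 1))

def reduce_counts(partials):
    """Reduce step: merge the per-read counts."""
    total = Counter()
    for c in partials:
        total.update(c)
    return total

if __name__ == "__main__":
    reads = ["ACGTACGTGA", "TTACGTACGA", "ACGTTTTACG"]       # toy sequencing reads
    with Pool(processes=2) as pool:
        partial_counts = pool.map(map_kmers, reads)           # map phase, runs in parallel
    kmer_counts = reduce_counts(partial_counts)               # reduce phase
    print(kmer_counts.most_common(3))
```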

Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing

Procedia PDF Downloads 299
12303 Iterative Method for Lung Tumor Localization in 4D CT

Authors: Sarah K. Hagi, Majdi Alnowaimi

Abstract:

In the last decade, there have been immense advancements in medical imaging modalities, which can now scan the whole lung volume in high-resolution images within a short time. With this performance, physicians can clearly identify the complicated anatomical and pathological structures of the lung. These advancements therefore offer large opportunities for advancing all types of available lung cancer treatment and will increase the survival rate. However, lung cancer is still one of the major causes of death, accounting for around 19% of all cancer patients. Several factors may affect the survival rate; one serious effect is the breathing process, which can affect the accuracy of diagnosis and of the lung tumor treatment plan. We have therefore developed a semi-automated algorithm to localize the 3D lung tumor position across all respiratory phases during respiratory motion. The algorithm can be divided into two stages. First, lung tumor segmentation is performed on the first phase of the 4D computed tomography (CT) using an active contours method. Then, the 3D tumor position is localized across all subsequent phases using a 12-degree-of-freedom affine transformation. Two data sets were used in this study: a simulated 4D CT based on the extended cardiac-torso (XCAT) phantom and clinical 4D CT data sets. The error is reported as the root mean square error (RMSE); the average error over the data sets is 0.94 mm ± 0.36. Finally, an evaluation and quantitative comparison of the results with a state-of-the-art registration algorithm is presented. The results obtained from the proposed localization algorithm show promise for localizing a lung tumor in 4D CT data.

Keywords: automated algorithm, computed tomography, lung tumor, tumor localization

Procedia PDF Downloads 602
12302 Induction Motor Eccentricity Fault Recognition Using Rotor Slot Harmonic with Stator Current Technique

Authors: Nouredine Benouzza, Ahmed Hamida Boudinar, Azeddine Bendiabdellah

Abstract:

An algorithm for Eccentricity Fault Detection (EFD) applied to a squirrel cage induction machine is proposed in this paper. The algorithm employs stator current spectral analysis and the localization of the Rotor Slot Harmonic (RSH) frequency to detect eccentricity faults in a three-phase induction machine. Once obtained, the RSH frequency is used as a key parameter in a simple expression to directly compute the eccentricity fault frequencies in the induction machine. Experimental tests performed for both a healthy motor and faulty motors with different eccentricity fault severities illustrate the effectiveness and merits of the proposed EFD algorithm.
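
For orientation, the sketch below evaluates the classical expression from the induction machine diagnostics literature for slot and eccentricity harmonics in the stator current spectrum; it is not necessarily the authors' exact expression, and the machine parameters (supply frequency, rotor bars, pole pairs, slip) are illustrative only.

```python
def slot_harmonic_frequencies(f_s, rotor_slots, pole_pairs, slip, nu=1, k=1):
    """Classical stator-current frequencies around the rotor slot harmonic (RSH):
    f = f_s * [ (k*R +/- n_d) * (1 - slip) / p +/- nu ],
    where n_d = 0 gives the RSH itself and n_d = 1 gives the eccentricity sidebands."""
    out = {}
    for n_d, label in ((0, "RSH"), (1, "eccentricity sidebands")):
        freqs = set()
        for slot_sign in (+1, -1):
            for nu_sign in (+1, -1):
                f = f_s * ((k * rotor_slots + slot_sign * n_d) * (1 - slip) / pole_pairs + nu_sign * nu)
                freqs.add(round(abs(f), 2))
        out[label] = sorted(freqs)
    return out

# Illustrative machine only: 50 Hz supply, 28 rotor bars, 2 pole pairs, 3 % slip
print(slot_harmonic_frequencies(f_s=50.0, rotor_slots=28, pole_pairs=2, slip=0.03))
```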

Keywords: squirrel cage motor, diagnosis, eccentricity faults, current spectral analysis, rotor slot harmonic

Procedia PDF Downloads 487