Search results for: fruit fly optimization algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6694

5044 Second Order Cone Optimization Approach to Two-stage Network DEA

Authors: K. Asanimoghadam, M. Salahi, A. Jamalian

Abstract:

Data envelopment analysis is an approach to measuring the efficiency of decision-making units with multiple inputs and outputs. Many decision-making units also contain decision-making subunits that are not considered in most data envelopment analysis models. Moreover, the inputs and outputs of decision-making units are usually assumed to be desirable, while in some real-world problems the nature of some inputs or outputs is undesirable. In this paper, we study the evaluation of the efficiency of two-stage decision-making units, where some outputs are undesirable, using two non-radial models, the SBM and ASBM models. We formulate the nonlinear ASBM model as a second order cone optimization problem. Finally, we compare the two models under both external and internal evaluation approaches for two real-world examples in the presence of undesirable outputs. The results show that, in both external and internal evaluations, the overall efficiency of the ASBM model is greater than or equal to that of the SBM model, and in internal evaluation, the ASBM model is more flexible than the SBM model.

Keywords: network DEA, conic optimization, undesirable output, SBM

Procedia PDF Downloads 193
5043 Efficient Feature Fusion for Noise Iris in Unconstrained Environment

Authors: Yao-Hong Tsai

Abstract:

This paper presents an efficient fusion algorithm for iris images that generates stable features for recognition in unconstrained environments. Recently, iris recognition systems have focused on real scenarios in daily life without the subject's cooperation. Under large variation in the environment, the objective of this paper is to combine information from multiple images of the same iris. The result of image fusion is a new image that is more stable for subsequent iris recognition than each original noisy iris image. A wavelet-based approach for multi-resolution image fusion is applied in the fusion process. Detection of the iris region is based on the Adaboost algorithm, and a local binary pattern (LBP) histogram is then applied to texture classification with a weighting scheme. Experiments showed that the features generated by the proposed fusion algorithm can improve the performance of a verification system based on iris recognition.
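
As an illustration of the texture-classification step, the sketch below computes a local binary pattern histogram for an iris region with scikit-image; the radius, number of neighbours, and bin count are illustrative assumptions, not the parameters used by the authors.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray_region, n_points=8, radius=1):
    """Compute a normalized uniform-LBP histogram for a grayscale iris region."""
    lbp = local_binary_pattern(gray_region, n_points, radius, method="uniform")
    n_bins = n_points + 2  # uniform patterns plus one bin for non-uniform codes
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

# Example: compare two regions by histogram intersection (a simple weighting scheme).
region_a = np.random.randint(0, 256, (64, 64)).astype(np.uint8)
region_b = np.random.randint(0, 256, (64, 64)).astype(np.uint8)
score = np.minimum(lbp_histogram(region_a), lbp_histogram(region_b)).sum()
print(f"histogram intersection: {score:.3f}")
```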

Keywords: image fusion, iris recognition, local binary pattern, wavelet

Procedia PDF Downloads 366
5042 A Parallel Implementation of k-Means in MATLAB

Authors: Dimitris Varsamis, Christos Talagkozis, Alkiviadis Tsimpiris, Paris Mastorocostas

Abstract:

The aim of this work is a parallel implementation of k-means in MATLAB in order to reduce execution time. Specifically, a new MATLAB function for the serial k-means algorithm is developed that meets all the requirements for conversion to a MATLAB function with parallel computations. Additionally, two different variants for the definition of initial values are presented. Subsequently, the parallel approach is presented. Finally, performance tests of the computation times with respect to the number of features and classes are illustrated.
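
The paper's implementation is in MATLAB; as a language-neutral illustration of the serial algorithm that is being parallelized (the pairwise-distance step is what a parallel version distributes across workers), a minimal NumPy sketch follows. The initialization scheme and iteration count are illustrative assumptions.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Serial k-means; the assignment (distance) step is the natural parallelization target."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]  # random initial centers
    for _ in range(n_iter):
        # Assignment step: distance of every sample to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: recompute each center as the mean of its members.
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

X = np.random.default_rng(1).normal(size=(300, 4))
labels, centers = kmeans(X, k=3)
```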

Keywords: K-means algorithm, clustering, parallel computations, Matlab

Procedia PDF Downloads 383
5041 Oil Pollution Analysis of the Ecuadorian Rainforest Using Remote Sensing Methods

Authors: Juan Heredia, Naci Dilekli

Abstract:

The Ecuadorian Rainforest has been polluted for almost 60 years with little to no oversight, law, or regulation. The consequences have been vast environmental damage, such as pollution and deforestation, as well as sickness and the death of many people and animals. The aim of this paper is to quantify and localize the polluted zones, something that has not been done before and that is the first step toward remediation. To approach this problem, multi-spectral remote sensing imagery was processed with a novel algorithm developed for this study, based on four normalized indices available in the literature. The algorithm classifies pixels as polluted or healthy. The results of this study include a new algorithm for pixel classification and a quantification of the polluted area in the selected image. These results were finally validated against ground control points found in the literature. The main conclusion of this work is that, using hyperspectral images, it is possible to identify polluted vegetation. Future work includes environmental remediation, in-situ tests, and more extensive results that would inform new policymaking.
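
The abstract does not list the four normalized indices, so the sketch below uses a generic normalized-difference index of the NDVI form and a simple threshold as a stand-in for the pixel classifier; the band names and the threshold value are assumptions for illustration only.

```python
import numpy as np

def normalized_difference(band_a, band_b):
    """Generic normalized-difference index, e.g. NDVI = (NIR - Red) / (NIR + Red)."""
    band_a = band_a.astype(float)
    band_b = band_b.astype(float)
    return (band_a - band_b) / (band_a + band_b + 1e-9)

# Hypothetical NIR and red bands of a multispectral scene.
nir = np.random.rand(512, 512)
red = np.random.rand(512, 512)
ndvi = normalized_difference(nir, red)

# Classify pixels: a low vegetation index is flagged as potentially polluted (threshold assumed).
polluted_mask = ndvi < 0.2
print(f"fraction of pixels flagged as polluted: {polluted_mask.mean():.2%}")
```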

Keywords: remote sensing, oil pollution quantification, Amazon forest, hyperspectral remote sensing

Procedia PDF Downloads 161
5040 Biodiversity Interactions Between C3 and C4 Plants under Agroforestry Cropping System

Authors: Ezzat Abd El Lateef

Abstract:

Agroforestry means combining the management of trees with productive agricultural activities. It is of particular interest in semiarid regions, where crop yield gains in agroforestry systems arise from fertility and microclimate improvements but are limited by the strong competition between trees and crops for water and nutrients. This study assessed the agroforestry of some field crops with citrus trees as an approach to establishing biodiversity in fruit tree plantations. Three field crops, i.e., maize, soybean and sunflower, were inter-planted with seedless orange trees (4 x 4 m) or were planted as solid plantings. The results for the trees indicated that a larger fruit yield was obtained when soybean and sunflower were interplanted with citrus. Statistically significant effects (P<0.05) were found for maize grain and biological yields, with higher yields when maize was grown as a solid planting. There were no differences in the yields of soybean and sunflower, which were very similar between the two cropping systems. It is evident from the trials that agroforestry is an efficient concept for increasing biodiversity through the interaction of trees with the interplanted field crop species. Maize, unlike the other crops, was more sensitive to the shade conditions of the agroforestry practice and is not preferred in the biodiversity system. The potential of agroforestry to improve or increase biodiversity is strong because the understorey crops are usually C4 species while the overstorey trees are invariably C3 species. Improvement in the interplanted species is most likely when the understorey crop is a C3 species, since such species are usually light-saturated in the open, so partial shade may have little effect on assimilation and may be accompanied by a concurrent reduction in transpiration. It can be concluded that agroforestry is an efficient concept for increasing biodiversity through the interaction of trees with the interplanted field crop species. Some field crops, like soybean or sunflower, can be employed successfully, while others, like maize, are too shade-sensitive to incorporate into an agroforestry system.

Keywords: agroforestry, field crops, C3 and C4 plants, yield

Procedia PDF Downloads 181
5039 Hedonic Price Analysis of Consumer Preference for Musa spp in Northern Nigeria

Authors: Yakubu Suleiman, S. A. Musa

Abstract:

The research was conducted to determine the physical characteristics of banana fruits that influence consumer preference for the fruit in Northern Nigeria. Socio-economic characteristics of the respondents were also identified. Simple descriptive statistics and a hedonic price model were used to analyze the socio-economic and consumer-preference data, respectively, collected with the aid of 1000 structured questionnaires. The results revealed an R2 value of 0.633, meaning that 63.3% of the variation in the banana price was explained by the explanatory variables included in the model, namely: colour, size, degree of ripeness, softness, surface blemish, cleanliness of the fruits, weight, length, and cluster size of fruits. The remaining 36.7% is attributed to the error term, or random disturbance, in the model. The calculated results also show that the intercept was 1886.5 and statistically significant (P < 0.01), meaning that about N1886.5 worth of banana fruits could be bought by consumers without considering the variables included in the model. Moreover, consumers showed significant preferences for colour, size, degree of ripeness, softness, weight, length and cluster size of banana fruits, significant at P < 0.01, P < 0.05, or P < 0.1. In contrast, consumers did not show significant preferences for surface blemish, cleanliness or variety of the banana fruit, all of which were non-significant with negative signs. Based on the findings of the research, it is recommended that plant breeders and research institutes concentrate on producing banana fruits with the physical characteristics found to be statistically significant, such as cluster size, degree of ripeness, softness, length, size, and skin colour.
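
A hedonic price model regresses price on the attribute variables listed above; the sketch below fits such a model by ordinary least squares with NumPy. The synthetic data, coefficient values, and attribute columns are illustrative assumptions, not the survey data analysed in the paper.

```python
import numpy as np

# Hypothetical attribute matrix: colour, size, ripeness, softness, weight, length, cluster size.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 7))
true_beta = np.array([50, 120, 80, 30, 60, 20, 40], dtype=float)
price = 1886.5 + X @ true_beta + rng.normal(scale=100, size=n)  # intercept mimics the abstract

# Ordinary least squares: add an intercept column and solve the least-squares problem.
X_design = np.column_stack([np.ones(n), X])
beta_hat, _, _, _ = np.linalg.lstsq(X_design, price, rcond=None)

ss_res = np.sum((price - X_design @ beta_hat) ** 2)
ss_tot = np.sum((price - price.mean()) ** 2)
print("intercept:", round(beta_hat[0], 1), "R^2:", round(1 - ss_res / ss_tot, 3))
```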

Keywords: analysis, consumers, preference, variables

Procedia PDF Downloads 341
5038 Multi-Objective Optimization of the Thermal-Hydraulic Behavior for a Sodium Fast Reactor with a Gas Power Conversion System and a Loss of off-Site Power Simulation

Authors: Avent Grange, Frederic Bertrand, Jean-Baptiste Droin, Amandine Marrel, Jean-Henry Ferrasse, Olivier Boutin

Abstract:

CEA and its industrial partners are designing a gas Power Conversion System (PCS) based on a Brayton cycle for the ASTRID Sodium-cooled Fast Reactor. Investigations of control and regulation requirements to operate this PCS during operating, incidental and accidental transients are necessary to adapt core heat removal. To this aim, we developed a methodology to optimize the thermal-hydraulic behavior of the reactor during normal operations, incidents and accidents. This methodology consists of a multi-objective optimization for a specific sequence, whose aim is to increase component lifetime by simultaneously reducing several thermal stresses and to bring the reactor into a stable state. Furthermore, the multi-objective optimization complies with safety and operating constraints. Operating, incidental and accidental sequences use specific regulations to control the thermal-hydraulic reactor behavior; each regulation is defined by a setpoint, a controller and an actuator. In the multi-objective problem, the parameters used to solve the optimization are the setpoints and the settings of the controllers associated with the regulations included in the sequence. In this way, the methodology allows designers to define an optimized and specific control strategy of the plant for the studied sequence and hence to adapt PCS piloting as well as possible. The multi-objective optimization is performed by evolutionary algorithms coupled to surrogate models built on variables computed by the thermal-hydraulic system code, CATHARE2. The methodology is applied to a loss of off-site power sequence. Three variables are controlled: the sodium outlet temperature of the sodium-gas heat exchanger, the turbomachine rotational speed and the water flow through the heat sink. These regulations are chosen in order to minimize thermal stresses on the gas-gas heat exchanger, on the sodium-gas heat exchanger and on the vessel. The main results of this work are optimal setpoints for the three regulations. Moreover, Proportional-Integral-Derivative (PID) controller settings are considered, and efficient actuators used in the controls are chosen based on sensitivity analysis results. Finally, the optimized regulation system and the reactor control procedure provided by the optimization process are verified through a direct CATHARE2 calculation.
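
At the heart of the evolutionary multi-objective step is the notion of non-dominated (Pareto-optimal) candidate settings; the sketch below shows a minimal non-domination filter over candidate objective vectors. All numbers are illustrative and unrelated to the CATHARE2 study.

```python
import numpy as np

def pareto_front(objectives):
    """Return a boolean mask of non-dominated rows (all objectives to be minimized)."""
    n = len(objectives)
    is_efficient = np.ones(n, dtype=bool)
    for i in range(n):
        if not is_efficient[i]:
            continue
        # Point i is dominated if another point is no worse everywhere and better somewhere.
        dominates_i = np.all(objectives <= objectives[i], axis=1) & \
                      np.any(objectives < objectives[i], axis=1)
        if np.any(dominates_i):
            is_efficient[i] = False
    return is_efficient

# Hypothetical candidates: columns = thermal stress on two heat exchangers and on the vessel.
candidates = np.random.default_rng(2).random((50, 3))
front = candidates[pareto_front(candidates)]
print(f"{len(front)} non-dominated candidates out of {len(candidates)}")
```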

Keywords: gas power conversion system, loss of off-site power, multi-objective optimization, regulation, sodium fast reactor, surrogate model

Procedia PDF Downloads 306
5037 Verification & Validation of Map Reduce Program Model for Parallel K-Medoid Algorithm on Hadoop Cluster

Authors: Trapti Sharma, Devesh Kumar Srivastava

Abstract:

This paper is an analytical study of a MapReduce implementation, aiming to verify and validate the MapReduce solution model for a parallel k-medoid algorithm on a Hadoop cluster. MapReduce is a programming model that enables the processing of huge amounts of data in parallel on a large number of devices. It is especially well suited to constant or moderately changing data sets, since the cost of setting up a job is usually high. MapReduce has gradually become the framework of choice for "big data". The MapReduce model allows systematic and rapid processing of large-scale data on a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated several MapReduce applications, namely wordcount, grep, terasort, and a parallel k-medoid clustering algorithm. We found that as the number of nodes increases, the completion time decreases.
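
One of the applications verified in the paper is wordcount; a minimal Python sketch of the map and reduce phases (run locally with a process pool rather than on a Hadoop cluster) is shown below to make the programming model concrete. The chunk contents and pool size are assumptions.

```python
from collections import Counter
from multiprocessing import Pool

def map_phase(text_chunk):
    """Map: emit (word, count) pairs for one chunk of text, pre-aggregated in a Counter."""
    return Counter(text_chunk.lower().split())

def reduce_phase(partial_counts):
    """Reduce: merge the per-chunk counters into a global word count."""
    total = Counter()
    for c in partial_counts:
        total.update(c)
    return total

if __name__ == "__main__":
    chunks = ["map reduce counts words", "reduce merges partial counts", "words words words"]
    with Pool(processes=2) as pool:          # stands in for the cluster's worker nodes
        partials = pool.map(map_phase, chunks)
    print(reduce_phase(partials).most_common(3))
```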

Keywords: hadoop, mapreduce, k-medoid, validation, verification

Procedia PDF Downloads 365
5036 Fingerprint Image Encryption Using a 2D Chaotic Map and Elliptic Curve Cryptography

Authors: D. M. S. Bandara, Yunqi Lei, Ye Luo

Abstract:

Fingerprints are suitable as long-term markers of human identity since they provide detailed and unique individual features that are difficult to alter and durable over a lifetime. In this paper, we propose an algorithm to encrypt and decrypt fingerprint images using a specially designed Elliptic Curve Cryptography (ECC) procedure based on block ciphers. In addition, to increase the confusion effect of fingerprint encryption, we also utilize a chaotic method called the Arnold Cat Map (ACM) for 2D scrambling of the pixel locations. Experiments are carried out with various types of efficiency and security analyses. As a result, we demonstrate that the proposed fingerprint encryption/decryption algorithm is advantageous in several different aspects, including efficiency, security and flexibility. In particular, using this algorithm, we achieve a margin of about 0.1% in the test of Number of Pixel Changing Rate (NPCR) values compared to state-of-the-art performances.
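
A minimal NumPy sketch of the Arnold Cat Map scrambling step used for pixel-location confusion is given below; the image size and number of iterations are illustrative, and the ECC block-cipher stage of the paper is not shown.

```python
import numpy as np

def arnold_cat_map(image, iterations=1):
    """Scramble pixel locations of a square image with the Arnold Cat Map."""
    n = image.shape[0]
    assert image.shape[0] == image.shape[1], "ACM is defined on square images"
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    scrambled = image.copy()
    for _ in range(iterations):
        # (x, y) -> ((x + y) mod n, (x + 2y) mod n): the standard cat-map transform.
        new = np.empty_like(scrambled)
        new[(x + y) % n, (x + 2 * y) % n] = scrambled[x, y]
        scrambled = new
    return scrambled

fingerprint = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
scrambled = arnold_cat_map(fingerprint, iterations=5)
```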

Keywords: arnold cat map, biometric encryption, block cipher, elliptic curve cryptography, fingerprint encryption, Koblitz’s encoding

Procedia PDF Downloads 203
5035 HR MRI CS Based Image Reconstruction

Authors: Krzysztof Malczewski

Abstract:

A Magnetic Resonance Imaging (MRI) reconstruction algorithm using compressed sensing is presented in this paper. It is shown that the proposed approach improves the spatial resolution of MR images in circumstances where highly undersampled k-space trajectories are applied. Compressed Sensing (CS) aims at reconstructing signals and images from significantly fewer measurements than were conventionally assumed necessary. MRI is a fundamental medical imaging method that struggles with an inherently slow data acquisition process. The application of CS to MRI has the potential for significant scan-time reductions, with visible benefits for patients and for health-care economics. In this study, the objective is to combine a super-resolution image enhancement algorithm with the benefits of the CS framework to achieve a high-resolution MR output image. Both methods emphasize maximizing image sparsity in a known sparse transform domain while maintaining data fidelity. The presented algorithm also accounts for cardiac and respiratory movements.
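
As a minimal illustration of the CS reconstruction idea (sparsity-promoting recovery from few measurements), the sketch below runs iterative soft-thresholding (ISTA) on a small synthetic problem; it is a generic CS solver, not the super-resolution algorithm of the paper, and all sizes and parameters are assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam=0.05, n_iter=300):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

rng = np.random.default_rng(0)
A = rng.normal(size=(64, 256)) / np.sqrt(64)   # undersampled measurement operator
x_true = np.zeros(256)
x_true[rng.choice(256, 8, replace=False)] = rng.normal(size=8)
y = A @ x_true
x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```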

Keywords: super-resolution, MRI, compressed sensing, sparse-sense, image enhancement

Procedia PDF Downloads 428
5034 Triangulations via Iterated Largest Angle Bisection

Authors: Yeonjune Kang

Abstract:

A triangulation of a planar region is a partition of that region into triangles. In the finite element method, triangulations are often used as the grid underlying a computation. In order to be suitable as a finite element mesh, a triangulation must have well-shaped triangles, according to criteria that depend on the details of the particular problem. For instance, most methods require that all triangles be small and as close to the equilateral shape as possible. Stated differently, one wants to avoid having either thin or flat triangles in the triangulation. There are many triangulation procedures, a particular one being the longest edge bisection algorithm described below. Starting with a given triangle, locate the midpoint of the longest edge and join it to the opposite vertex of the triangle. Two smaller triangles are formed; apply the same bisection procedure to each of these triangles. Continuing in this manner, after n steps one obtains a triangulation of the initial triangle into 2^n smaller triangles. The longest edge algorithm was first considered in the late 1970s. It was shown by various authors that this triangulation has the desirable properties for the finite element method: independently of the number of iterations, the angles of these triangles cannot get too small; moreover, the size of the triangles decays exponentially. In the present paper we consider a related triangulation algorithm that we refer to as the largest angle bisection procedure. As the name suggests, rather than bisecting the longest edge, at each step we bisect the largest angle. We study the properties of the resulting triangulation and prove that, while the general behavior resembles that of the longest edge bisection algorithm, there are several notable differences as well.
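
For concreteness, a minimal sketch of a single longest-edge bisection step (the classical procedure the paper compares against) is given below; applying it recursively for n steps yields the 2^n triangles described above. The largest-angle variant studied in the paper would instead bisect at the vertex carrying the largest angle.

```python
import numpy as np

def bisect_longest_edge(triangle):
    """Split a triangle (3x2 array of vertices) across the midpoint of its longest edge."""
    tri = np.asarray(triangle, dtype=float)
    # Edge i is the edge opposite vertex i; find the longest one.
    edge_lengths = [np.linalg.norm(tri[(i + 1) % 3] - tri[(i + 2) % 3]) for i in range(3)]
    i = int(np.argmax(edge_lengths))                 # vertex opposite the longest edge
    a, b, c = tri[i], tri[(i + 1) % 3], tri[(i + 2) % 3]
    m = 0.5 * (b + c)                                # midpoint of the longest edge
    return np.array([a, b, m]), np.array([a, m, c])  # the two smaller triangles

t1, t2 = bisect_longest_edge([(0.0, 0.0), (4.0, 0.0), (1.0, 2.0)])
```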

Keywords: angle bisectors, geometry, triangulation, applied mathematics

Procedia PDF Downloads 400
5033 Traditional Drawing, BIM and Erudite Design Process

Authors: Maryam Kalkatechi

Abstract:

Nowadays, parametric design, scientific analysis, and digital fabrication are dominant. Many architectural practices increasingly seek to incorporate advanced digital software and fabrication in their projects. An erudite design process that combines digital and practical aspects within a strong methodological frame resulted from the dissertation research. The digital aspects are the progressive advancements in algorithmic design and simulation software. These aspects have helped firms develop more holistic concepts at the early stage and maintain collaboration among disciplines during the design process. The erudite design process enhances current design processes by encouraging the designer to embed construction and architecture knowledge within the algorithm to achieve successful design processes. The erudite design process also involves the ongoing improvements in applying the new method of 3D printing in construction. This is achieved through 'data-sketches'. The term 'data-sketch' was developed by the author in the recently completed dissertation. It accommodates the decisions of the architect within the algorithm. This paper introduces the erudite design process and its components. It summarizes the application of this process in the development of the '3D printed construction unit'. This paper contributes to bridging academia and practice with advanced technology by presenting a design process that transfers the dominance of the tool to the learned architect and encourages innovation in design processes.

Keywords: erudite, data-sketch, algorithm design in architecture, design process

Procedia PDF Downloads 273
5032 Optimization of Black-Litterman Model for Portfolio Assets Allocation

Authors: A. Hidalgo, A. Desportes, E. Bonin, A. Kadaoui, T. Bouaricha

Abstract:

The present paper is concerned with portfolio management using the Black-Litterman (B-L) model. The stocks considered are limited to large-company stocks on the US market. Results obtained by application of the model are presented. From analysis of the collected Dow Jones stock data, a remarkably explicit analytical expression for the optimal B-L parameter τ, which scales the dispersion of the normal distribution of assets' mean returns, is proposed in terms of the standard deviation of the covariance matrix. The implementation was developed in the Matlab environment to separate the optimization in the Markowitz sense from the elements specific to the B-L representation.
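
As a reference for the role of τ discussed above, a minimal sketch of the standard Black-Litterman posterior-return computation is given below; the toy covariance, views, and τ value are assumptions for illustration and are unrelated to the Dow Jones data set used in the paper.

```python
import numpy as np

def black_litterman_posterior(Sigma, pi, P, Q, Omega, tau):
    """Standard B-L posterior mean of asset returns given investor views (P, Q, Omega)."""
    tS_inv = np.linalg.inv(tau * Sigma)
    middle = np.linalg.inv(tS_inv + P.T @ np.linalg.inv(Omega) @ P)
    return middle @ (tS_inv @ pi + P.T @ np.linalg.inv(Omega) @ Q)

Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])   # toy covariance of two stocks
pi = np.array([0.05, 0.07])                       # equilibrium (prior) returns
P = np.array([[1.0, -1.0]])                       # one relative view: asset 1 outperforms asset 2
Q = np.array([0.02])                              # by 2%
Omega = np.array([[0.0004]])                      # confidence in the view
posterior = black_litterman_posterior(Sigma, pi, P, Q, Omega, tau=0.05)
print(posterior)
```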

Keywords: Black-Litterman, Markowitz, market data, portfolio manager opinion

Procedia PDF Downloads 259
5031 Facial Biometric Privacy Using Visual Cryptography: A Fundamental Approach to Enhance the Security of Facial Biometric Data

Authors: Devika Tanna

Abstract:

'Biometrics' means 'life measurement', but the term is usually associated with the use of unique physiological characteristics to identify an individual. It is important to secure the privacy of digital face images stored in a central database. To impart privacy to such biometric face images, the digital face image is first split into two host face images, each of which gives no indication of the existence of the original face image; each cover image is then stored in a different, geographically separate database. Only when both cover images are simultaneously available can the original image be accessed. This is achieved using the XM2VTS and IMM face databases and an adaptive algorithm for spatial greyscale images. The algorithm helps to select the host images that are most likely to be compatible with the secret image stored in the central database, based on its geometry and appearance. The encryption is done using GEVCS, which results in a reconstructed image identical to the original private image.

Keywords: adaptive algorithm, database, host images, privacy, visual cryptography

Procedia PDF Downloads 129
5030 A Non-Parametric Based Mapping Algorithm for Use in Audio Fingerprinting

Authors: Analise Borg, Paul Micallef

Abstract:

Over the past few years, online multimedia collections have grown at a fast pace. Several companies have shown interest in studying ways to organize this amount of audio information without the need for human intervention to generate metadata. In recent years, many applications have emerged on the market that are capable of identifying a piece of music in a short time. Different audio effects and degradations make it much harder to identify the unknown piece. In this paper, an audio fingerprinting system that makes use of a non-parametric mapping algorithm is presented. Parametric analysis is also performed using Gaussian Mixture Models (GMMs). The feature extraction methods employed are the Mel spectrum coefficients and the MPEG-7 basic descriptors. Bin numbers replaced the extracted feature coefficients during the non-parametric modelling. The results show that the non-parametric analysis offers results comparable to those reported in the literature.
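
The abstract notes that bin numbers replace the extracted feature coefficients in the non-parametric model; the sketch below shows that quantization step with NumPy, alongside a parametric GMM fit for comparison. The bin edges, feature dimensions, and number of mixture components are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 13))          # stand-in for Mel-spectrum / MPEG-7 frame features

# Non-parametric route: replace each coefficient with its bin number.
edges = np.linspace(features.min(), features.max(), num=17)   # 16 bins (assumed)
bin_numbers = np.digitize(features, edges)

# Parametric route for comparison: a Gaussian Mixture Model over the raw coefficients.
gmm = GaussianMixture(n_components=4, covariance_type="diag", random_state=0).fit(features)
print("bin-number matrix shape:", bin_numbers.shape,
      "| GMM average log-likelihood:", gmm.score(features))
```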

Keywords: audio fingerprinting, mapping algorithm, Gaussian Mixture Models, MFCC, MPEG-7

Procedia PDF Downloads 419
5029 Optimization of the Flexural Strength of Biocomposites Samples Reinforced with Resin for Engineering Applications

Authors: Stephen Akong Takim

Abstract:

This study focused on the optimization of the flexural strength of bio-composite samples of palm kernel, whelk, clam and periwinkle shells and bamboo fiber reinforced with resin for engineering applications. The aim of the study was to formulate different samples of resin-reinforced bio-composite for engineering applications and to evaluate the flexural strength of the fabricated composites. The hand lay-up technique was used to produce the composites by incorporating different percentage compositions of the shells/fiber (10%, 15%, 20%, 25% and 30%) into varied proportions of epoxy resin and catalyst. The cured samples, after 24 hours, were subjected to tensile, impact, flexural and water absorption tests. The experiments were designed using the Taguchi L25 (5x5) optimization method, with five design parameters at five levels, in Minitab 18 statistical software. The results showed that the average flexural strength was 114.87 MPa, compared to 72.33 MPa for the unreinforced bio-composite. The study recommends that agricultural waste, like palm kernel shells, whelk shells, clam shells, periwinkle shells and bamboo fiber, be converted into important engineering applications.
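
The Taguchi step ranks parameter levels by signal-to-noise ratio; a minimal sketch of the larger-is-better S/N computation used for strength responses is shown below, with made-up replicate values rather than the study's measurements.

```python
import numpy as np

def sn_larger_is_better(replicates):
    """Taguchi larger-is-better S/N ratio: -10*log10(mean(1/y^2))."""
    y = np.asarray(replicates, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical flexural-strength replicates (MPa) for two factor-level combinations.
run_a = [110.2, 117.5, 114.9]
run_b = [70.1, 74.0, 72.9]
print("S/N run A:", round(sn_larger_is_better(run_a), 2))
print("S/N run B:", round(sn_larger_is_better(run_b), 2))
```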

Keywords: bio-composite, resin, palm kernel shells, whelk shells, periwinkle shells, bamboo fiber, Taguchi techniques, engineering application

Procedia PDF Downloads 73
5028 High Pressure Processing of Jackfruit Bulbs: Effect on Color, Nutrient Profile and Enzyme Inactivation

Authors: Jyoti Kumari, Pavuluri Srinivasa Rao

Abstract:

Jackfruit (Artocarpus heterophyllus L.) is an underutilized yet highly nutritious fruit with a unique flavour, known for its therapeutic and culinary properties. Fresh jackfruit bulbs have a very short shelf life due to high moisture and sugar content, leading to microbial spoilage and enzymatic browning and hindering consumer acceptability and marketability. An attempt has been made to preserve ripe jackfruit bulbs by the application of high pressure (HP) over a range of 200-500 MPa at ambient temperature for dwell times ranging from 5 to 20 min. The physicochemical properties of the jackfruit bulbs, such as pH, TSS, and titratable acidity, were not affected by the pressurization process. The ripening index of the fruit bulbs also decreased following HP treatment. While the ascorbic acid content and antioxidant activity of the jackfruit bulbs were well retained by high pressure processing (HPP), the total phenols and carotenoids showed a slight increase. HPP significantly affected the colour and textural properties of the jackfruit bulbs. High pressure processing was highly effective in reducing the browning index of jackfruit bulbs in comparison to untreated bulbs. The firmness of the bulbs improved with pressure treatment at longer dwell times. Polyphenol oxidase was identified as the most prominent oxidative enzyme in the jackfruit bulb. The enzymatic activities of polyphenol oxidase and peroxidase were significantly reduced, by up to 40%, following treatment at 400 MPa/15 min. HPP of jackfruit bulbs at ambient temperature is shown to be highly beneficial in improving shelf stability, retaining the nutrient profile, colour, and appearance while ensuring maximum inactivation of the spoilage enzymes.

Keywords: antioxidant capacity, ascorbic acid, carotenoids, color, HPP-high pressure processing, jackfruit bulbs, polyphenol oxidase, peroxidase, total phenolic content

Procedia PDF Downloads 172
5027 Efficient Ground Targets Detection Using Compressive Sensing in Ground-Based Synthetic-Aperture Radar (SAR) Images

Authors: Gherbi Nabil

Abstract:

Detection of ground targets in SAR radar images is an important area of radar information processing. In the literature, various algorithms have been discussed in this context; however, most of them have low robustness and accuracy. To this end, we discuss target detection in SAR images based on compressive sensing. Firstly, traditional SAR image target detection algorithms are discussed and their limitations are highlighted. Secondly, a compressive sensing method is proposed based on the sparsity of SAR images. Next, the detection problem is solved using a Multiple Measurement Vector configuration. Furthermore, a robust Alternating Direction Method of Multipliers (ADMM) is developed to solve the optimization problem. Finally, the detection results obtained using raw complex data are presented. Experimental results on real SAR images have verified the effectiveness of the proposed algorithm.

Keywords: compressive sensing, raw complex data, synthetic aperture radar, ADMM

Procedia PDF Downloads 17
5026 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite of lung CT image analysis. Most conventional methods need post-processing to deal with abnormal lung CT scans, such as those containing lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of the two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor transformations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined based on the patch-based similarity measurement instead of the simple intensity measurement of the standard method. The weights between each pixel and its neighboring pixels are based on the obtained new term. The graph is then created using these weights between its nodes. Finally, the segmentation is completed with the min-cut/max-flow algorithm. Experimental results show that the proposed method is accurate and efficient and can directly provide explicit lung regions without any post-processing operations, unlike the standard method.

Keywords: graph cuts, lung CT scan, lung parenchyma segmentation, patch-based similarity metric

Procedia PDF Downloads 167
5025 Interpretation and Clustering Framework for Analyzing ECG Survey Data

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are evaluated and filtered using automated Minnesota codes, and only those ECGs that fulfill the standardized conditions of the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on a discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm together with the fuzzy c-means algorithm. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
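
For the clustering stage, a minimal scikit-learn sketch of spectral clustering on a feature matrix is shown below; the fuzzy c-means coupling and the discernibility-matrix feature selection described in the paper are not reproduced, and the synthetic data, cluster count, and kernel width are assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
# Stand-in for selected ECG features of surveyed subjects (rows = subjects).
X = np.vstack([rng.normal(0, 1, (100, 6)), rng.normal(4, 1, (100, 6))])

model = SpectralClustering(n_clusters=2, affinity="rbf", gamma=0.1, random_state=0)
labels = model.fit_predict(X)
print("cluster sizes:", np.bincount(labels))
```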

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 469
5024 New Segmentation of Piecewise Linear Regression Models Using Reversible Jump MCMC Algorithm

Authors: Suparman

Abstract:

Piecewise linear regression models are very flexible models for modeling data. When piecewise linear regression models are fitted to data, the parameters are generally not known. This paper studies the problem of parameter estimation for piecewise linear regression models. The method used to estimate the parameters is the Bayesian method, but the Bayes estimator cannot be found analytically. To overcome this problem, a reversible jump MCMC algorithm is proposed. The reversible jump MCMC algorithm generates a Markov chain that converges to the limiting distribution, namely the posterior distribution of the parameters of the piecewise linear regression models. The resulting Markov chain is used to calculate the Bayes estimator for the parameters of the piecewise linear regression models.

Keywords: regression, piecewise, Bayesian, reversible Jump MCMC

Procedia PDF Downloads 519
5023 Improving Fused Deposition Modeling Efficiency: A Parameter Optimization Approach

Authors: Wadea Ameen

Abstract:

Rapid prototyping (RP) technology, such as fused deposition modeling (FDM), is gaining popularity because it can produce functioning components with intricate geometric patterns in a reasonable amount of time. A multitude of process variables influences the quality of manufactured parts. In this study, four important process parameters are considered: layer thickness, model interior fill style, support fill style, and orientation. Their influence on three responses, namely build time, model material, and support material, is studied. Experiments are conducted based on a factorial design, and the results are presented.
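
A factorial design simply enumerates every combination of the chosen parameter levels; the short sketch below builds such a design with itertools for the four FDM parameters named above. The level values themselves are illustrative assumptions, not those used in the study.

```python
from itertools import product

# Assumed levels for the four FDM process parameters studied.
layer_thickness = [0.127, 0.254, 0.330]          # mm
interior_fill = ["solid", "sparse"]
support_fill = ["basic", "smart"]
orientation = [0, 45, 90]                        # degrees

design = list(product(layer_thickness, interior_fill, support_fill, orientation))
print(f"{len(design)} runs in the full factorial design")  # 3 * 2 * 2 * 3 = 36
for run in design[:3]:
    print(run)
```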

Keywords: fused deposition modeling, factorial design, optimization, 3D printing

Procedia PDF Downloads 19
5022 Optimization of Machining Parameters by Using Cryogenic Media

Authors: Shafqat Wahab, Waseem Tahir, Manzoor Ahmad, Sarfraz Khan, M. Azam

Abstract:

Optimization and analysis of the tool flank wear width and surface finish of alloy steel rods are studied in the presence of a cryogenic medium (LN2) using a Tungsten Carbide insert (CNMG 120404-WF 4215). The robust design concept of the Taguchi L9(3^4) method and ANOVA are applied to determine the contribution of the key cutting parameters and their optimum conditions. The analysis revealed that the cryogenic impact is more significant in reducing the tool flank wear width, while the surface finish depends mostly on the feed rate.

Keywords: turning, cryogenic fluid, liquid nitrogen, flank wear, surface roughness, taguchi

Procedia PDF Downloads 665
5021 A New Approach of Preprocessing with SVM Optimization Based on PSO for Bearing Fault Diagnosis

Authors: Tawfik Thelaidjia, Salah Chenikher

Abstract:

Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification based on the extracted features. In this paper, feature extraction from faulty bearing vibration signals is performed by combining the signal's kurtosis with features obtained by preprocessing the vibration signal samples using the Db2 discrete wavelet transform at the fifth level of decomposition. In this way, a 7-dimensional feature vector for the vibration signal is obtained. After feature extraction from the vibration signal, a support vector machine (SVM) is applied to automate the fault diagnosis procedure. To improve the classification accuracy of bearing fault prediction, particle swarm optimization (PSO) is employed to simultaneously optimize the SVM kernel function parameter and the penalty parameter. The results have shown the feasibility and effectiveness of the proposed approach.
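
A sketch of the feature-extraction pipeline described above, using PyWavelets for the five-level db2 decomposition, the signal kurtosis, and a scikit-learn SVM, is given below; the subband statistic (energy), the synthetic signals, and the SVM hyperparameters are assumptions, and the PSO tuning step is omitted.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis
from sklearn.svm import SVC

def bearing_features(signal):
    """7-D feature vector: signal kurtosis + one statistic per db2 subband (5-level DWT)."""
    coeffs = pywt.wavedec(signal, "db2", level=5)          # [cA5, cD5, cD4, cD3, cD2, cD1]
    subband_energy = [np.mean(c**2) for c in coeffs]        # assumed subband statistic
    return np.array([kurtosis(signal)] + subband_energy)

rng = np.random.default_rng(0)
healthy = [rng.normal(size=2048) for _ in range(30)]
faulty = [rng.normal(size=2048) + np.sin(np.linspace(0, 400 * np.pi, 2048)) for _ in range(30)]
X = np.array([bearing_features(s) for s in healthy + faulty])
y = np.array([0] * 30 + [1] * 30)

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X, y)    # C and gamma are what PSO would tune
print("training accuracy:", clf.score(X, y))
```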

Keywords: condition monitoring, discrete wavelet transform, fault diagnosis, kurtosis, machine learning, particle swarm optimization, roller bearing, rotating machines, support vector machine, vibration measurement

Procedia PDF Downloads 436
5020 The Reduction of CO2 Emissions Level in Malaysian Transportation Sector: An Optimization Approach

Authors: Siti Indati Mustapa, Hussain Ali Bekhet

Abstract:

The transportation sector represents more than 40% of total energy consumption in Malaysia. The sector is a major user of fossil-based fuels, and it is increasingly being highlighted as the sector contributing least to CO2 emission reduction targets. Considering this fact, this paper attempts to investigate the problem of reducing CO2 emissions using a linear programming approach. An optimization model used to investigate the optimal level of CO2 emission reduction in the road transport sector is presented. In this paper, scenarios are used to demonstrate the emission reduction model: (1) utilising alternative fuels, (2) improving fuel efficiency, (3) removing the fuel subsidy, (4) reducing travel demand, and (5) an optimal scenario. This study finds that fuel balancing can contribute to a reduction of CO2 emissions of up to 3%. Beyond 3% emission reduction, more stringent measures have to be employed, including fuel switching, fuel efficiency improvement, travel demand reduction, and combinations of mitigation measures. The model revealed that CO2 emissions in road transportation can be reduced by 38.3% in the optimal scenario.
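
A toy linear-programming formulation in the spirit of the approach, minimizing CO2 emissions over the shares of several fuel options subject to a demand constraint, is sketched below with SciPy; the emission factors, demand, and availability bounds are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: energy supplied (PJ) by petrol, diesel, natural gas, biofuel.
emission_factor = np.array([69.3, 74.1, 56.1, 39.0])   # assumed kt CO2 per PJ
total_demand = 100.0                                    # assumed PJ of transport energy

# Minimize total emissions subject to meeting demand and per-fuel availability limits.
c = emission_factor
A_eq = np.ones((1, 4))
b_eq = np.array([total_demand])
bounds = [(20, 60), (20, 60), (0, 30), (0, 15)]         # assumed availability of each fuel

result = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("optimal fuel mix (PJ):", np.round(result.x, 1))
print("total emissions (kt CO2):", round(result.fun, 1))
```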

Keywords: CO2 emission, fuel consumption, optimization, linear programming, transportation sector, Malaysia

Procedia PDF Downloads 421
5019 A Memetic Algorithm for an Energy-Costs-Aware Flexible Job-Shop Scheduling Problem

Authors: Christian Böning, Henrik Prinzhorn, Eric C. Hund, Malte Stonis

Abstract:

In this article, the flexible job-shop scheduling problem is extended by consideration of energy costs, which arise owing to the power peak, and further decision variables such as work in process and throughput time are incorporated into the objective function. This enables a production plan to be simultaneously optimized in respect of the actual energy and logistics costs that arise. The energy-costs-aware flexible job-shop scheduling problem (EFJSP) which arises is described mathematically, and a memetic algorithm (MA) is presented as a solution. In the MA, the evolutionary process is supplemented with a local search. Furthermore, repair procedures are used in order to rectify any infeasible solutions that have arisen in the evolutionary process. The potential for lowering the actual costs of a production plan through consideration of energy consumption levels is highlighted.

Keywords: energy costs, flexible job-shop scheduling, memetic algorithm, power peak

Procedia PDF Downloads 343
5018 Inhibition of the Activity of Polyphenol Oxidase Enzyme Present in Annona muricata and Musa acuminata by the Experimentally Identified Natural Anti-Browning Agents

Authors: Michelle Belinda S. Weerawardana, Gobika Thiripuranathar, Priyani A. Paranagama

Abstract:

Most fresh vegetables and fruits available in retail markets undergo a physiological disorder in their appearance and coloration, which discourages consumer purchase. Losses of millions of dollars yearly to the food industry have been due to this pronounced color reaction, called enzymatic browning, which is driven by the catalytic activity of an oxidoreductase enzyme, polyphenol oxidase (PPO). The enzyme oxidizes the phenolic compounds abundantly available in fruits and vegetables as substrates into quinones, which can react with proteins in their surroundings to generate black pigments called melanins, which are highly UV-active compounds. Annona muricata (Katu anoda) and Musa acuminata (Ash plantains) are a fruit and a vegetable widely consumed by Sri Lankans due to their high nutritional value, medicinal properties and economic importance. The objective of the present study was to evaluate and determine effective natural anti-browning inhibitors that could prevent PPO activity in the selected fruit and vegetable. Enzyme extracts from Annona muricata (Katu anoda) and Musa acuminata (Ash plantains) were prepared by homogenizing with analytical grade acetone, and the pH of each enzyme extract was maintained at 7.0 using a phosphate buffer. The inhibitor extracts were prepared using powdered ginger rhizomes and essential oil from the bark of Cinnamomum zeylanicum. Water extracts of ginger were prepared, and the essential oil from Ceylon cinnamon bark was extracted using the steam distillation method. Since the essential oil is not soluble in water, 0.1 µl of cinnamon bark oil was mixed with 0.1 µl of Triton X-100 emulsifier and 5.00 ml of water. The effect of each inhibitor on PPO activity was investigated using catechol (0.1 mol dm-3) as the substrate and the two enzyme extracts prepared. The dosages of the prepared cinnamon bark oil and the two ginger samples used to measure the activity were 0.0035 g/ml, 0.091 g/ml and 0.087 g/ml, respectively. The measurements of the inhibitory activity were obtained at a wavelength of 525 nm using a UV-visible spectrophotometer. The results revealed that the % inhibition observed with cinnamon bark oil and ginger for Annona muricata was 51.97% and 60.90%, respectively. The effects of cinnamon bark oil and ginger extract on the PPO activity of Musa acuminata were 49.51% and 48.10%. The experimental findings thus revealed that Cinnamomum zeylanicum bark oil was a more effective inhibitor of the PPO enzyme present in Musa acuminata, while ginger was more effective against the PPO enzyme present in Annona muricata. Overall, both inhibitors proved effective against the activity of the PPO enzyme present in both samples. These inhibitors can thus be regarded as effective, natural, non-toxic anti-browning extracts which, when added to the above fruit and vegetable, will increase their shelf life and the acceptance of the product by consumers.

Keywords: anti-browning agent, enzymatic browning, inhibitory activity, polyphenol oxidase

Procedia PDF Downloads 274
5017 Bidirectional Dynamic Time Warping Algorithm for the Recognition of Isolated Words Impacted by Transient Noise Pulses

Authors: G. Tamulevičius, A. Serackis, T. Sledevič, D. Navakauskas

Abstract:

We consider one of the biggest challenges in speech recognition: noise reduction. Traditionally, detected transient noise pulses are removed from the corrupted speech using pulse models. In this paper, we propose to cope with the problem directly in the dynamic time warping domain. A bidirectional dynamic time warping algorithm for the recognition of isolated words impacted by transient noise pulses is proposed. It uses a simple transient noise pulse detector, employs bidirectional computation of dynamic time warping, and directly manipulates the warping results. An experimental investigation against several alternative solutions confirms the effectiveness of the proposed algorithm in reducing the impact of noise on the recognition process: a 3.9% increase in noisy speech recognition accuracy is achieved.
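
For reference, a minimal NumPy implementation of standard (unidirectional) dynamic time warping between two feature sequences is given below; the bidirectional computation and noise-pulse handling proposed in the paper build on this basic recursion. The sequence contents are illustrative.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic DTW distance between two 1-D sequences with an absolute-difference local cost."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            # Allowed steps: match, insertion, deletion.
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

reference = np.sin(np.linspace(0, 2 * np.pi, 40))
test = np.sin(np.linspace(0, 2 * np.pi, 55))   # same word, different speaking rate
print("DTW distance:", round(dtw_distance(reference, test), 3))
```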

Keywords: transient noise pulses, noise reduction, dynamic time warping, speech recognition

Procedia PDF Downloads 556
5016 Problem of Services Selection in Ubiquitous Systems

Authors: Malika Yaici, Assia Arab, Betitra Yakouben, Samia Zermani

Abstract:

Ubiquitous computing is nowadays a reality through the networking of a growing number of computing devices. It allows users to be provided with context-aware information and services in a heterogeneous environment, anywhere and anytime. Selecting the best context-aware service among many available services and providers is a tedious problem. In this paper, a service selection method based on the Constraint Satisfaction Problem (CSP) formalism is proposed. The services are considered as variables and domains, while the user context, preferences and provider characteristics are considered as constraints. The backtracking algorithm is used to solve the problem and find the service and provider that best match the user requirements. Although this algorithm has exponential complexity, its use guarantees that the service that best matches the user requirements will be found. A comparison of the proposed method with existing solutions concludes the paper.
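
A minimal backtracking search over a toy service-selection CSP is sketched below: the variables are the required service types, the domains are candidate providers, and the constraints stand in for user context and preferences. All names and constraints are invented for illustration.

```python
def backtrack(assignment, variables, domains, constraints):
    """Classic backtracking: assign variables one by one, undoing choices that violate constraints."""
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        candidate = {**assignment, var: value}
        if all(c(candidate) for c in constraints):
            result = backtrack(candidate, variables, domains, constraints)
            if result is not None:
                return result
    return None  # triggers backtracking in the caller

variables = ["printing", "display"]
domains = {"printing": ["hallway_printer", "office_printer"],
           "display": ["lobby_screen", "meeting_room_screen"]}
constraints = [
    # User context: the user is in the office, so the hallway printer is ruled out.
    lambda a: a.get("printing") != "hallway_printer",
    # Preference: do not pair the office printer with the lobby screen.
    lambda a: not (a.get("printing") == "office_printer" and a.get("display") == "lobby_screen"),
]
print(backtrack({}, variables, domains, constraints))
```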

Keywords: ubiquitous computing, services selection, constraint satisfaction problem, backtrack algorithm

Procedia PDF Downloads 243
5015 Bit Error Rate Monitoring for Automatic Bias Control of Quadrature Amplitude Modulators

Authors: Naji Ali Albakay, Abdulrahman Alothaim, Isa Barshushi

Abstract:

The most common quadrature amplitude modulator (QAM) applies two Mach-Zehnder modulators (MZMs) and one phase shifter to generate high-order modulation formats. The bias of an MZM changes over time due to temperature, vibration, and aging. This change in biasing distorts the generated QAM signal, which leads to a deterioration of bit error rate (BER) performance. Therefore, it is critical to be able to lock the MZM's Q point to the required operating point for good performance. We propose a technique for automatic bias control (ABC) of a QAM transmitter using BER measurements and a gradient descent optimization algorithm. The proposed technique is attractive because it uses the pertinent metric, BER, which compensates for bias drifting independently of other system variations such as laser source output power. The performance and operating principles of the proposed scheme are simulated using OptiSystem simulation software for 4-QAM and 16-QAM transmitters.

Keywords: automatic bias control, optical fiber communication, optical modulation, optical devices

Procedia PDF Downloads 186