Search results for: robust optimization
316 Design Standardization in Aramco: Strategic Analysis
Authors: Mujahid S. Alharbi
Abstract:
The construction of process plants in oil and gas-producing countries, such as Saudi Arabia, necessitates substantial investment in design and building. Each new plant, while unique, includes common building types, suggesting an opportunity for design standardization. This study investigates the adoption of standardized Issue for Construction (IFC) packages for non-process buildings in Saudi Aramco. A SWOT analysis presents the strengths, weaknesses, opportunities, and threats of this approach. The approach's benefits are illustrated using the Hawiyah Unayzah Gas Reservoir Storage Program (HUGRSP) as a case study. Standardization not only offers significant cost savings and operational efficiencies, but also expedites project timelines, reduces the potential for change orders, and fosters local economic growth by allocating building tasks to local contractors. Standardization also improves project management by easing interface constraints between different contractors and promoting adaptability to future industry changes. This research underscores the standardization of non-process buildings as a powerful strategy for cost optimization, efficiency enhancement, and local economic development in process plant construction within the oil and gas sector.
Keywords: Building, construction, management, project, standardization.
315 A Kernel Based Rejection Method for Supervised Classification
Authors: Abdenour Bounsiar, Edith Grall, Pierre Beauseroy
Abstract:
In this paper we are interested in classification problems with a performance constraint on error probability. In such problems, if the constraint cannot be satisfied, a rejection option is introduced. For binary labelled classification, a number of SVM-based methods with a rejection option have been proposed over the past few years. All of these methods use two thresholds on the SVM output. However, in previous works, we have shown on synthetic data that using thresholds on the output of the optimal SVM may lead to poor results for classification tasks with a performance constraint. In this paper a new method for supervised classification with a rejection option is proposed. It consists of two different classifiers jointly optimized to minimize the rejection probability subject to a given constraint on the error rate. This method uses a new kernel-based linear learning machine that we have recently presented. This learning machine is characterized by its simplicity and high training speed, which makes the simultaneous optimization of the two classifiers computationally reasonable. The proposed classification method with rejection option is compared to an SVM-based rejection method proposed in the recent literature. Experiments show the superiority of the proposed method.
Keywords: Rejection, Chow's rule, error-reject tradeoff, Support Vector Machine.
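As a rough illustration of the two-threshold rejection rule that this abstract takes as its baseline (not the authors' jointly optimized classifiers), the sketch below trains an ordinary SVM and rejects samples whose decision value falls inside an assumed band; the thresholds t_lo and t_hi are illustrative and would be tuned against the error constraint in practice.

```python
# Minimal sketch of an SVM with a reject band (baseline idea, not the paper's method).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
scores = clf.decision_function(X_te)

t_lo, t_hi = -0.5, 0.5            # assumed reject-band thresholds
reject = (scores > t_lo) & (scores < t_hi)
pred = (scores >= t_hi).astype(int)

accepted = ~reject
error_rate = np.mean(pred[accepted] != y_te[accepted]) if accepted.any() else 0.0
print(f"reject rate = {reject.mean():.2f}, error on accepted = {error_rate:.2f}")
```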
314 Object Detection in Digital Images under Non-Standardized Conditions Using Illumination and Shadow Filtering
Authors: Waqqas-ur-Rehman Butt, Martin Servin, Marion Pause
Abstract:
In recent years, object detection has gained much attention and become a very encouraging research area in the field of computer vision. Robust detection of object boundaries in an image is demanded in numerous applications of human-computer interaction and automated surveillance systems. Many methods and approaches have been developed for automatic object detection in various fields, such as automotive, quality control management and environmental services. Unfortunately, to the best of our knowledge, object detection under illumination with shadow consideration has not been well solved yet. Furthermore, this problem is also one of the major hurdles keeping object detection methods from practical application. This paper presents an approach to automatic object detection in images under non-standardized environmental conditions. A key challenge is how to detect the object, particularly under uneven illumination conditions. Because image capturing conditions vary, the algorithms need to consider a variety of possible environmental factors, as colour information, lighting and shadows vary from image to image. Existing methods mostly fail to produce appropriate results due to variation in colour information, lighting effects, threshold specifications, histogram dependencies and colour ranges. To overcome these limitations, we propose an object detection algorithm with pre-processing methods to reduce the interference caused by shadow and illumination effects without fixed parameters. We use the YCrCb colour model without any specific colour ranges or predefined threshold values. The segmented object regions are further classified using morphological operations (erosion and dilation) and contours. The proposed approach was applied to a large image data set acquired under various environmental conditions for wood stack detection. Experiments show the promising results of the proposed approach in comparison with existing methods.
Keywords: Image processing, Illumination equalization, Shadow filtering, Object detection, Colour models, Image segmentation.
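The following OpenCV sketch shows the general shape of such a pipeline, not the authors' exact algorithm: YCrCb conversion, a data-driven (Otsu) threshold in place of fixed values, morphological cleaning, and contour extraction. The input file name and the area filter are assumptions.

```python
# Rough sketch of a fixed-parameter-free detection pipeline (OpenCV 4.x).
import cv2
import numpy as np

img = cv2.imread("scene.jpg")                      # hypothetical input image
ycrcb = cv2.cvtColor(img, cv2.COLOR_BGR2YCrCb)
y_channel = ycrcb[:, :, 0]

# Otsu picks the threshold from the data, so no predefined value is needed.
_, mask = cv2.threshold(y_channel, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

kernel = np.ones((5, 5), np.uint8)
mask = cv2.erode(mask, kernel, iterations=1)       # remove small speckles
mask = cv2.dilate(mask, kernel, iterations=2)      # restore object regions

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
objects = [c for c in contours if cv2.contourArea(c) > 500]  # assumed area filter
print(f"candidate object regions: {len(objects)}")
```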
313 Investigation of Passive Solutions of Thermal Comfort in Housing Aiming to Reduce Energy Consumption
Authors: Josiane R. Pires, Marco A. S. González, Bruna L. Brenner, Luciana S. Roos
Abstract:
The concern with sustainability has brought the need to optimize buildings in order to reduce the consumption of natural resources. Almost one third of the energy demanded by Brazilian housing is used to provide thermal comfort. The AEC sector may contribute by applying bioclimatic strategies in building design. The aim of this research is to investigate the viability of applying some alternative solutions in residential buildings. The research was developed with computational simulation of single-family social housing, examining envelope type, absorptance and insolation. The analysis of the thermal performance applied both the Brazilian standard NBR 15575 and the degree-hour method, for the scenario of Porto Alegre, a southern Brazilian city. We used BIM modeling through Autodesk Revit and EnergyPlus for thermal simulation. The payback of the investment was calculated by comparing energy savings and building costs over a period of 50 years. The results show that increasing the envelope's insulation improves thermal comfort and saves energy, with a payback period of 24 to 36 years in some cases.
Keywords: Civil construction, design, thermal performance, energy, economic analysis.
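A minimal simple-payback sketch in the spirit of the economic analysis above; all numbers are illustrative assumptions, not values from the study.

```python
# Simple payback: first year in which cumulative energy savings cover the extra cost.
def simple_payback(extra_cost, annual_saving, horizon_years=50):
    cumulative = 0.0
    for year in range(1, horizon_years + 1):
        cumulative += annual_saving
        if cumulative >= extra_cost:
            return year
    return None  # not paid back within the horizon

extra_insulation_cost = 4800.0   # assumed incremental cost of a better envelope
annual_energy_saving = 180.0     # assumed yearly saving on thermal energy bills
print(simple_payback(extra_insulation_cost, annual_energy_saving))  # -> 27
```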
312 Cessna Citation X Performances Improvement by an Adaptive Winglet during the Cruise Flight
Authors: Marine Segui, Simon Bezin, Ruxandra Mihaela Botez
Abstract:
As part of a 'Morphing-Wing' idea, this study measures how efficient a winglet that is able to change its shape during flight can be. Conventionally, winglets are fixed vertical surfaces at the wingtips, optimized for a cruise condition that the airplane should use most of the time. However, during a cruise, an airplane flies through many cruise conditions corresponding to altitude variations from 30,000 to 45,000 ft. Fixed winglets are not optimized for these variations and are consequently expected to generate some drag, and thus to degrade aircraft fuel consumption. This research assumes that there exists a winglet position that reduces the fuel consumption for each cruise condition. The methodology therefore aims to find these optimal winglet positions, and then to simulate, and thus estimate, the fuel consumption of an aircraft equipped with this type of adaptive winglet over several cruise conditions. The adaptive winglet is assumed to have degrees of freedom given by changes in the following surfaces: the tip chord, the sweep angle and the dihedral angle. Finally, results obtained during cruise simulations are presented in this paper. These results show that an adaptive winglet can reduce the fuel consumption of an aircraft during cruise by up to 2.12%.
Keywords: Aerodynamics, Cessna Citation X, optimization, winglet, adaptive, morphing, wing, aircraft.
311 Constant Factor Approximation Algorithm for p-Median Network Design Problem with Multiple Cable Types
Authors: Chaghoub Soraya, Zhang Xiaoyan
Abstract:
This research presents the first constant-factor approximation algorithm for the p-median network design problem with multiple cable types. This problem had previously been addressed only with a single cable type, for which a bifactor approximation algorithm exists. To the best of our knowledge, the algorithm proposed in this paper is the first constant approximation algorithm for the p-median network design problem with multiple cable types. The addressed problem is a combination of two well-studied problems: the p-median problem and the network design problem. The introduced algorithm is a random sampling approximation algorithm of constant factor, conceived by using random sampling techniques from the literature. It is based on a redistribution lemma from the literature and uses the Steiner tree problem as a subproblem. The algorithm is simple and relies on the notions of random sampling and probability. The proposed approach gives an approximate solution with a constant ratio without violating any of the constraints, in contrast to the one proposed in the literature. This paper provides a (21 + 2)-approximation algorithm for the p-median network design problem with multiple cable types using random sampling techniques.
Keywords: Approximation algorithms, buy-at-bulk, combinatorial optimization, network design, p-median.
310 Application of Genetic Algorithm for FACTS-based Controller Design
Authors: Sidhartha Panda, N. P. Padhy, R. N. Patel
Abstract:
In this paper, the genetic algorithm (GA) optimization technique is applied to design Flexible AC Transmission System (FACTS)-based damping controllers. Two types of controller structures, namely a proportional-integral (PI) and a lead-lag (LL) controller, are considered. The design problem of the proposed controllers is formulated as an optimization problem, and GA is employed to search for the optimal controller parameters. By minimizing a time-domain based objective function, in which the deviation in the oscillatory rotor speed of the generator is involved, the stability performance of the system is improved. The proposed controllers are tested on a weakly connected power system subjected to different disturbances. Non-linear simulation results are presented to show the effectiveness of the proposed controllers and their ability to provide efficient damping of low-frequency oscillations. It is also observed that the proposed SSSC-based controllers greatly improve the voltage profile of the system under severe disturbances. Further, the dynamic performances of both the PI and LL structured FACTS controllers are analyzed at different loading conditions and under various disturbance conditions, as well as under unbalanced fault conditions.
Keywords: Genetic algorithm, proportional-integral controller, lead-lag controller, power system stability, FACTS.
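The sketch below shows the general formulation this abstract describes, reduced to a toy problem: a small genetic algorithm minimizes a time-domain ITAE objective on the speed deviation of a simplified one-machine swing model with a PI damping signal. The plant parameters, gain ranges, and GA settings are assumptions standing in for the paper's SSSC-equipped test system.

```python
# GA tuning of PI damping gains on a toy swing model (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def itae(gains, M=6.0, D=1.0, Ks=40.0, dt=0.01, t_end=10.0):
    """Integral of time-weighted |speed deviation| for a toy one-machine model."""
    kp, ki = gains
    delta, omega, integ, cost = 0.1, 0.0, 0.0, 0.0   # initial angle disturbance
    for k in range(int(t_end / dt)):
        t = k * dt
        u = kp * omega + ki * integ                  # PI signal on speed deviation
        domega = (-D * omega - Ks * delta - u) / M
        omega += domega * dt
        delta += omega * dt
        integ += omega * dt
        cost += t * abs(omega) * dt
    return cost

def run_ga(bounds, pop_size=30, gens=40, mut=0.1):
    pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(pop_size, len(bounds)))
    for _ in range(gens):
        fit = np.array([itae(ind) for ind in pop])
        parents = pop[np.argsort(fit)[: pop_size // 2]]            # truncation selection
        kids = []
        while len(kids) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(len(bounds)) < 0.5, a, b)  # uniform crossover
            child = child + mut * rng.normal(size=len(bounds))     # Gaussian mutation
            kids.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
        pop = np.vstack([parents, np.array(kids)])
    fit = np.array([itae(ind) for ind in pop])
    return pop[np.argmin(fit)], fit.min()

best, best_cost = run_ga(np.array([[0.0, 50.0], [0.0, 50.0]]))     # assumed gain ranges
print("best (Kp, Ki):", np.round(best, 2), "ITAE:", round(best_cost, 4))
```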
309 Optimization of Pretreatment and Enzymatic Saccharification of Cogon Grass Prior Ethanol Production
Authors: Jhalique Jane R. Fojas, Ernesto J. Del Rosario
Abstract:
The dilute acid pretreatment and enzymatic saccharification of the lignocellulosic substrate cogon grass (Imperata cylindrica L.) were optimized prior to ethanol fermentation using the simultaneous saccharification and fermentation (SSF) method. The optimum pretreatment conditions (temperature, sulfuric acid concentration and reaction time) were evaluated by determining the maximum sugar yield at constant enzyme loading. Cogon grass, at 10% w/v substrate loading, has optimum pretreatment conditions of 126°C, 0.6% v/v H2SO4 and 20 min reaction time. These pretreatment conditions were used to optimize enzymatic saccharification using different enzyme combinations. The maximum saccharification yield of 36.68 mg/mL (71.29% reducing sugar) was obtained using 25 FPU/g-cellulose of cellulase complex combined with 1.1% w/w of cellobiase (β-glucosidase) and 0.225% w/w of hemicellulase complex, after 96 hours of saccharification. Using the optimum pretreatment and saccharification conditions, SSF of the treated substrate was carried out at 37°C for 120 hours using the industrial yeast strain HBY3, Saccharomyces cerevisiae. The ethanol yield for cogon grass at 4% w/w loading was 9.11 g/L with 5.74 mg/mL total residual sugar.
Keywords: Acid pretreatment, bioethanol, biomass, cogon grass, fermentation, lignocellulose, SSF.
308 Design and Optimization of Parity Generator and Parity Checker Based On Quantum-dot Cellular Automata
Authors: Santanu Santra, Utpal Roy
Abstract:
Quantum-dot Cellular Automata (QCA) is one of the most promising substitutes among emerging nanotechnologies for electronic circuits, because of its lower power consumption, higher speed and smaller size in comparison with CMOS technology. The basic device, a quantum-dot cell, can be used to implement logic gates and wires, as it is the fundamental building block of nanotechnology circuits. By applying an XOR gate, the hardware requirements for a QCA circuit can be decreased and circuits can be made simpler in terms of level, delay and cell count. This article presents a modest approach for implementing a novel optimized XOR gate, which can be applied to design many variants of complex QCA circuits. The proposed XOR gate is simple in structure and powerful in terms of implementing digital circuits. In order to verify the functionality of the proposed design, some complex implementations of parity generator and parity checker circuits are proposed, simulated with the QCADesigner tool and compared with some of the most recent designs. Simulation results and physical relations confirm its usefulness in implementing digital circuits.
Keywords: Clock, CMOS technology, Logic gates, QCA Designer, Quantum-dot Cellular Automata (QCA).
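For reference, the Boolean behaviour that such XOR-based parity circuits realize can be sketched in software: a 3-bit even-parity generator and the matching 4-bit parity checker, verified over all input combinations. This is a functional sketch only, not a QCA cell layout.

```python
# Even-parity generator/checker behaviour implemented with XOR reductions.
from functools import reduce
from itertools import product
from operator import xor

def parity_generator(bits):
    """Even-parity bit: XOR of the message bits."""
    return reduce(xor, bits)

def parity_checker(bits_with_parity):
    """0 if parity is consistent (no single-bit error detected), 1 otherwise."""
    return reduce(xor, bits_with_parity)

for msg in product((0, 1), repeat=3):
    p = parity_generator(msg)
    assert parity_checker((*msg, p)) == 0           # clean word passes the check
    corrupted = (1 - msg[0], *msg[1:], p)           # flip one bit
    assert parity_checker(corrupted) == 1           # single-bit error is caught
print("parity generator/checker verified for all 3-bit messages")
```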
307 The Effect of Randomly Distributed Polypropylene Fibers, Borogypsum, Fly Ash and Cement on Freezing-Thawing Durability of a Fine-Grained Soil
Authors: Ahmet Şahin Zaimoğlu
Abstract:
A number of studies have been conducted recently to investigate the influence of randomly oriented fibers on some engineering properties of cohesive and cohesionless soils. However, few studies have been carried out on the freezing-thawing behavior of fine-grained soils modified with discrete fiber inclusions and additive materials. This experimental study was performed to investigate the effect of randomly distributed polypropylene fibers (PP) and some additive materials [e.g., borogypsum (BG), fly ash (FA) and cement (C)] on the freezing-thawing durability (mass losses) of a fine-grained soil over 6, 12 and 18 cycles. The Taguchi method was applied to the experiments, and a standard L9 orthogonal array (OA) with four factors and three levels was chosen. A series of freezing-thawing tests was conducted on each specimen. 0-20% BG, 0-20% FA, 0-0.25% PP and 0-3% C by total dry weight of the mixture were used in the preparation of specimens. Experimental results showed that the most effective materials for the freezing-thawing durability (mass losses) of the samples were borogypsum and fly ash. The values of mass losses for 6, 12 and 18 cycles under optimum conditions were 16.1%, 5.1% and 3.6%, respectively.
Keywords: Additive materials, Freezing-thawing, Optimization, Reinforced soil.
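A compact sketch of the Taguchi machinery mentioned above: the standard L9(3^4) orthogonal array and a "smaller-the-better" S/N ratio, as one would apply to mass-loss responses. The response values below are placeholders, not data from the study.

```python
# L9 orthogonal array analysis with a smaller-the-better S/N ratio.
import numpy as np

# Columns: BG, FA, PP, C levels (1..3); rows: the nine experimental runs.
L9 = np.array([
    [1, 1, 1, 1], [1, 2, 2, 2], [1, 3, 3, 3],
    [2, 1, 2, 3], [2, 2, 3, 1], [2, 3, 1, 2],
    [3, 1, 3, 2], [3, 2, 1, 3], [3, 3, 2, 1],
])

mass_loss = np.array([18.0, 9.5, 6.2, 12.4, 7.1, 10.8, 8.3, 11.6, 5.9])  # dummy %

def sn_smaller_is_better(y):
    return -10.0 * np.log10(np.mean(np.asarray(y, dtype=float) ** 2))

sn = np.array([sn_smaller_is_better([y]) for y in mass_loss])

# Mean S/N per factor level; the best level is the one with the highest S/N.
for col, name in enumerate(["BG", "FA", "PP", "C"]):
    means = [sn[L9[:, col] == lvl].mean() for lvl in (1, 2, 3)]
    print(name, "best level:", int(np.argmax(means)) + 1, np.round(means, 2))
```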
306 Genetic-Based Multi Resolution Noisy Color Image Segmentation
Authors: Raghad Jawad Ahmed
Abstract:
Segmentation of a color image composed of different kinds of regions can be a hard problem, namely computing exact texture fields and deciding the optimum number of segmentation areas in an image when it contains similar and/or non-stationary texture fields. A novel neighborhood-based segmentation approach is proposed. A genetic algorithm is used in the proposed segment-pass optimization process. In this pass, an energy function, defined based on Markov Random Fields, is minimized. In this paper we use an adaptive threshold estimation method for image thresholding in the wavelet domain, based on generalized Gaussian distribution (GGD) modeling of subband coefficients. This method, called NormalShrink, is computationally efficient and adaptive because the parameters required for estimating the threshold depend on the subband data energy used in the pre-stage of segmentation. A quadtree is employed to implement the multiresolution framework, which enables the use of different strategies at different resolution levels, and hence the computation can be accelerated. The experimental results using the proposed segmentation approach are very encouraging.
Keywords: Color image segmentation, Genetic algorithm, Markov random field, Scale space filter.
305 New Approach for Minimizing Wavelength Fragmentation in Wavelength-Routed WDM Networks
Authors: Sami Baraketi, Jean-Marie Garcia, Olivier Brun
Abstract:
Wavelength Division Multiplexing (WDM) is the dominant transport technology used in numerous high-capacity backbone networks based on optical infrastructures. Given the importance of the costs (CapEx and OpEx) associated with these networks, resource management is becoming increasingly important, especially how the optical circuits, called "lightpaths", are routed throughout the network. This requires the use of efficient algorithms which provide routing strategies with the lowest cost. We focus on the lightpath routing and wavelength assignment problem, known as the RWA problem, while optimizing wavelength fragmentation over the network. Wavelength fragmentation poses a serious challenge for network operators since it leads to the misuse of the wavelength spectrum, and then to the refusal of new lightpath requests. In this paper, we first establish a new Integer Linear Program (ILP) for the problem based on a node-link formulation. This formulation relies on a multilayer approach where the original network is decomposed into several network layers, each corresponding to a wavelength. Furthermore, we propose an efficient heuristic for the problem based on a greedy algorithm followed by a post-treatment procedure. The obtained results show that the optimal solution is often reached. We also compare our results with those of other RWA heuristic methods.
Keywords: WDM, lightpath, RWA, wavelength fragmentation, optimization, linear programming, heuristic
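A much-simplified RWA sketch (not the paper's ILP or heuristic): each lightpath is routed on a shortest path and assigned the first wavelength that is free on every link of that path, which illustrates the wavelength-continuity constraint and how requests get blocked when the spectrum fragments. The toy topology, wavelength count, and request list are assumptions.

```python
# Shortest-path routing with first-fit wavelength assignment on a toy network.
import networkx as nx

G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("A", "D"), ("B", "D")])
num_wavelengths = 3
used = {tuple(sorted(e)): set() for e in G.edges}    # wavelengths busy per link

def assign(src, dst):
    path = nx.shortest_path(G, src, dst)
    links = [tuple(sorted((u, v))) for u, v in zip(path, path[1:])]
    for w in range(num_wavelengths):                 # first-fit over wavelengths
        if all(w not in used[l] for l in links):
            for l in links:
                used[l].add(w)
            return path, w
    return path, None                                # blocked request

for req in [("A", "C"), ("A", "C"), ("B", "D"), ("A", "C"), ("A", "D")]:
    path, w = assign(*req)
    print(req, "->", path, "wavelength", w)
```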
304 Mass Transfer Modeling of Nitrate in an Ion Exchange Selective Resin
Authors: A. A. Hekmatzadeh, A. Karimi-Jashani, N. Talebbeydokhti
Abstract:
The rate of nitrate adsorption by a nitrate-selective ion exchange resin was investigated in well-stirred batch experiments. The kinetic experimental data were simulated with diffusion models including external mass transfer, particle diffusion and chemical adsorption. Particle pore volume diffusion and particle surface diffusion were taken into consideration separately and simultaneously in the modeling. The model equations were solved numerically using the Crank-Nicolson scheme. An optimization technique was employed to optimize the model parameters. All nitrate concentration decay data were well described by all the diffusion models. The results indicated that the kinetic process is initially controlled by external mass transfer and then by particle diffusion. The external mass transfer coefficient and the coefficients of pore volume diffusion and surface diffusion in all experiments were close to each other, with an average external mass transfer coefficient of 8.3×10⁻³ cm/s. In addition, the models are more sensitive to the mass transfer coefficient than to particle diffusion. Moreover, it seems that surface diffusion is the dominant particle diffusion mechanism in comparison with pore volume diffusion.
Keywords: External mass transfer, pore volume diffusion, surface diffusion, mass action law isotherm.
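A bare-bones Crank-Nicolson sketch for a one-dimensional diffusion equation with a fixed surface concentration, to illustrate the numerical scheme named above; it is not the spherical pore/surface-diffusion model of the resin itself, and all parameter values are illustrative assumptions.

```python
# Crank-Nicolson time stepping for 1-D diffusion with Dirichlet boundaries.
import numpy as np

D, L, N, dt, steps = 1.0e-6, 1.0e-2, 50, 0.5, 200    # assumed illustrative values
dx = L / (N - 1)
r = D * dt / (2 * dx**2)

# Tridiagonal operator for (I - rA) c_new = (I + rA) c_old.
A = np.diag([-2.0] * N) + np.diag([1.0] * (N - 1), 1) + np.diag([1.0] * (N - 1), -1)
A[0, :] = A[-1, :] = 0.0                              # keep boundary rows fixed
left = np.eye(N) - r * A
right = np.eye(N) + r * A

c = np.zeros(N)
c[0] = c[-1] = 1.0                                    # surface held at c = 1
for _ in range(steps):
    c = np.linalg.solve(left, right @ c)

print("centre concentration after", steps * dt, "s:", round(c[N // 2], 4))
```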
303 Performance Analysis of Deterministic Stable Election Protocol Using Fuzzy Logic in Wireless Sensor Network
Authors: Sumanpreet Kaur, Harjit Pal Singh, Vikas Khullar
Abstract:
In a Wireless Sensor Network (WSN), the sensor motes (nodes) incorporate batteries with limited energy that deplete over time. To improve energy utilization, clustering is one of the prototypical approaches: the sensor motes are split into a number of clusters, in each of which one mote (also called a node) acts as a Cluster Head (CH). CH selection is one of the optimization techniques for extending stability and network lifespan. Deterministic Stable Election Protocol (DSEP) is an effective clustering protocol that makes use of three kinds of nodes with dissimilar residual energy for CH election. Fuzzy logic is used to improve the energy efficiency of the DSEP protocol by means of a fuzzy inference system. This paper presents DSEP using Fuzzy Logic (DSEP-FL), which elects the CH by taking into account four linguistic variables: energy, concentration, centrality and distance to the base station. Simulation results show that our proposed method gives more effective results in terms of network lifespan and stability as compared to the performance of other clustering protocols.
Keywords: Deterministic stable election protocol, energy model, fuzzy logic, wireless sensor network.
302 Modeling and Parametric Study for CO2/CH4 Separation Using Membrane Processes
Authors: Faizan Ahmad, Lau Kok Keong, Azmi Mohd. Shariff
Abstract:
The upgrading of low-quality crude natural gas (NG) is attracting interest due to the high demand for pipeline-grade gas in recent years. Membrane processes are a commercially proven technology for the removal of impurities such as carbon dioxide from NG. In this work, a cross-flow mathematical model is incorporated into ASPEN HYSYS as a user-defined unit operation in order to design the membrane system for CO2/CH4 separation. The effect of operating conditions (such as feed composition and pressure) and membrane selectivity on the design parameters (methane recovery and total membrane area required for the separation) has been studied for different design configurations. These configurations include single-stage (with and without recycle) and double-stage membrane systems (with and without permeate or retentate recycle). It is shown that methane recovery can be improved by recycling the permeate or retentate stream as well as by using double-stage membrane systems. The ASPEN HYSYS user-defined unit operation proposed in the study has the potential to be applied to complex membrane system design and optimization.
Keywords: CO2/CH4 Separation, Membrane Process, Membrane modeling, Natural Gas Processing
301 Optimization of Process Parameters Affecting on Spring-Back in V-Bending Process for High Strength Low Alloy Steel HSLA 420 Using FEA (HyperForm) and Taguchi Technique
Authors: Navajyoti Panda, R. S. Pawar
Abstract:
In this study, process parameters such as punch angle, die opening, grain direction, and pre-bend condition of the strip are investigated for high strength low alloy steel HSLA 420. The finite element method (FEM), in association with the Taguchi and analysis of variance (ANOVA) techniques, is used to investigate the degree of importance of the process parameters in the V-bending process for HSLA 420 and St12 grade materials. From the results, it is observed that punch angle has a major influence on spring-back. Die opening also plays a very significant role in spring-back. On the other hand, it is revealed that grain direction has the least impact on spring-back; however, a strip taken from flat sheet is less prone to spring-back than a strip taken from sheet metal coil. HyperForm software is used for the FEM simulation, and the experiments are designed using the Taguchi method. The percentage contribution of the parameters is obtained through the ANOVA technique.
Keywords: Bending, V-bending, FEM, spring-back, Taguchi, HyperForm, profile projector, HSLA 420 & St12 materials.
300 Land Suitability Analysis for Maize Production in Egbeda Local Government Area of Oyo State Using GIS Techniques
Authors: Abegunde Linda, Adedeji Oluwatola, Tope-Ajayi Opeyemi
Abstract:
Maize constitutes a major agrarian product used by the vast population, but despite its economic importance, it has not been produced in quantities that meet the economic needs of the country. Achieving optimum yield in maize can meaningfully be supported by land suitability analysis in order to guarantee self-sufficiency and future production optimization. This study examines land suitability for maize production through the analysis of the physicochemical variations in soil properties and other land attributes over space using a Geographic Information System (GIS) framework. The physicochemical parameters selected include slope, land use, physical and chemical properties of the soil, and climatic variables. Landsat imagery was used to categorize land use, Shuttle Radar Topography Mission (SRTM) data generated the slope, and soil samples were analyzed for their physical and chemical components. Suitability was categorized into highly, moderately and marginally suitable classes based on the Food and Agriculture Organization (FAO) classification, using the Analytical Hierarchy Process (AHP) technique in GIS. The results can be used by small-scale farmers for efficient decision making in the allocation of land for maize production.
Keywords: AHP, GIS, MCE, Suitability, Zea mays.
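A compact sketch of the AHP weighting step described above: criterion weights are derived from the principal eigenvector of a pairwise comparison matrix, checked for consistency, and then used in a weighted overlay of normalized suitability layers. The pairwise judgments and raster values below are placeholders, not the study's data.

```python
# AHP weights from a pairwise matrix, then a weighted overlay of criterion layers.
import numpy as np

criteria = ["slope", "land use", "soil", "climate"]
# Saaty-style reciprocal pairwise comparison matrix (illustrative judgments).
P = np.array([
    [1.0, 3.0, 2.0, 4.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/4, 1/2, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(P)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()

# Consistency ratio (RI = 0.90 for a 4x4 matrix in Saaty's table).
lam_max = np.max(np.real(eigvals))
CI = (lam_max - len(P)) / (len(P) - 1)
print("weights:", dict(zip(criteria, np.round(weights, 3))), "CR:", round(CI / 0.90, 3))

# Weighted overlay of per-pixel criterion scores scaled to [0, 1].
layers = np.random.default_rng(1).random((4, 5, 5))   # dummy 5x5 raster per criterion
suitability = np.tensordot(weights, layers, axes=1)   # higher = more suitable
print("most suitable cell:", np.unravel_index(suitability.argmax(), suitability.shape))
```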
299 Multi-Agent Simulation of Wayfinding for Rescue Operation during Building Fire
Authors: G. Sokhansefat, M. Delavar, M. Banedj-Schafii
Abstract:
Recent research on human wayfinding has focused mainly on mental representations rather than on the processes of wayfinding. The objective of this paper is to demonstrate the rationale behind applying the multi-agent simulation paradigm to the modeling of rescue-team wayfinding, in order to develop a computational theory of perceptual wayfinding in crisis situations using image schemata and affordances, which explains how people find a specific destination in an unfamiliar building such as a hospital. The hypothesis of this paper is that successful navigation is possible if the agents are able to make the correct decision through well-defined cues in critical cases, so the design of the building signage is evaluated through the multi-agent-based simulation. In addition, a special case of wayfinding in a building, finding one's way through three hospitals, is used to demonstrate the model. Thereby, the total rescue time for a rescue operation during a building fire is computed. This paper discusses the computed rescue times for various signage localizations and provides experimental results for the optimization of building signage design. The most appropriate signage design resulted in the shortest total rescue time in the various situations.
Keywords: Multi-Agent System (MAS), Spatial Cognition, Wayfinding, Indoor Environment, Geospatial Information System (GIS).
298 Process and Supply-Chain Optimization for Testing and Verification of Formation Tester/Pressure-While-Drilling Tools
Authors: Vivek V, Hafeez Syed, Darren W Terrell, Harit Naik, Halliburton
Abstract:
Applying a rigorous process to optimize the elements of a supply-chain network resulted in a reduction of the waiting time for both the service provider and the customer. Different sources of downtime of the hydraulic pressure controller/calibrator (HPC) were causing interruptions in operations. The process examined all the issues to drive greater efficiencies. The issues included inherent design issues with the HPC pump, contamination of the HPC with impurities, and the lead time required for annual calibration in the USA. The HPC is used for mandatory testing/verification of formation tester/pressure measurement/logging-while-drilling tools by oilfield service providers, including Halliburton. After market study and analysis, it was concluded that the current HPC model is best suited to the oilfield industry. To use the existing HPC model effectively, design and contamination issues were addressed through design and process improvements. An optimum network is proposed after comparing different supply-chain models for calibration lead-time reduction.
Keywords: Hydraulic Pressure Controller/Calibrator, M/LWD, Pressure, FTWD.
297 Prioritization Method in the Fuzzy Analytic Network Process by Fuzzy Preferences Programming Method
Authors: Tarifa S. Almulhim, Ludmil Mikhailov, Dong-Ling Xu
Abstract:
In this paper, a method for deriving a group priority vector in the Fuzzy Analytic Network Process (FANP) is proposed. By introducing importance weights of multiple decision makers (DMs) based on their experience, the Fuzzy Preferences Programming Method (FPP) is extended to a fuzzy group prioritization problem in the FANP. Additionally, fuzzy pairwise comparison judgments are used rather than exact numerical assessments in order to model the uncertainty and imprecision in the DMs' judgments, and the fuzzy group prioritization problem is then transformed into a fuzzy non-linear programming optimization problem which maximizes group satisfaction. Unlike known fuzzy prioritization techniques, the new method proposed in this paper can easily derive crisp weights from an incomplete and inconsistent fuzzy set of comparison judgments and does not require additional aggregation procedures. Detailed numerical examples are used to illustrate the implementation of our approach and to compare it with the latest fuzzy prioritization method.
Keywords: Fuzzy Analytic Network Process (FANP), Fuzzy Non-linear Programming, Fuzzy Preferences Programming Method (FPP), Multiple Criteria Decision-Making (MCDM), Triangular Fuzzy Number.
296 Satellite Imagery Classification Based on Deep Convolution Network
Authors: Zhong Ma, Zhuping Wang, Congxin Liu, Xiangzeng Liu
Abstract:
Satellite imagery classification is a challenging problem with many practical applications. In this paper, we designed a deep convolution neural network (DCNN) to classify satellite imagery. The contributions of this paper are twofold. First, to cope with the large-scale variance in satellite images, we introduced the inception module, which has multiple filters with different sizes at the same level, as the building block of our DCNN model. Second, we proposed a genetic algorithm based method to efficiently search for the best hyper-parameters of the DCNN in a large search space. The proposed method is evaluated on a benchmark database. The results of the proposed hyper-parameter search method show that it guides the search towards better regions of the parameter space. Based on the found hyper-parameters, we built our DCNN models and evaluated their performance on satellite imagery classification; the results show that the classification accuracy of the proposed models outperforms the state-of-the-art method.
Keywords: Satellite imagery classification, deep convolution network, genetic algorithm, hyper-parameter optimization.
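A skeleton of the GA-driven hyper-parameter search described above. The fitness function here is a stand-in; in the real pipeline it would train the DCNN and return the validation accuracy. The hyper-parameter names and ranges are assumptions, not the paper's search space.

```python
# Genetic search over a small hyper-parameter space with a stub fitness.
import random

random.seed(0)
SPACE = {
    "learning_rate": [1e-4, 3e-4, 1e-3, 3e-3],
    "batch_size": [16, 32, 64, 128],
    "num_inception_blocks": [2, 3, 4, 5],
    "dropout": [0.2, 0.3, 0.4, 0.5],
}

def fitness(cfg):
    # Stub: pretend more inception blocks and a smaller learning rate help, plus noise.
    return (cfg["num_inception_blocks"] * 0.1 - cfg["learning_rate"] * 10
            + random.gauss(0, 0.02))

def random_cfg():
    return {k: random.choice(v) for k, v in SPACE.items()}

def crossover(a, b):
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(cfg, rate=0.2):
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in cfg.items()}

pop = [random_cfg() for _ in range(12)]
for gen in range(15):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:6]
    children = [mutate(crossover(*random.sample(survivors, 2))) for _ in range(6)]
    pop = survivors + children

best = max(pop, key=fitness)
print("best hyper-parameters found:", best)
```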
295 Toward Indoor and Outdoor Surveillance Using an Improved Fast Background Subtraction Algorithm
Authors: A. El Harraj, N. Raissouni
Abstract:
The detection of moving objects from video image sequences is very important for object tracking, activity recognition, and behavior understanding in video surveillance. The most widely used approach for moving object detection/tracking is background subtraction. Many approaches have been suggested for background subtraction; however, they are sensitive to illumination changes, and the solutions proposed to bypass this problem are time consuming. In this paper, we propose a robust yet computationally efficient background subtraction approach and mainly focus on the ability to detect moving objects in dynamic scenes, for possible applications in monitoring complex and restricted-access areas, where moving and motionless persons must be reliably detected. It consists of three main phases: establishing invariance to illumination changes, background/foreground modeling, and morphological analysis for noise removal. We handle illumination changes using Contrast Limited Adaptive Histogram Equalization (CLAHE), which limits the intensity of each pixel to a user-determined maximum. Thus, it mitigates the degradation due to scene illumination changes and improves the visibility of the video signal. Initially, the background and foreground images are extracted from the video sequence. Then, the background and foreground images are separately enhanced by applying CLAHE. In order to form multi-modal backgrounds, we model each channel of a pixel as a mixture of K Gaussians (K = 5) using a Gaussian Mixture Model (GMM). Finally, we post-process the resulting binary foreground mask using morphological erosion and dilation transformations to remove possible noise. For experimental testing, we used a standard dataset to challenge the efficiency and accuracy of the proposed method on a diverse set of dynamic scenes.
Keywords: Video surveillance, background subtraction, Contrast Limited Histogram Equalization, illumination invariance, object tracking, object detection, behavior understanding, dynamic scenes.
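The OpenCV sketch below strings together the same three stages named above: CLAHE pre-processing, a Gaussian-mixture background model, and morphological clean-up. It is not the authors' implementation; OpenCV's MOG2 subtractor stands in for their per-channel K = 5 Gaussian model, and "input.mp4" is a placeholder source.

```python
# CLAHE + GMM background subtraction + morphological post-processing (illustrative).
import cv2
import numpy as np

cap = cv2.VideoCapture("input.mp4")                     # hypothetical video source
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
backsub = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Equalize contrast on the luminance channel to soften illumination changes.
    lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
    lab[:, :, 0] = clahe.apply(lab[:, :, 0])
    frame_eq = cv2.cvtColor(lab, cv2.COLOR_LAB2BGR)

    mask = backsub.apply(frame_eq)                          # foreground mask from GMM
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # erosion then dilation
    print("foreground pixels in this frame:", int(np.count_nonzero(mask)))

cap.release()
```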
294 Statistical Optimization of Process Variables for Direct Fermentation of 226 White Rose Tapioca Stem to Ethanol by Fusarium oxysporum
Authors: A. Magesh, B. Preetha, T. Viruthagiri
Abstract:
Direct fermentation of 226 white rose tapioca stem to ethanol by Fusarium oxysporum was studied in a batch reactor. Ethanol fermentation was preceded by sequential pretreatment with dilute acid and dilute alkali solutions, using 100-mesh tapioca stem particles. The quantitative effects of substrate concentration, pH and temperature on ethanol concentration were optimized using a full factorial central composite design experiment. The optimum process conditions were then obtained using response surface methodology. The quadratic model indicated that a substrate concentration of 33 g/L, pH 5.52 and a temperature of 30.13 °C were optimum for a maximum ethanol concentration of 8.64 g/L. The predicted optimum process conditions obtained using response surface methodology were verified through confirmatory experiments. The Luedeking-Piret model was used to study the product formation kinetics of ethanol production, and the model parameters were evaluated using experimental data.
Keywords: Fusarium oxysporum, Lignocellulosic biomass, Product formation kinetics, Statistical experimental design.
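The response-surface step described above can be sketched as follows: fit a quadratic model to design-point data and search the design region for its predicted maximum. The design points and responses below are placeholders chosen only to mimic a central-composite-style layout, not measurements from the study.

```python
# Quadratic response-surface fit and grid search for the predicted optimum.
import numpy as np
from itertools import combinations_with_replacement

# Columns: substrate (g/L), pH, temperature (deg C); y: ethanol (g/L); all dummy data.
X = np.array([
    [25, 5.0, 28], [40, 5.0, 28], [25, 6.0, 28], [40, 6.0, 28],
    [25, 5.0, 32], [40, 5.0, 32], [25, 6.0, 32], [40, 6.0, 32],
    [25, 5.5, 30], [40, 5.5, 30], [33, 5.0, 30], [33, 6.0, 30],
    [33, 5.5, 28], [33, 5.5, 32], [33, 5.5, 30], [33, 5.5, 30],
], dtype=float)
y = np.array([6.1, 7.0, 6.4, 7.2, 6.3, 7.1, 6.5, 7.3,
              7.7, 8.2, 7.9, 8.1, 7.8, 8.0, 8.6, 8.5])

def quad_features(x):
    terms = [1.0, *x] + [x[i] * x[j] for i, j in combinations_with_replacement(range(3), 2)]
    return np.array(terms)

F = np.vstack([quad_features(x) for x in X])
beta, *_ = np.linalg.lstsq(F, y, rcond=None)          # least-squares quadratic fit

grid = np.array([[s, p, t]
                 for s in np.linspace(25, 40, 16)
                 for p in np.linspace(5.0, 6.0, 11)
                 for t in np.linspace(28, 32, 9)])
pred = np.array([quad_features(g) @ beta for g in grid])
print("predicted optimum (substrate, pH, T):", grid[pred.argmax()],
      "ethanol:", round(pred.max(), 2))
```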
293 Progressive Watershed Management Approaches in Iran
Authors: S. H. R. Sadeghi, A. Sadoddin, A. Najafinejad
Abstract:
Expansionism and an ever-increasing population menace all kinds of resources worldwide. The issue is therefore critical in developing countries like Iran, where new technologies are rapidly adopted and unguardedly applied, resulting in unexpected outcomes. However, uncommon and comprehensive approaches have been introduced to take all the different aspects involved into consideration. In the last decade, a few approaches, such as community-based, stakeholder-oriented, adaptive, co-management and ultimately integrated management, have emerged and are being developed for the efficient, economic and sustainable development and management of watershed resources in Iran. In the present paper, an attempt has been made to focus on state-of-the-art approaches for the management of watershed resources applied in Iran. The study is supported by reports of case studies conducted throughout the country involving the previously mentioned approaches. Scrutiny of the research results verified a progressive tendency of the managerial approaches in watershed management strategies, leading toward a generally balanced situation. The approaches are firmly rooted in the educational, research, executive, legal and policy-making sectors, leading to some recuperation at different levels. However, there is a long way ahead to neutralize the detrimental effects of unscientific, illegal and over-exploitation of the watershed resources in Iran.
Keywords: Comprehensive management, ecosystem balance, integrated watershed management, land resources optimization.
292 Performance Analysis of HSDPA Systems using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding
Authors: K. Anitha Sheela, J. Tarun Kumar
Abstract:
HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay as compared to the Release-99 DSCH. Until now, the HSDPA system has used turbo coding, which is among the best coding techniques for approaching the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications like satellite communications, since the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity. However, LDPC coding increases the encoding complexity. Though the complexity of the transmitter increases at the NodeB, the end user is at an advantage in terms of receiver complexity and bit error rate. In this paper the LDPC encoder is implemented using a sparse parity-check matrix H to generate codewords, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, for a small increase in Eb/No, which is not possible with turbo coding. The same BER was also achieved using fewer iterations, and hence the latency and receiver complexity decreased for LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable and more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
Keywords: AMC, HSDPA, LDPC, WCDMA, 3GPP.
291 Truck Routing Problem Considering Platooning and Drivers' Breaks
Authors: Xiaoyuan Yan, Min Xu
Abstract:
Truck platooning refers to a convoy of digitally connected automated trucks traveling safely with a small inter-vehicle gap. It has been identified as one of the most promising and applicable technologies towards automated and sustainable freight transportation. Although truck platooning delivers significant energy-saving benefits, it cannot be realized without good coordination of drivers' shifts to lead the platoons subject to their mandatory breaks. Therefore, this study aims to route a fleet of trucks to their destinations using the least amount of fuel by maximizing platooning opportunities under the regulations on drivers' mandatory breaks. We formulate this platoon coordination problem as a mixed-integer linear programming problem and solve it with CPLEX. Numerical experiments are conducted to demonstrate the effectiveness and efficiency of the proposed model. In addition, we also explore the impact of drivers' compulsory breaks on the fuel-saving performance. The results show only a slight increase in total fuel costs in the presence of drivers' compulsory breaks, thanks to the driving-while-resting benefit provided to the trailing trucks. This study may serve as a guide for operators of automated freight transportation.
Keywords: Truck platooning, route optimization, compulsory breaks, energy saving.
290 A Few Descriptive and Optimization Issues on the Material Flow at a Research-Academic Institution: The Role of Simulation
Authors: D. R. Delgado Sobrino, P. Košťál, J. Oravcová
Abstract:
Lately, significant work in the area of Intelligent Manufacturing has been published and applied, mainly for industrial purposes. Special efforts have been made in the implementation of new technologies, management and control systems, among many others, all of which have advanced the field. Aware of all this, and due to the scope of new projects and the need to turn existing flexible ideas into more autonomous and intelligent ones, i.e., Intelligent Manufacturing, the present paper aims to contribute to the design and analysis of the material flow in systems, cells or workstations under this new "intelligent" denomination. To this end, besides offering a conceptual basis for some of the key points to be taken into account and some general principles to consider in the design and analysis of the material flow, some tips on how to define other possible alternative material flow scenarios and a classification of the states of a system, cell or workstation are offered as well. All this is done with the intention of relating it to the use of simulation tools, which are briefly addressed with a special focus on the Witness simulation package. For better comprehension, the previous elements are supported by a detailed layout, other figures and a few expressions which could help in obtaining the necessary data. Such data and others will be used in the future when simulating the scenarios in the search for the best material flow configurations.
Keywords: Flexible/Intelligent Manufacturing System/Cell (F/IMS/C), material flow/design/configuration (MF/D/C), workstation.
289 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory
Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi
Abstract:
One of the global combinatorial optimization problems in machine learning is feature selection. It is concerned with removing irrelevant, noisy and redundant data while keeping the original meaning of the data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs), which combine the genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy-controlled great deluge algorithm, respectively, to identify a good balance between local search and genetic search. In order to verify the proposed approaches, numerical experiments are carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches.
Keywords: Rough Set Theory, Attribute Reduction, Fuzzy Logic, Memetic Algorithms, Record to Record Algorithm, Great Deluge Algorithm.
288 Generational PipeLined Genetic Algorithm (PLGA) Using Stochastic Selection
Authors: Malay K. Pakhira, Rajat K. De
Abstract:
In this paper, a pipelined version of the genetic algorithm, called PLGA, and a corresponding hardware platform are described. The basic operations of the conventional GA (CGA) are pipelined using an appropriate selection scheme. The selection operator used here is stochastic in nature and is called SA-selection. This helps maintain the basic generational nature of the proposed pipelined GA (PLGA). A number of benchmark problems are used to compare the performances of conventional roulette-wheel selection and SA-selection. These include unimodal and multimodal functions with dimensionality varying from very small to very large. It is seen that the SA-selection scheme gives performance comparable to the classical roulette-wheel selection scheme for all the instances, when quality of solutions and rate of convergence are considered. The speedups obtained by PLGA for different benchmarks are found to be significant. It is shown that a complete hardware pipeline can be developed using the proposed scheme if parallel evaluation of the fitness expression is possible. In this connection, a low-cost but very fast hardware evaluation unit is described. Results of simulation experiments show that in a pipelined hardware environment, PLGA will be much faster than CGA. In terms of efficiency, PLGA is also found to outperform parallel GA (PGA).
Keywords: Hardware evaluation, Hardware pipeline, Optimization, Pipelined genetic algorithm, SA-selection.
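One plausible reading of the simulated-annealing-like selection named above is sketched below: a candidate is accepted outright if it is no worse than a running reference fitness, and otherwise accepted with a Boltzmann probability that shrinks as a temperature is cooled. This is an interpretation for illustration, not necessarily the authors' exact operator, and the benchmark and GA operators are assumptions.

```python
# GA with a stochastic, annealing-style acceptance rule used for selection.
import math
import random

random.seed(1)

def sphere(x):                       # benchmark function to minimise
    return sum(v * v for v in x)

def sa_select(candidate_fit, reference_fit, temperature):
    if candidate_fit <= reference_fit:
        return True
    return random.random() < math.exp(-(candidate_fit - reference_fit) / temperature)

dim, pop_size, gens = 5, 20, 60
pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
temperature = 10.0

for gen in range(gens):
    reference = sum(sphere(ind) for ind in pop) / pop_size   # running reference fitness
    selected = []
    while len(selected) < pop_size:
        cand = random.choice(pop)
        if sa_select(sphere(cand), reference, temperature):
            selected.append(cand)
    # Arithmetic crossover plus Gaussian mutation produces the next generation.
    pop = []
    for _ in range(pop_size):
        a, b = random.sample(selected, 2)
        pop.append([(x + y) / 2 + random.gauss(0, 0.2) for x, y in zip(a, b)])
    temperature *= 0.95                                       # cooling schedule

print("best fitness after", gens, "generations:", min(sphere(ind) for ind in pop))
```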
287 A Grey-Fuzzy Controller for Optimization Technique in Wireless Networks
Authors: Yao-Tien Wang, Hsiang-Fu Yu, Dung Chen Chiou
Abstract:
Progress in wireless and mobile communications provides opportunities for introducing new standards and improving existing services, including support for multimedia traffic with quality of service (QoS) in wireless networks. In this paper, a grey-fuzzy controller for radio resource management (GF-RRM) is presented to maximize the number of served calls and the QoS provision in wireless networks. In a wireless network, the call arrival rate, the call duration and the communication overhead between the base stations and the control center are vague and uncertain. We develop a method to predict the cell load and to solve the RRM problem based on the GF-RRM, and the presented facility has been built at the application level of the wireless networks. The GF-RRM exhibits better adaptability, fault tolerance and performance than other algorithms. Through simulations, we evaluate the blocking rate, update overhead, and channel acquisition delay time of the proposed method. The results demonstrate that our algorithm has a lower blocking rate, less update overhead, and shorter channel acquisition delay.
Keywords: Radio resource management, grey prediction, fuzzy logic control, wireless networks, quality of service.
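A sketch of the classic GM(1,1) grey-prediction step that underlies the kind of cell-load forecasting described above: the model is fitted on the cumulative series by least squares and extrapolated forward. The load series is a placeholder, not measured data.

```python
# GM(1,1) grey prediction of a short load time series.
import numpy as np

def gm11_forecast(x0, steps=3):
    """Classic GM(1,1): fit on the cumulative series, return future x0 values."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                  # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0])
    return x0_hat[n:]                               # forecasts beyond the data

cell_load = [62, 65, 69, 74, 80, 86]                # assumed past load samples (calls)
print("predicted load for next periods:", np.round(gm11_forecast(cell_load), 1))
```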