Search results for: simultaneous optimization

1091 Reliability and Cost Focused Optimization Approach for a Communication Satellite Payload Redundancy Allocation Problem

Authors: Mehmet Nefes, Selman Demirel, Hasan H. Ertok, Cenk Sen

Abstract:

A typical reliability engineering problem regarding communication satellites has been considered in order to determine the redundancy allocation scheme of the power amplifiers within the payload transponder module, whose dominant function is to amplify the power levels of the signals received from the Earth, by maximizing reliability under mass, power, and other technical limitations. Adding each redundant power amplifier component increases not only reliability but also the hardware, testing, and launch cost of a satellite. This study investigates a multi-objective approach to solving the Redundancy Allocation Problem (RAP) for a communication satellite payload transponder, focusing on the design cost due to redundancy and on reliability factors. The main purpose is to find the optimum power amplifier redundancy configuration satisfying reliability and capacity thresholds simultaneously, instead of analyzing them separately or independently. A mathematical model and calculation approach are established, including the objective function definitions, and the problem is then solved analytically with different input parameters in the MATLAB environment. Example results showed that payload capacity and the failure rate of the power amplifiers have remarkable effects on the solution and also on the processing time.
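
The trade-off described above, where each added amplifier raises reliability but also cost, can be illustrated with a minimal enumeration sketch. All numbers (failure rate, unit cost, thresholds, mission length) below are hypothetical placeholders, not the authors' model:

```python
# Minimal sketch of a redundancy allocation trade-off for payload power amplifiers.
# Failure rate, unit cost, thresholds and mission length are hypothetical placeholders.
from math import comb, exp

def k_of_n_reliability(k, n, lam, t):
    """Probability that at least k of n identical units (constant failure rate lam)
    survive a mission of duration t."""
    r = exp(-lam * t)              # single-unit reliability
    return sum(comb(n, j) * r**j * (1 - r)**(n - j) for j in range(k, n + 1))

K_REQUIRED = 8          # amplifiers needed for full capacity (assumed)
LAMBDA = 2e-6           # failures per hour (assumed)
MISSION_HOURS = 15 * 365 * 24
UNIT_COST = 1.0         # relative cost per amplifier (assumed)
R_TARGET = 0.995        # reliability threshold (assumed)

for n in range(K_REQUIRED, K_REQUIRED + 6):   # try 8-of-8 up to 8-of-13
    rel = k_of_n_reliability(K_REQUIRED, n, LAMBDA, MISSION_HOURS)
    cost = n * UNIT_COST
    flag = "<- meets target" if rel >= R_TARGET else ""
    print(f"{K_REQUIRED}-of-{n}: R = {rel:.5f}, cost = {cost:.1f} {flag}")
```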

Keywords: Communication satellite payload, multi-objective optimization, redundancy allocation problem, reliability, transponder.

1090 Unified Fusion Approach with Application to SLAM

Authors: Xinde Li, Xinhan Huang, Min Wang

Abstract:

In this paper, we propose a pre-processor based on the Evidence Supporting Measure of Similarity (ESMS) filter and propose the Unified Fusion Approach (UFA), based on a general fusion machine coupled with the ESMS filter, which improves the correctness and precision of information fusion in any field of application. Here we mainly apply the new approach to Simultaneous Localization And Mapping (SLAM) of Pioneer II mobile robots. A simulation experiment was performed, in which an autonomous virtual mobile robot with sonar sensors evolves in a virtual world map with obstacles. By comparing the maps built with the general fusion machine (here a DSmT-based fusion machine with a PCR5-based conflict redistributor is considered) with and without the ESMS filter, we show the benefit of selecting the sources as a prerequisite for improving the information fusion, and we also demonstrate the superiority of the UFA in dealing with SLAM.

Keywords: DSmT, ESMS filter, SLAM, UFA

1089 Design and Optimization for a Compliant Gripper with Force Regulation Mechanism

Authors: Nhat Linh Ho, Thanh-Phong Dao, Shyh-Chour Huang, Hieu Giang Le

Abstract:

This paper presents the design and optimization of a compliant gripper. The gripper is constructed based on the concept of a compliant mechanism with flexure hinges. A passive force regulation mechanism is presented to control the grasping force on a micro-sized object instead of using a force sensor. The force regulation mechanism is designed using planar springs. The gripper is expected to achieve a large range of displacement in order to handle objects of various sizes. First, the statics and dynamics of the gripper are investigated using finite element analysis in the ANSYS software. Then, the design parameters of the gripper are optimized via the Taguchi method. An L9 orthogonal array is used to establish the experimental matrix. Subsequently, the signal-to-noise ratio is analyzed to find the optimal solution. Finally, response surface methodology is employed to model the relationship between the design parameters and the output displacement of the gripper. The design of experiments method is then used in a sensitivity analysis to determine the effect of each parameter on the displacement. The results showed that the compliant gripper can move with a large displacement of 213.51 mm and that the force regulation mechanism is expected to be used in high-precision positioning systems.
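
As a rough illustration of the Taguchi step described above, the sketch below computes larger-is-better signal-to-noise ratios over an L9 array and averages them per factor level. The array coding and the displacement values are illustrative placeholders, not the authors' data:

```python
# Sketch of a Taguchi L9 analysis: larger-is-better S/N ratios and factor-level means.
# The L9 design coding and the response values below are illustrative placeholders.
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, 4 factors at 3 levels (coded 0, 1, 2).
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])
# Hypothetical output displacement (mm) for each run.
y = np.array([120.0, 150.0, 170.0, 140.0, 180.0, 160.0, 175.0, 190.0, 210.0])

# Larger-is-better S/N ratio: -10*log10(mean(1/y^2)); one response per run here.
sn = -10.0 * np.log10(1.0 / y**2)

for f in range(L9.shape[1]):
    means = [sn[L9[:, f] == lvl].mean() for lvl in range(3)]
    best = int(np.argmax(means))
    print(f"Factor {f+1}: mean S/N per level = {np.round(means, 2)}, best level = {best}")
```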

Keywords: Flexure hinge, compliant mechanism, compliant gripper, force regulation mechanism, Taguchi method, response surface methodology, design of experiment.

1088 Heterogeneity-Aware Load Balancing for Multimedia Access over Wireless LAN Hotspots

Authors: Yen-Cheng Chen, Gong-Da Fang

Abstract:

Wireless LAN (WLAN) access in public hotspot areas has become popular in recent years. Since more and more multimedia information is available on the Internet, there is an increasing demand for accessing multimedia information through WLAN hotspots. Currently, the bandwidth offered by an IEEE 802.11 WLAN cannot support many simultaneous real-time video accesses. A possible way to increase the offered bandwidth in a hotspot is the use of multiple access points (APs). However, a mobile station is usually connected to the WLAN AP with the strongest received signal strength indicator (RSSI), so the total consumed bandwidth cannot be fairly allocated among those APs. In this paper, we propose an effective load-balancing scheme supported by the IAPP and SNMP in APs. The proposed scheme is an open solution and does not require any changes to either the wireless stations or the APs. This makes load balancing possible in WLAN hotspots, where a variety of heterogeneous mobile devices are employed.
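
A minimal sketch of the kind of load-balancing decision implied above, steering a station to the least-loaded AP among those whose signal is acceptable rather than simply to the strongest one, is given below. The data structures, load figures, and RSSI threshold are assumptions for illustration, not the IAPP/SNMP mechanism itself:

```python
# Sketch: heterogeneity-aware AP selection. Instead of always picking the AP with
# the strongest RSSI, pick the least-loaded AP among those with acceptable signal.
# AP load figures and the RSSI threshold are illustrative assumptions.

RSSI_MIN = -75  # dBm, minimum acceptable signal (assumed)

def choose_ap(candidates):
    """candidates: list of dicts with 'name', 'rssi' (dBm), 'load_kbps' (current traffic)."""
    usable = [ap for ap in candidates if ap["rssi"] >= RSSI_MIN]
    if not usable:                      # fall back to the strongest signal if none is usable
        return max(candidates, key=lambda ap: ap["rssi"])
    return min(usable, key=lambda ap: ap["load_kbps"])

aps = [
    {"name": "AP1", "rssi": -48, "load_kbps": 9500},   # strongest but busiest
    {"name": "AP2", "rssi": -60, "load_kbps": 2100},
    {"name": "AP3", "rssi": -71, "load_kbps": 4800},
]
print("Selected:", choose_ap(aps)["name"])   # -> AP2: acceptable RSSI, lowest load
```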

Keywords: Wireless LAN, Load balancing, IAPP, SNMP.

1087 Heavy Metals Estimation in Coastal Areas Using Remote Sensing, Field Sampling and Classical and Robust Statistics

Authors: Elena Castillo-López, Raúl Pereda, Julio Manuel de Luis, Rubén Pérez, Felipe Piña

Abstract:

Sediments are an important site of accumulation of toxic contaminants within the aquatic environment. Bioassays are a powerful tool for studying sediment toxicity, but they can be expensive. This article presents a methodology to estimate the main property of intertidal sediments in coastal zones: heavy metal concentration. The study, which was carried out in the Bay of Santander (Spain), applies classical and robust statistics to CASI-2 hyperspectral images to estimate heavy metal presence and ecotoxicity (TOC). Simultaneous fieldwork (radiometric and chemical sampling) allowed an appropriate atmospheric correction of the CASI-2 images.
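
As an indication of how robust statistics can relate image-derived reflectance to sampled metal concentrations without being dominated by contaminated samples, the sketch below fits an ordinary least-squares line and a Huber robust line to synthetic data. The band values and concentrations are simulated, not the CASI-2 data, and the single-band linear model is only an assumption for illustration:

```python
# Sketch: classical (OLS) vs. robust (Huber) regression of a heavy-metal concentration
# against a hyperspectral band reflectance. Data below are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression, HuberRegressor

rng = np.random.default_rng(0)
reflectance = rng.uniform(0.1, 0.6, 40).reshape(-1, 1)
concentration = 120.0 * reflectance.ravel() + rng.normal(0, 3.0, 40)
concentration[:3] += 60.0          # a few contaminated samples (outliers)

ols = LinearRegression().fit(reflectance, concentration)
huber = HuberRegressor().fit(reflectance, concentration)

print(f"OLS slope:   {ols.coef_[0]:.1f}")     # distorted by the outliers
print(f"Huber slope: {huber.coef_[0]:.1f}")   # typically closer to the true slope of 120
```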

Keywords: Remote sensing, intertidal sediment, airborne sensors, heavy metals, ecotoxicity, robust statistic, estimation.

1086 A Comparative Analysis of Heuristics Applied to Collecting Used Lubricant Oils Generated in the City of Pereira, Colombia

Authors: Diana Fajardo, Sebastián Ortiz, Oscar Herrera, Angélica Santis

Abstract:

Colombia currently faces a problem related to the collection of used lubricant oils, which are generated in growing amounts by the increasing vehicle fleet. This situation prevents proper disposal of this type of waste, which in turn results in a negative impact on the environment. Therefore, through a comparative analysis of various heuristics, the best solution to the VRP (Vehicle Routing Problem) was selected by comparing costs and times for the collection of used lubricant oils in the city of Pereira, Colombia, since there are no management companies engaged in the direct administration of the collection of this pollutant. To achieve this aim, six two-phase solution proposals were discussed. First, the previously identified generator points of the residue were assigned to groups: proposals one and four are based on the proximity of points, proposals two and five use the scanning method, and proposals three and six consider the capacity restriction of the collection vehicle. Subsequently, the routes were developed, in the first three proposals by Clarke and Wright's savings algorithm and in the remaining proposals by the Traveling Salesman mathematical optimization model. After applying these techniques, a comparative analysis of the results was performed, and it was determined which of the proposals presented the best values in terms of distance, cost and travel time.
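
The first routing technique mentioned above, Clarke and Wright's savings algorithm, can be sketched as below on a toy symmetric-distance instance. The distance matrix, demands, and vehicle capacity are invented for illustration, and the merge rule is a simplified end-point version of the classic algorithm:

```python
# Sketch of Clarke and Wright's savings algorithm (simplified end-point merges) on a
# toy instance. Node 0 is the depot; distances, demands and capacity are illustrative.
import itertools

dist = [
    [0, 4, 6, 7, 5],
    [4, 0, 3, 6, 6],
    [6, 3, 0, 4, 7],
    [7, 6, 4, 0, 3],
    [5, 6, 7, 3, 0],
]
demand = [0, 3, 4, 4, 3]
CAPACITY = 8

routes = {i: [i] for i in range(1, len(dist))}        # start: one route per customer
load = {i: demand[i] for i in routes}

# Savings s(i, j) = d(0, i) + d(0, j) - d(i, j), sorted in descending order.
savings = sorted(
    ((dist[0][i] + dist[0][j] - dist[i][j], i, j)
     for i, j in itertools.combinations(range(1, len(dist)), 2)),
    reverse=True,
)

def route_of(node):
    return next(k for k, r in routes.items() if node in r)

for s, i, j in savings:
    ri, rj = route_of(i), route_of(j)
    if ri == rj:
        continue
    # Merge only if i ends its route, j starts its route, and capacity allows.
    if routes[ri][-1] == i and routes[rj][0] == j and load[ri] + load[rj] <= CAPACITY:
        routes[ri] += routes[rj]
        load[ri] += load[rj]
        del routes[rj], load[rj]

for r in routes.values():
    print("Route: depot ->", " -> ".join(map(str, r)), "-> depot")
```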

Keywords: Heuristics, optimization model, savings algorithm, used vehicular oil, VRP.

1085 High Performance Liquid Chromatography Determination of Urinary Hippuric Acid and Benzoic Acid as Indices for Glue Sniffer Urine

Authors: Abdul Rahim Yacob, Mohamad Raizul Zinalibdin

Abstract:

A simple method for the simultaneous determination of hippuric acid and benzoic acid in urine using reversed-phase high performance liquid chromatography is described. Chromatography was performed on a Nova-Pak C18 (3.9 x 150 mm) column with a mobile phase of methanol:water:acetic acid (20:80:0.2) and UV detection at 254 nm. The calibration curve was linear within the concentration range of 0.125 to 6.0 mg/ml for both hippuric acid and benzoic acid. The recovery, accuracy and coefficient of variation were 104.54%, 0.2% and 0.2%, respectively, for hippuric acid, and 98.48%, 1.25% and 0.60%, respectively, for benzoic acid. The detection limit of the method was 0.01 ng/l for hippuric acid and 0.06 ng/l for benzoic acid. The method has been applied to the analysis of urine samples from suspected toluene abusers (glue sniffers) among secondary school students in Johor Bahru.
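
The calibration and back-calculation arithmetic behind the linearity and recovery figures can be sketched as follows. The standard concentrations and peak areas are made-up numbers; only the external-standard calibration logic is shown:

```python
# Sketch: external-standard calibration for an HPLC assay.
# Standard concentrations (mg/ml) and peak areas below are illustrative only.
import numpy as np

conc_std = np.array([0.125, 0.5, 1.0, 2.0, 4.0, 6.0])           # mg/ml
area_std = np.array([1520, 6100, 12150, 24300, 48700, 73000])   # arbitrary units

slope, intercept = np.polyfit(conc_std, area_std, 1)             # linear calibration
r = np.corrcoef(conc_std, area_std)[0, 1]
print(f"calibration: area = {slope:.0f}*conc + {intercept:.0f}, r = {r:.4f}")

# Back-calculate the concentration of a spiked sample and its recovery.
area_sample = 11900
conc_found = (area_sample - intercept) / slope
conc_spiked = 1.0                                                # mg/ml actually added
print(f"found {conc_found:.3f} mg/ml, recovery = {100 * conc_found / conc_spiked:.1f}%")
```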

Keywords: Glue sniffer, High Performance Liquid Chromatography, Hippuric Acid, Toluene, Urine.

1084 Optimization of Ethanol Fermentation from Pineapple Peel Extract Using Response Surface Methodology (RSM)

Authors: Nadya Hajar, Zainal, S., Atikah, O., Tengku Elida, T. Z. M.

Abstract:

Ethanol has been known for a long time, being perhaps the oldest product obtained through traditional fermentation biotechnology. Agricultural waste as a fermentation substrate is widely discussed as an alternative to edible feedstocks and as a way of utilizing organic material. Pineapple peel, a by-product of the pineapple processing industry, is a highly promising substrate. Bio-ethanol was produced from pineapple (Ananas comosus) peel extract by controlled fermentation without any pretreatment. Saccharomyces ellipsoideus was used as the inoculum in this fermentation process, as it is naturally found on the pineapple skin. In this study, the capability of Response Surface Methodology (RSM) for optimizing ethanol production from pineapple peel extract using Saccharomyces ellipsoideus in a batch fermentation process was investigated. The effects of five test variables on ethanol production were evaluated over defined ranges: inoculum concentration (6-14% v/v), pH (4.0-6.0), sugar concentration (14-22°Brix), temperature (24-32°C) and incubation time (30-54 h). Data obtained from the experiments were analyzed with the RSM module of the MINITAB software (Version 15), whereby an optimum ethanol concentration of 8.637% (v/v) was determined. The optimum conditions were 14% (v/v) inoculum concentration, pH 6, 22°Brix, 26°C and 30 hours of incubation. A regression model significant at the 5% level, with a correlation value of 99.96%, was also obtained.
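
The RSM step, fitting a second-order model to the design data and locating the settings that maximize the predicted ethanol yield, can be sketched on two of the five factors as follows. The design points and responses are fabricated for illustration and do not reproduce the MINITAB results:

```python
# Sketch: fit a quadratic response surface for ethanol yield vs. two coded factors
# (e.g. inoculum concentration x1 and sugar concentration x2) and grid-search the
# predicted optimum. Design points and responses are illustrative placeholders.
import numpy as np

# Coded design points (x1, x2) and hypothetical ethanol % (v/v) responses.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0],
              [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y = np.array([5.1, 6.8, 6.0, 8.2, 7.9, 8.0, 6.4, 7.8, 6.6, 7.5])

def design_matrix(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)), x1, x2, x1 * x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)   # quadratic model fit

# Grid search over the coded region [-1, 1]^2 for the predicted maximum.
g = np.linspace(-1, 1, 41)
grid = np.array([[a, b] for a in g for b in g])
pred = design_matrix(grid) @ beta
best = grid[np.argmax(pred)]
print(f"predicted optimum at coded ({best[0]:.2f}, {best[1]:.2f}), "
      f"ethanol about {pred.max():.2f}% (v/v)")
```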

Keywords: Bio-ethanol, pineapple peel extract, Response Surface Methodology (RSM), Saccharomyces ellipsoideus.

1083 Optimal Allocation Between Subprime Structured Mortgage Products and Treasuries

Authors: MP. Mulaudzi, MA. Petersen, J. Mukuddem-Petersen , IM. Schoeman, B. de Waal, JM. Manale

Abstract:

This conference paper discusses a risk allocation problem for subprime investing banks involving investment in subprime structured mortgage products (SMPs) and Treasuries. In order to solve this problem, we develop a Lévy process-based model of jump diffusion-type for investment choice in subprime SMPs and Treasuries. This model incorporates subprime SMP losses for which credit default insurance in the form of credit default swaps (CDSs) can be purchased. In essence, we solve a mean swap-at-risk (SaR) optimization problem for investment which determines optimal allocation between SMPs and Treasuries subject to credit risk protection via CDSs. In this regard, SaR is indicative of how much protection investors must purchase from swap protection sellers in order to cover possible losses from SMP default. Here, SaR is defined in terms of value-at-risk (VaR). Finally, we provide an analysis of the aforementioned optimization problem and its connections with the subprime mortgage crisis (SMC).

Keywords: Investors, Jump Diffusion Process, Structured Mortgage Products, Treasuries, Credit Risk, Credit Default Swaps, Tranching Risk, Counterparty Risk, Value-at-Risk, Swaps-at-Risk, Subprime Mortgage Crisis.

1082 Optimization of Surface Roughness in Additive Manufacturing Processes via Taguchi Methodology

Authors: Anjian Chen, Joseph C. Chen

Abstract:

This paper studies a case in which the targeted surface roughness of a fused deposition modeling (FDM) additive manufacturing process is improved. The aim is to reduce or eliminate defects and to improve the process capability indices Cp and Cpk of the FDM additive manufacturing process; the baseline Cp is 0.274 and the baseline Cpk is 0.654. This research utilizes the Taguchi methodology to eliminate defects and improve the process. The Taguchi method is used to optimize the printing parameters that affect the targeted surface roughness of FDM additive manufacturing. A Taguchi L9 orthogonal array is used to organize the effect of the parameters (four controllable parameters and one non-controllable parameter) on the FDM additive manufacturing process. The four controllable parameters are nozzle temperature [°C], layer thickness [mm], nozzle speed [mm/s], and extruder speed [%]. The non-controllable parameter is the environmental temperature [°C]. After optimization of the parameters, a confirmation part was printed to prove that the results reduce the amount of defects and improve the process capability index Cp from 0.274 to 1.605 and Cpk from 0.654 to 1.233 for the FDM additive manufacturing process. The final results confirmed that the Taguchi methodology is sufficient to improve the surface roughness of the FDM additive manufacturing process.
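
The process capability indices quoted above follow from the standard Cp and Cpk formulas, as the short sketch below shows with made-up surface-roughness measurements and specification limits:

```python
# Sketch: computing Cp and Cpk for a surface-roughness characteristic.
# Measurements and specification limits below are illustrative placeholders.
import numpy as np

roughness = np.array([6.1, 6.4, 5.9, 6.3, 6.2, 6.0, 6.5, 6.1, 6.2, 6.3])  # µm
LSL, USL = 5.0, 7.0   # hypothetical specification limits (µm)

mu = roughness.mean()
sigma = roughness.std(ddof=1)

cp = (USL - LSL) / (6 * sigma)                       # potential capability
cpk = min(USL - mu, mu - LSL) / (3 * sigma)          # actual capability (accounts for centering)
print(f"Cp = {cp:.3f}, Cpk = {cpk:.3f}")
```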

Keywords: Additive manufacturing, fused deposition modeling, surface roughness, Six-Sigma, Taguchi method, 3D printing.

1081 A Generic Middleware to Instantly Sync Intensive Writes of Heterogeneous Massive Data via Internet

Authors: Haitao Yang, Zhenjiang Ruan, Fei Xu, Lanting Xia

Abstract:

Industry data centers often need to sync data changes reliably and instantly across a large number of heterogeneous autonomous relational databases accessed via the not-so-reliable Internet, for which a practical generic sync middleware with low maintenance and operation costs is much needed. To meet this demand, this paper presents a generic sync middleware system (GSMS), which has been developed, applied and optimized since 2006. Its guiding principles, and advantages, are that it is SyncML-compliant and transparent to the data application layer logic without referring to the implementation details of the synced databases, that it does not rely on the operating systems of the host computers on which it is deployed, and that its construction is lightweight and hence of low cost. Regarding these hard commitments of developing GSMS, in this paper we stress the significant optimization breakthrough that the GSMS sync delay is well below a fraction of a millisecond per synced record. A series of ultimate tests of GSMS sync performance was conducted as a persuasive example, in which the source relational database underwent a broad range of write loads (from one thousand to one million intensive writes within a few minutes). All these tests showed that the performance of GSMS is competent and smooth even under ultimate write loads.

Keywords: Heterogeneous massive data, instantly sync intensive writes, Internet generic middleware design, optimization.

1080 Behavior and Strength of Slab-Edge Beam-Column Connections under Shear Force and Moment

Authors: Omar M. Ben-Sasi

Abstract:

A total of fourteen slab-edge beam-column connection specimens were tested gradually to failure under the simultaneous action of shear force and moment. The objective was to investigate the influence of several parameters thought to be important for the behavior and strength of slab-column connections with edge beams encountered in flat slab flooring and roofing systems. The parameters included the existence and strength of the edge beam, the depth and width of the edge beam, the steel reinforcement ratio of the slab, the ratio of moment to shear force, and the existence of openings in the region next to the column.

Results obtained demonstrated the importance of the studied parameters on the strength and behavior of slab-column connections with edge beams.

Keywords: Strength, flat slab, slab-column connections, shear force, moment, behavior.

1079 PET/CT Patient Dosage Assay

Authors: Gulten Yilmaz, A. Beril Tugrul, Mustafa Demir, Dogan Yasar, Bayram Demir, Bulent Buyuk

Abstract:

Positron Emission Tomography (PET) is a radioisotope imaging technique that depicts the organs and metabolism of the human body. The technique is based on the simultaneous detection of 511 keV annihilation photons, produced when positrons emitted by positron-emitting radioisotopes incorporated into biologically active molecules in the body annihilate with electrons. This study was conducted on ten patients as a patient-related experimental study. Dose monitoring for the bladder, the organ that received the highest dose during PET applications, was conducted for 24 hours. An assessment based on measuring the activity excreted in urine after injection was also part of this study. The MIRD method was used to perform dose calculations for the results obtained from the experimental studies. The experimentally and theoretically obtained results were then assessed comparatively.

Keywords: PET/CT, TLD, MIRD, Dose measurement, Patient doses.

1078 Directional Drilling Optimization by Non-Rotating Stabilizer

Authors: Eisa Noveiri, Adel Taheri Nia

Abstract:

The Non-Rotating Adjustable Stabilizer / Directional Solution (NAS/DS) imitates a mechanical process or object in a directional drilling operation, responding mathematically and graphically to data so that the best conditions can be chosen relative to the previous mode. The NAS/DS Auto Guide rotary steerable tool is undergoing final field trials. The point-the-bit tool can use any bit, work at any rotating speed, work with any MWD/LWD system, and causes no pressure drop through the tool. It is a fully closed-loop system that automatically maintains a specified curvature rate. The Non-Rotating Adjustable Stabilizer (NAS) controls the curvature rate by exact positioning and can be run with the optimum bit, use the most effective weight on bit (WOB) and rotary speed (RPM), and apply all of the available hydraulic energy to the bit. The directional simulator allows the curvature-rate performance errors of the NAS tool and the magnitude of the random errors in the survey measurements to be specified; this is called the Directional Solution (DS). The combination of these technologies (NAS/DS) will provide smoother boreholes, reduced drilling time, reduced drilling cost and remarkable targeting precision. The simulator controls the curvature rate by precisely adjusting the radial extension of the stabilizer blades on a near-bit Non-Rotating Stabilizer, and the control process corrects for the secondary effects caused by formation characteristics, bit and tool wear, and manufacturing tolerances.

Keywords: Non-rotating adjustable stabilizer, simulator, directional drilling, optimization, oil well drilling.

1077 Strategic Mine Planning: A SWOT Analysis Applied to KOV Open Pit Mine in the Democratic Republic of Congo

Authors: Patrick May Mukonki

Abstract:

The KOV pit (Kamoto Oliveira Virgule) is located 10 km from Kolwezi, one of the mineral-rich towns in the Lualaba province of the Democratic Republic of Congo. The KOV pit is currently operated by Katanga Mining Limited (KML), a joint venture between Glencore and Gecamines (a state-owned company). Recently, the mine optimization process provided a life of mine of approximately 10 years with nine pushbacks using the Datamine NPV Scheduler software. In previous KOV pit studies, we outlined the impact of the accuracy of the geological information on a long-term mine plan for a big copper mine such as the KOV pit. The approach taken discussed three main scenarios and outlined some weaknesses on the geological information side. In this paper, we highlight, as an overview, those weaknesses together with the strengths and opportunities in a global SWOT analysis. The approach taken here is essentially descriptive of the steps followed to optimize the KOV pit; at every step, we categorized the challenges faced so as to obtain a better trade-off between what we called strengths and what we called weaknesses, and the same logic is applied to the opportunities and threats. The SWOT analysis conducted in this paper demonstrates that, despite a generally poor ore body definition and very harsh groundwater conditions, there is room for improvement for such a high-grade ore body.

Keywords: Mine planning, mine optimization, mine scheduling, SWOT analysis.

1076 Internal Loading Distribution in Statically Loaded Ball Bearings, Subjected to a Combined Radial and Thrust Load, Including the Effects of Temperature and Fit

Authors: Mário C. Ricci

Abstract:

A new, rapidly convergent numerical procedure for computing the internal load distribution in statically loaded, single-row, angular-contact ball bearings, subjected to a known combined radial and thrust load applied so as to avoid tilting between the inner and outer rings, is used to find the differences in load distribution between a loaded unfitted bearing at room temperature and the same loaded bearing with interference fits that may experience radial temperature gradients between the inner and outer rings. Each step of the procedure requires the iterative solution of Z + 2 simultaneous nonlinear equations, where Z is the number of balls, to yield exact values for the axial and radial deflections and the contact angles.
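
The core numerical task described above, iteratively solving a system of simultaneous nonlinear equations for the deflections and contact angles, can be illustrated generically with a Newton-type solver on a small stand-in system. The equations below are placeholders, not the bearing equilibrium equations:

```python
# Sketch: solving a small system of simultaneous nonlinear equations iteratively,
# as required at each step of the load-distribution procedure. The system below is
# a generic stand-in, not the actual bearing equilibrium equations.
import numpy as np
from scipy.optimize import fsolve

def residuals(v):
    x, y, z = v
    return [
        x**2 + y**2 + z**2 - 1.0,   # placeholder equilibrium-style equations
        x * y - 0.1,
        np.sin(z) - 0.5 * x,
    ]

solution = fsolve(residuals, x0=[0.5, 0.5, 0.5])
print("solution:", np.round(solution, 6))
print("residuals at solution:", np.round(residuals(solution), 10))
```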

Keywords: Ball, Bearing, Static, Load, Iterative, Numerical, Method, Temperature, Fit.

1075 Autonomic Sonar Sensor Fault Manager for Mobile Robots

Authors: Martin Doran, Roy Sterritt, George Wilkie

Abstract:

NASA, ESA, and NSSC space agencies have plans to put planetary rovers on Mars in 2020. For these future planetary rovers to succeed, they will depend heavily on sensors to detect obstacles. This will also become of vital importance in the future if rovers become less dependent on commands received from Earth-based control and more dependent on self-configuration and self-decision making. These planetary rovers will face harsh environments, and the possibility of hardware failure is high, as seen in past missions. In this paper, we focus on applying Autonomic principles, exploring self-healing, self-optimization, and self-adaption using the MAPE-K model and expanding this model to encapsulate the attributes of Awareness, Analysis, and Adjustment (AAA-3). In the experimentation, a Pioneer P3-DX research robot is used to simulate a planetary rover. The sonar sensors on the P3-DX robot are used to simulate the sensors on a planetary rover (even though, in reality, sonar sensors cannot operate in a vacuum). Experiments using the P3-DX robot focus on how our software system can adapt to the loss of sonar sensor functionality. The autonomic manager system is responsible for deciding how to make use of the remaining 'enabled' sonar sensors to compensate for those that are 'disabled'. The key result of this research is that the robot can still detect objects even with reduced sonar sensor capability.
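
A toy version of the autonomic loop described above, monitor the sonar array, analyse which sensors are disabled, and adapt by re-using the neighbouring enabled sensors, might look like the following. The sensor layout, readings, and compensation rule are assumptions for illustration, not the authors' manager:

```python
# Sketch of a MAPE-K-style loop for sonar fault handling: Monitor readings, Analyse
# which sensors look disabled, Plan/Execute a compensation that reuses neighbouring
# enabled sensors. Layout, readings and the compensation rule are assumptions.

NUM_SONARS = 8                      # ring of sonar sensors, indices 0..7
FAULT_VALUE = None                  # a disabled sensor returns no reading

def monitor():
    # Hypothetical snapshot: sensors 2 and 3 have failed.
    return [2.1, 1.8, FAULT_VALUE, FAULT_VALUE, 3.0, 2.7, 2.5, 2.9]

def analyse(readings):
    return [i for i, r in enumerate(readings) if r is None]

def plan(disabled):
    # For each disabled sensor, rely on its nearest enabled neighbours (ring topology).
    mapping = {}
    for d in disabled:
        left = next((d - k) % NUM_SONARS for k in range(1, NUM_SONARS)
                    if (d - k) % NUM_SONARS not in disabled)
        right = next((d + k) % NUM_SONARS for k in range(1, NUM_SONARS)
                     if (d + k) % NUM_SONARS not in disabled)
        mapping[d] = (left, right)
    return mapping

def execute(readings, compensation):
    estimated = list(readings)
    for d, (l, r) in compensation.items():
        estimated[d] = (readings[l] + readings[r]) / 2.0   # crude interpolation
    return estimated

readings = monitor()
disabled = analyse(readings)
print("disabled sonars:", disabled)
print("compensated readings:", execute(readings, plan(disabled)))
```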

Keywords: Autonomic, self-adaption, self-healing, self-optimization.

1074 An Evaluation of Solubility of Wax and Asphaltene in Crude Oil for Improved Flow Properties Using a Copolymer Solubilized in Organic Solvent with an Aromatic Hydrocarbon

Authors: S. M. Anisuzzaman, Sariah Abang, Awang Bono, D. Krishnaiah, N. M. Ismail, G. B. Sandrison

Abstract:

Wax and asphaltene are high-molecular-weight compounds that contribute to the stability of crude oil in a dispersed state. Transportation of crude oil along pipelines from the oil rig to the refineries causes temperature fluctuations that lead to the coagulation of wax and the flocculation of asphaltenes. This paper focuses on preventing the deposition of wax and asphaltene precipitates on the inner surface of the pipelines by using a wax inhibitor and an asphaltene dispersant. The novelty of this prevention method is the combination of three substances: a wax inhibitor dissolved in a wax inhibitor solvent and an asphaltene solvent, namely, ethylene-vinyl acetate (EVA) copolymer dissolved in methylcyclohexane (MCH) and toluene (TOL), to inhibit the precipitation and deposition of wax and asphaltene. The objective of this paper was to optimize the percentage composition of each component of this inhibitor so as to maximize the viscosity reduction of the crude oil. The optimization was divided into two stages: a laboratory experimental stage, in which the viscosity of crude oil samples containing inhibitor of different component compositions was tested at decreasing temperatures, and a data optimization stage using response surface methodology (RSM) to design an optimizing model. The experimental results showed that the combination of 50% EVA + 25% MCH + 25% TOL gave a maximum viscosity reduction of 67%, while the RSM model showed that the combination of 57% EVA + 20.5% MCH + 22.5% TOL gave a maximum viscosity reduction of up to 61%.

Keywords: Asphaltene, ethylene-vinyl acetate, methylcyclohexane, toluene, wax.

1073 Solving Part Type Selection and Loading Problem in Flexible Manufacturing System Using Real Coded Genetic Algorithms – Part II: Optimization

Authors: Wayan F. Mahmudy, Romeo M. Marian, Lee H. S. Luong

Abstract:

This paper presents the modeling and optimization of two NP-hard problems in flexible manufacturing systems (FMS): the part type selection problem and the loading problem. Due to the complexity and extent of the problems, the work was split into two parts. The first part of the paper discussed the modeling of the problems and showed how real coded genetic algorithms (RCGA) can be applied to solve them. This second part discusses the effectiveness of the RCGA, which uses an array of real numbers as its chromosome representation. The novel proposed chromosome representation produces only feasible solutions, which minimizes the computational time needed by the GA to push its population toward the feasible search space or to repair infeasible chromosomes. The proposed RCGA improves FMS performance by considering two objectives: maximizing system throughput and maintaining the balance of the system (minimizing system unbalance). The resulting objective values are compared to the optimum values produced by the branch-and-bound method. The experiments show that the proposed RCGA can reach near-optimum solutions in a reasonable amount of time.
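
The chromosome idea described above, an array of real numbers decoded so that only feasible selections and loadings are produced, can be sketched roughly as follows. The decoding rule, capacities, and processing times are invented for illustration; the authors' actual operators and objectives are not reproduced here:

```python
# Rough sketch of a real-coded chromosome decoded into a feasible loading: the real
# genes rank the part types, and parts are accepted greedily while machine capacity
# allows, so every chromosome decodes to a feasible solution.
# Processing times, capacities, and the decoding rule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
NUM_PARTS, NUM_MACHINES = 6, 2
proc_time = rng.integers(5, 20, size=NUM_PARTS)        # time needed per part type
capacity = np.array([40, 40])                          # available time per machine

def decode(chromosome):
    order = np.argsort(chromosome)                      # priority ranking from real genes
    remaining = capacity.astype(float).copy()
    selected, assignment = [], {}
    for part in order:
        m = int(np.argmax(remaining))                   # least-loaded machine
        if proc_time[part] <= remaining[m]:
            remaining[m] -= proc_time[part]
            selected.append(int(part))
            assignment[int(part)] = m
    throughput = sum(proc_time[p] for p in selected)    # proxy for system throughput
    unbalance = abs(remaining[0] - remaining[1])        # proxy for system unbalance
    return selected, assignment, throughput, unbalance

chromosome = rng.random(NUM_PARTS)                      # one individual of the RCGA
print(decode(chromosome))
```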

Keywords: Flexible manufacturing system, production planning, part type selection problem, loading problem, real-coded genetic algorithm

1072 Numerical Modeling of Waves and Currents by Using a Hydro-Sedimentary Model

Authors: Mustapha Kamel Mihoubi, Hocine Dahmani

Abstract:

Over recent years much progress has been achieved in the numerical modeling of shoreline processes: waves, currents, and combined waves and currents. However, existing models still have difficulty linking, on the one hand, the hydrodynamics of waves and currents and, on the other, the sediment transport processes, owing to their variability in time and space, their interaction, and the simultaneous action of waves and currents near the shore. This paper establishes a numerical model to forecast sediment transport for development scenarios of a harbor structure. It is based on a numerical simulation of a coupled water-sediment model in 2D using the MIKE 21 (DHI) calculation codes. The aim is to examine the effect of the sediment transport drivers, following the dominant incident wave direction at the harbor entrance, under different planning variants, in order to identify the technical and economic limitations of sediment transport and to find the optimum solution for the protection of the harbor structure.

Keywords: Swell, current, radiation stress, mesh, MIKE 21, sediment.

1071 Finite Element Analysis of Connecting Rod

Authors: Mohammed Mohsin Ali H., Mohamed Haneef

Abstract:

The connecting rod transmits the piston load to the crank, causing the latter to turn and thus converting the reciprocating motion of the piston into the rotary motion of the crankshaft. Connecting rods are subjected to forces generated by mass and fuel combustion. This study investigates and compares the fatigue behavior of forged steel, powder-forged and ASTM A514 cold-quenched steel connecting rods. The objective is to suggest a new material with reduced weight and cost and increased fatigue life. This has entailed performing a detailed load analysis. Therefore, this study addresses two subjects: first, dynamic load and stress analysis of the connecting rod, and second, optimization for material, weight and cost. In the first part of the study, the loads acting on the connecting rod as a function of time were obtained. Based on the observations of the dynamic FEA, the static FEA, and the load analysis results, the load for the optimization study was selected. This study concludes that the connecting rod can be designed and optimized under a load range comprising a tensile load and a compressive load. The tensile load corresponds to a 360° crank angle at the maximum engine speed, and the compressive load corresponds to the peak gas pressure. Furthermore, the existing connecting rod can be replaced with a new connecting rod made of ASTM A514 cold-quenched steel that is 12% lighter and 28% cheaper.

Keywords: Connecting rod, ASTM A514 cold-quenched steel, static analysis, fatigue analysis, stress life approach.

1070 A Multiple-Objective Environmental Rationalization and Optimization for Material Substitution in the Production of Stone-Washed Jeans Garments

Authors: Nabil A. Ibrahim, Nabil M. Abdel Moneim, Mohamed A. Ramadan, Marwa M. Hosni

Abstract:

As the textile industry is the second largest industry in Egypt, and as small and medium-sized enterprises (SMEs) make up a great portion of this industry, it is essential to apply the concept of Cleaner Production in order to reduce pollution. To achieve this goal, a case study concerned with eco-friendly stone-washing of jeans garments was investigated. A raw-material substitution option was adopted, whereby the toxic potassium permanganate and sodium sulfide were replaced by the environmentally compatible hydrogen peroxide and glucose, respectively, and the concentrations of both replacement chemicals together with the operating time were optimized. In addition, a process-rationalization option involving four additional processes was investigated. By means of criteria such as product quality, effluent analysis, mass and heat balance, and cost analysis, and with the aid of a statistical model, a process optimization treatment revealed that the superior process optima were 50%, 0.15% and 50 min for H2O2 concentration, glucose concentration and time, respectively. With these values, the superior process should reduce the annual cost by about EGP 105 relative to the currently used conventional method.

Keywords: Cleaner Production, Eco-friendly of jeans garments, Stone washing, Textile Industry, Textile Wet Processing.

1069 Osmotic Dehydration of Beetroot in Salt Solution: Optimization of Parameters through Statistical Experimental Design

Authors: P. Manivannan, M. Rajasimman

Abstract:

Response surface methodology was used for a quantitative investigation of water and solids transfer during osmotic dehydration of beetroot in an aqueous salt solution. The effects of temperature (25-45 °C), processing time (30-150 min), salt concentration (5-25%, w/w) and solution-to-sample ratio (5:1-25:1) on the osmotic dehydration of beetroot were estimated. Quadratic regression equations describing the effects of these factors on water loss and solids gain were developed. It was found that the effects of temperature and salt concentration on water loss were more significant than those of processing time and solution-to-sample ratio, whereas for solids gain, processing time and salt concentration were the most significant factors. The osmotic dehydration process was optimized for water loss, solute gain, and weight reduction. The optimum conditions were found to be: temperature 35 °C, processing time 90 min, salt concentration 14.31% and solution-to-sample ratio 8.5:1. At these optimum values, water loss, solids gain and weight reduction were 30.86, 9.43 and 21.43 (g/100 g initial sample), respectively.

Keywords: Optimization, osmotic dehydration, beetroot, salt solution, response surface methodology.

1068 Methodology: A Review in Modelling and Predictability of Embankment in Soft Ground

Authors: Bhim Kumar Dahal

Abstract:

Transportation network development in developing countries is proceeding at a rapid pace. The majority of such networks consist of railways and expressways, which pass through diverse topography, landforms and geological conditions despite the avoidance principle applied during route selection. Construction of such networks demands many low to high embankments, which require improvement of the foundation soil. This paper focuses on the various advanced ground improvement techniques used to improve soft soil, on the modelling approaches, and on their predictability for embankment construction. The ground improvement techniques can be broadly classified into three groups, i.e. the densification group, the drainage and consolidation group, and the reinforcement group, which are discussed with some case studies. Various methods have been used to model embankments, from simple one-dimensional to complex three-dimensional models, using a variety of constitutive models. However, the reliability of the predictions is not found to improve systematically with the level of sophistication, and sometimes the predictions deviate by more than 60% from the monitored values despite using the same level of sophistication. This deviation is found to be due mainly to the selection of the constitutive model, the assumptions made at different stages, deviations in the selection of model parameters, and simplifications made during physical modelling of the ground conditions. The deviation can be reduced by using an optimization process, optimization tools and sensitivity analysis of the model parameters, which guide the selection of appropriate model parameters.

Keywords: Embankment, ground improvement, modelling, model prediction.

1067 Detecting Geographically Dispersed Overlay Communities Using Community Networks

Authors: Madhushi Bandara, Dharshana Kasthurirathna, Danaja Maldeniya, Mahendra Piraveenan

Abstract:

Community detection is an extremely useful technique for understanding the structure and function of a social network. The Louvain algorithm, which is based on the Newman-Girvan modularity optimization technique, is extensively used as a computationally efficient method to extract the communities in social networks. It has been suggested that nodes in close geographical proximity have a higher tendency to form communities. Variants of the Newman-Girvan modularity measure, such as dist-modularity, try to normalize the effect of geographical proximity to extract geographically dispersed communities, at the expense of losing information about the geographically proximate communities. In this work, we propose a method to extract geographically dispersed communities while preserving the information about the geographically proximate communities, by analyzing the ‘community network’, in which the centroids of communities are considered as network nodes. We suggest that the inter-community link strengths, normalized over the community sizes, may be used to identify and extract the ‘overlay communities’. The overlay communities have relatively higher link strengths, despite being relatively far apart in their spatial distribution. We apply this method to the Gowalla online social network, which contains the geographical signatures of its users, and identify the overlay communities within it.
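
A rough sketch of the ‘community network’ idea, detect base communities, collapse each to a node, and normalize inter-community link strengths by community sizes, is shown below using networkx. The toy graph, the size normalization, and the overlay threshold are assumptions about the method, not the authors' exact formulation:

```python
# Sketch: build a 'community network' whose nodes are detected communities and whose
# edge weights are inter-community link counts normalized by community sizes.
# The toy graph, the normalization, and the overlay threshold are assumptions.
import itertools
import networkx as nx
from networkx.algorithms.community import louvain_communities

G = nx.karate_club_graph()                       # stand-in for a geo-tagged network
communities = louvain_communities(G, seed=42)    # list of node sets

C = nx.Graph()
C.add_nodes_from(range(len(communities)))
for a, b in itertools.combinations(range(len(communities)), 2):
    links = sum(1 for u, v in G.edges()
                if (u in communities[a]) != (v in communities[a])
                and {u, v} <= communities[a] | communities[b])
    if links:
        # Normalize by the product of community sizes (assumed normalization).
        C.add_edge(a, b, weight=links / (len(communities[a]) * len(communities[b])))

THRESHOLD = 0.05                                 # assumed overlay cutoff
overlay = [(a, b, round(d["weight"], 3))
           for a, b, d in C.edges(data=True) if d["weight"] > THRESHOLD]
print("community sizes:", [len(c) for c in communities])
print("strong inter-community links (candidate overlays):", overlay)
```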

Keywords: Social networks, community detection, modularity optimization, geographically dispersed communities.

1066 Resource Leveling Optimization in Construction Projects of High Voltage Substations Using Nature-Inspired Intelligent Evolutionary Algorithms

Authors: Dimitrios Ntardas, Alexandros Tzanetos, Georgios Dounias

Abstract:

High Voltage Substations (HVS) are the intermediate step between the production of power and successfully transmitting it to clients, making them one of the most important checkpoints in power grids. Nowadays, as renewable resources and consequently distributed generation are growing fast, the construction of HVS is of high importance both in terms of quality and completion time, so that new energy producers can quickly and safely integrate into power grids. The resources needed, such as machines and workers, should be carefully allocated so that the construction of an HVS is completed on time, at the lowest possible cost (e.g., avoiding additional costs, not originally taken into consideration, caused by project delays), and at the highest quality. In addition, there are milestones and several checkpoints to be precisely achieved during construction to ensure cost and timeline control and to ensure that the percentage of governmental funding will be granted. The management of such a demanding project is an NP-hard problem consisting of prerequisite constraints and resource limits for each task of the project. In this work, a hybrid meta-heuristic method is implemented to solve this problem. Meta-heuristics have been proven to be quite useful when dealing with high-dimensional constrained optimization problems, and hybridizing them boosts their performance.

Keywords: High voltage substations, nature-inspired algorithms, project management, meta-heuristics.

1065 Modeling and Optimization of Part Type Selection and Loading Problem in Flexible Manufacturing System Using Real Coded Genetic Algorithms

Authors: Wayan F. Mahmudy, Romeo M. Marian, Lee H. S. Luong

Abstract:

This paper deals with the modeling and optimization of two NP-hard problems in the production planning of flexible manufacturing systems (FMS): the part type selection problem and the loading problem. The part type selection problem and the loading problem are strongly related and heavily influence the system's efficiency and productivity. These problems have been modeled and solved simultaneously by using real coded genetic algorithms (RCGA), which use an array of real numbers as the chromosome representation. The novel proposed chromosome representation produces only feasible solutions, which minimizes the computational time needed by the GA to push its population toward the feasible search space or to repair infeasible chromosomes. The proposed RCGA improves FMS performance by considering two objectives: maximizing system throughput and maintaining the balance of the system (minimizing system unbalance). The resulting objective values are compared to the optimum values produced by the branch-and-bound method. The experiments show that the proposed RCGA can reach near-optimum solutions in a reasonable amount of time.

Keywords: Flexible manufacturing system, production planning, part type selection problem, loading problem, real-coded genetic algorithm.

1064 Implementation of a Multimodal Biometrics Recognition System with Combined Palm Print and Iris Features

Authors: Rabab M. Ramadan, Elaraby A. Elgallad

Abstract:

With their extensive application, unimodal biometric systems have to face a diversity of problems such as signal and background noise, distortion, and environmental differences. Therefore, multimodal biometric systems have been proposed to solve these problems. This paper introduces a bimodal biometric recognition system based on features extracted from the human palm print and iris. Palm print biometrics is a fairly new, evolving technology that is used to identify people by their palm features. The iris is a strong competitor, together with the face and fingerprints, for inclusion in multimodal recognition systems. In this research, we introduce an algorithm for combining the palm- and iris-extracted features using a texture-based descriptor, the Scale Invariant Feature Transform (SIFT). Since the feature sets are non-homogeneous, as features of different biometric modalities are used, these features are concatenated to form a single feature vector. Particle swarm optimization (PSO) is used as a feature selection technique to reduce the dimensionality of the feature vector. The proposed algorithm is applied to the Indian Institute of Technology Delhi (IITD) database and its performance is compared with various iris recognition algorithms found in the literature.
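
A compact sketch of using PSO for feature selection over a concatenated feature vector, thresholding each particle's position into a binary feature mask and scoring it with a classifier, is given below. The dataset, fitness function, and PSO parameters are illustrative assumptions, not the authors' configuration:

```python
# Sketch: feature selection via particle swarm optimization. Each particle's position
# is thresholded into a feature mask and scored by a simple classifier.
# Dataset, fitness, and PSO parameters are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)                   # stand-in for concatenated features
n_feat, n_particles, n_iter = X.shape[1], 12, 15

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(3), X[:, mask], y, cv=3).mean()
    return acc - 0.002 * mask.sum()                   # small penalty on feature count

pos = rng.random((n_particles, n_feat))
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p > 0.5) for p in pos])
gbest = pbest[np.argmax(pbest_fit)].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    fit = np.array([fitness(p > 0.5) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[np.argmax(pbest_fit)].copy()

print("selected features:", int((gbest > 0.5).sum()), "of", n_feat)
```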

Keywords: Iris recognition, particle swarm optimization, feature extraction, feature selection, palm print, scale invariant feature transform.

1063 Automatic Inspection of Percussion Caps by Means of Combined 2D and 3D Machine Vision Techniques

Authors: A. Tellaeche, R. Arana, I.Maurtua

Abstract:

Exhaustive quality control is becoming more and more important when commercializing competitive products in the world's globalized market. Taking this affirmation as an undeniable truth, it becomes critical in certain market sectors that must meet the tightest quality restrictions. One of these examples is the mass production of percussion caps, a critical element assembled in firearm ammunition. These elements, built in great quantities at very high speed, must be fabricated within minimal tolerance deviations, due to their vital importance in firing the piece of ammunition in which they are assembled. This paper outlines a machine vision development for the 100% inspection of percussion caps obtaining data from simultaneous 2D and 3D images. The acquisition speed and precision of these images of a reflective metallic piece such as a percussion cap, the accuracy of the measurements taken from these images, and the multiple fabrication errors detected constitute the main findings of this work.

Keywords: Critical tolerance, high speed decision making, simultaneous 2D/3D machine vision.

1062 2D Gabor Functions and FCMI Algorithm for Flaws Detection in Ultrasonic Images

Authors: Kechida Ahmed, Drai Redouane, Khelil Mohamed

Abstract:

In this paper we present a new approach to detecting flaws in T.O.F.D. (Time Of Flight Diffraction) ultrasonic images based on texture features. Texture is one of the most important features used for recognizing patterns in an image. The paper describes texture features based on 2D Gabor functions, i.e., Gaussian-shaped band-pass filters with dyadic treatment of the radial spatial frequency range and multiple orientations, which represent an appropriate choice for tasks requiring simultaneous measurement in both the space and frequency domains. The most relevant features are used as input data for a fuzzy c-means clustering classifier. Only two classes exist: 'defect' or 'no defect'. The proposed approach is tested on T.O.F.D. images acquired in the laboratory and in the industrial field.

Keywords: 2D Gabor Functions, flaw detection, fuzzy c-mean clustering, non destructive testing, texture analysis, T.O.F.D Image (Time of Flight Diffraction).
