Search results for: soft particle.
715 A Study on the Assessment of Prosthetic Infection after Total Knee Replacement Surgery
Authors: Chang, Chun-Lang, Liu, Chun-Kai
Abstract:
This study uses, as its research subjects, patients who had undergone total knee replacement surgery, drawn from the database of the National Health Insurance Administration. Important factors were selected after careful screening, through a review of the literature and interviews with physicians. The weight of each factor was then calculated using the Cross Entropy Method, Genetic Algorithm Logistic Regression, and Particle Swarm Optimization. In addition, Excel VBA and Case Based Reasoning were combined to evaluate the system. Results show no significant difference between Genetic Algorithm Logistic Regression and Particle Swarm Optimization, with over 97% accuracy for both methods. Both ROC areas are above 0.87. This study can serve as a critical clinical-assessment reference for medical personnel, effectively enhancing the quality and efficiency of medical care, preventing unnecessary waste, and offering practical advantages for resource allocation in medical institutions.
Keywords: Total knee replacement, Case Based Reasoning, Cross Entropy Method, Genetic Algorithm Logistic Regression, Particle Swarm Optimization.
714 Meteorological Data Study and Forecasting Using Particle Swarm Optimization Algorithm
Authors: S. Esfandeh, M. Sedighizadeh
Abstract:
Weather systems use enormously complex combinations of numerical tools for study and forecasting. Unfortunately, due to phenomena in the world climate, such as the greenhouse effect, classical models may become insufficient, mostly because they lack adaptation. The weather forecasting problem is therefore well suited to heuristic approaches, such as Evolutionary Algorithms. Experimentation with heuristic methods like the Particle Swarm Optimization (PSO) algorithm can lead to new insights or promising models that can be fine-tuned with more focused techniques. This paper describes a PSO approach for the analysis and prediction of data and provides experimental results of the aforementioned method on real-world meteorological time series.
Keywords: Weather, Climate, PSO, Prediction, Meteorological.
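As a point of reference for the many PSO-based studies in this list, the following is a minimal sketch of the standard global-best PSO update loop applied to a toy forecasting objective; the bounds, coefficients (w, c1, c2), and the toy time series are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def pso_minimize(objective, bounds, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO; 'objective' maps an array of shape (dim,) to a scalar."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T        # bounds: list of (low, high) per dimension
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))  # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest, pbest_val = x.copy(), np.array([objective(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()              # global best position
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy example: fit a one-step linear predictor to a synthetic "temperature" series.
series = np.sin(np.linspace(0, 6, 60)) + 0.1 * np.random.default_rng(1).standard_normal(60)
def forecast_error(params):
    a, b = params
    pred = a * series[:-1] + b
    return float(np.mean((series[1:] - pred) ** 2))

best_params, best_err = pso_minimize(forecast_error, bounds=[(-2, 2), (-2, 2)])
```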
713 A Distributed Group Mutual Exclusion Algorithm for Soft Real Time Systems
Authors: Abhishek Swaroop, Awadhesh Kumar Singh
Abstract:
The group mutual exclusion (GME) problem is an interesting generalization of the mutual exclusion problem. Several solutions to the GME problem have been proposed for message-passing distributed systems. However, none of these solutions is suitable for real-time distributed systems. In this paper, we propose a token-based distributed algorithm for the GME problem in soft real-time distributed systems. The algorithm uses the concepts of a priority queue, a dynamic request set, and the process state. It uses a first-come, first-served approach when selecting the next session type among requests of the same priority level and satisfies the concurrent occupancy property: all n processes are allowed inside their critical sections (CS) provided they request the same session. The performance analysis and correctness proof of the algorithm are also included in the paper.
Keywords: Concurrency, Group mutual exclusion, Priority, Request set, Token.
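The following is a simplified, centralized illustration of the two selection rules named in the abstract (first-come, first-served among requests of equal priority, and concurrent occupancy for requests of the same session); it is not the authors' token-based distributed protocol, and the Request fields and example data are assumptions.

```python
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Request:
    priority: int                       # lower value = higher priority (heapq is a min-heap)
    arrival: int                        # FCFS tie-break within the same priority level
    session: str = field(compare=False)
    pid: int = field(compare=False)

def grant_next_session(pending):
    """Pick the next session (highest priority, FCFS among equal priorities) and
    grant the CS concurrently to every pending request for that same session."""
    if not pending:
        return None, []
    heapq.heapify(pending)
    leader = heapq.heappop(pending)
    admitted = [leader] + [r for r in pending if r.session == leader.session]
    pending[:] = [r for r in pending if r.session != leader.session]
    return leader.session, admitted

# Illustrative use: three processes, two sessions; p2 has the highest priority.
queue = [Request(1, 2, "read", 1), Request(0, 3, "write", 2), Request(1, 1, "read", 3)]
session, admitted = grant_next_session(queue)   # grants "write" first; the two "read"
                                                # requests would later enter concurrently
```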
712 Lattice Boltzmann Simulation of the Carbonization of Wood Particle
Authors: Ahmed Mahmoudi, Imen Mejri, Mohamed A. Abbassi, Ahmed Omri
Abstract:
A numerical study based on the Lattice Boltzmann Method (LBM) is proposed to solve one-, two- and three-dimensional heat and mass transfer for the isothermal carbonization of thick wood particles. To check the validity of the proposed model, computational results have been compared with published data, and good agreement is obtained. The model is then used to study the effect of the reactor temperature and the thermal boundary conditions on the evolution of the local temperature and the mass distributions of the wood particle during carbonization.
Keywords: Lattice Boltzmann Method, pyrolysis conduction, carbonization, Heat and mass transfer.
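As a minimal illustration of the LBM approach to transient heat conduction, the sketch below solves a one-dimensional diffusion problem with a D1Q2 lattice in lattice units (dx = dt = 1); it is not the authors' multi-dimensional carbonization model, and the diffusivity, grid size, and boundary temperatures are assumed values.

```python
import numpy as np

# Minimal D1Q2 lattice Boltzmann sketch for 1-D transient heat conduction (illustrative only).
n, alpha, n_steps = 100, 0.25, 2000
tau = alpha + 0.5                 # Chapman-Enskog relation for D1Q2: alpha = (tau - 1/2)
omega = 1.0 / tau
w = np.array([0.5, 0.5])          # weights for lattice velocities +1 and -1

T = np.zeros(n)                   # initially cold slab
f = np.outer(w, T)                # f[0]: right-moving, f[1]: left-moving populations
T_left, T_right = 1.0, 0.0        # fixed wall temperatures (hot / cold)

for _ in range(n_steps):
    T = f.sum(axis=0)
    feq = np.outer(w, T)          # equilibrium distribution
    f += omega * (feq - f)        # BGK collision
    f[0, 1:] = f[0, :-1]          # streaming: right-moving population shifts right
    f[1, :-1] = f[1, 1:]          # streaming: left-moving population shifts left
    f[0, 0] = T_left - f[1, 0]    # close the unknown incoming population at the hot wall
    f[1, -1] = T_right - f[0, -1] # close the unknown incoming population at the cold wall

T = f.sum(axis=0)                 # final temperature profile along the slab
```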
711 Multiple Peaks Tracking Algorithm using Particle Swarm Optimization Incorporated with Artificial Neural Network
Authors: Mei Shan Ngan, Chee Wei Tan
Abstract:
Due to the non-linear characteristics of the photovoltaic (PV) array, PV systems are typically equipped with a maximum power point tracking (MPPT) feature. Moreover, when the PV array is partially shaded, hotspots can occur that may damage the PV cells, and partial shading causes multiple peaks in the P-V characteristic curves. This paper presents a hybrid Particle Swarm Optimization (PSO) and Artificial Neural Network (ANN) MPPT algorithm for detecting the global peak among the multiple peaks in order to extract the true maximum power from the PV panel. The PV system consists of a PV array, a dc-dc boost converter controlled by the proposed MPPT algorithm, and a resistive load. The system was simulated using the MATLAB/Simulink package. The simulation results show that the proposed algorithm performs well in detecting the true global peak power. The results of the simulations are analyzed and discussed.
Keywords: Photovoltaic (PV), Partial Shading, Maximum Power Point Tracking (MPPT), Particle Swarm Optimization (PSO), Artificial Neural Network (ANN).
710 Performance Analysis of Reconstruction Algorithms in Diffuse Optical Tomography
Authors: K. Uma Maheswari, S. Sathiyamoorthy, G. Lakshmi
Abstract:
Diffuse Optical Tomography (DOT) is a non-invasive imaging modality used in clinical diagnosis for the earlier detection of carcinoma cells in brain tissue. It is a form of optical tomography that produces a reconstructed image of human soft tissue using near-infrared light. It comprises two steps, called the forward model and the inverse model. The forward model describes the light propagation in a biological medium, while the inverse model uses the scattered light to recover the optical parameters of the tissue. DOT suffers from severe ill-posedness due to its incomplete measurement data, so accurate analysis of this modality is very complicated. To overcome this problem, optical properties of the soft tissue such as the absorption coefficient, scattering coefficient, and optical flux are processed by the standard Levenberg-Marquardt regularization technique. The reconstruction algorithms Split Bregman and Gradient Projection for Sparse Reconstruction (GPSR) are used to reconstruct the image of human soft tissue for tumour detection. Among these algorithms, the Split Bregman method provides better performance than the GPSR algorithm. Parameters such as signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), relative error (RE), and CPU time for reconstructing the images are analyzed to assess performance.
Keywords: Diffuse optical tomography, ill-posedness, Levenberg-Marquardt method, Split Bregman, gradient projection for sparse reconstruction.
709 Characterization of HD-V2 Gafchromic Film for Measurement of Spatial Dose Distribution from Alpha Particle of 5.5 MeV
Authors: A. Aydarous, M. El Ghazaly
Abstract:
The purpose of this study was to investigate the response of the newly released Gafchromic HD-V2 film to 5.5 MeV alpha particles. Gafchromic HD-V2 was exposed to 5.5 MeV alpha particles from 241Am for different durations, and the films were then scanned with a flatbed scanner. A dose response curve up to 2200 Gy has been achieved, and the film's reproducibility and sensitivity were evaluated. The results obtained show that the net optical density increases almost exponentially with increasing exposure time and becomes saturated after prolonged exposures. The red channel shows the highest sensitivity, with a value of 4 × 10⁻³ Gy⁻¹ at a net OD of 0.4. The inter-film reproducibility was measured, and the relative uncertainties found were 1.7%, 2.1%, and 2.3% for the grey, red, and green channels, respectively.
Keywords: Alpha dosimetry, 241Am, Gafchromic film.
708 Generator Capability Curve Constraint for PSO Based Optimal Power Flow
Authors: Mat Syai'in, Adi Soeprijanto, Takashi Hiyama
Abstract:
An optimal power flow (OPF) based on particle swarm optimization (PSO) was developed with a more realistic generator security constraint, using the capability curve instead of only the Pmin/Pmax and Qmin/Qmax limits. A neural network (NN) was used to design the digital capability curve and the security check algorithm. The algorithm is very simple and flexible, especially for representing the nonlinear generation operating limits near the steady-state stability limit and in the under-excitation operating area. To avoid local optimal power flow solutions, the particle swarm optimization was implemented with a sufficiently widespread initial population. The objective function used in the optimization process is the electricity production cost, which is dominated by the fuel cost. The proposed method was applied to the Java-Bali 500 kV power system, which consists of 7 generators and 20 buses. The simulation results show that the generator power output combination obtained with the proposed method was more economical than the result obtained with the conventional constraint, although it operated at a more marginal operating point.
Keywords: Optimal Power Flow, Generator Capability Curve, Particle Swarm Optimization, Neural Network.
707 Hydrodynamic Simulation of Fixed Bed GTL Reactor Using CFD
Authors: Sh. Shahhosseini, S. Alinia, M. Irani
Abstract:
In this work, an axisymmetric simulation of a fixed bed GTL reactor has been conducted using computational fluid dynamics (CFD). In fixed bed CFD modeling, when N (the tube-to-particle diameter ratio) has a large value, it is common to treat the packed bed as a porous medium. Synthesis gas (a mixture of predominantly carbon monoxide and hydrogen) was fed to the reactor. The reactor length was 20 cm, divided into three sections, with the porous zone in the middle section. The model equations were solved using the finite volume method. The effects of particle diameter, bed voidage, fluid velocity, and bed length on the pressure drop have been investigated. Simulation results showed that these parameters can have a remarkable impact on the reactor pressure drop.
Keywords: GTL Process, Fixed bed reactor, Pressure drop, CFD simulation.
706 Transmission Lines Loading Enhancement Using ADPSO Approach
Authors: M. Mahdavi, H. Monsef, A. Bagheri
Abstract:
Discrete particle swarm optimization (DPSO) is a powerful stochastic evolutionary algorithm used to solve large-scale, discrete, and nonlinear optimization problems. However, it has been observed that the standard DPSO algorithm suffers from premature convergence when solving a complex optimization problem such as transmission expansion planning (TEP). To resolve this problem, an advanced discrete particle swarm optimization (ADPSO) is proposed in this paper. The simulation results show that, from the viewpoint of precision, the optimization of line loading in transmission expansion planning with ADPSO is better than with DPSO.
Keywords: ADPSO, TEP problem, Lines loading optimization.
705 Resveratrol Incorporated Liposomes Prepared from Pegylated Phospholipids and Cholesterol
Authors: Mont Kumpugdee-Vollrath, Khaled Abdallah
Abstract:
Liposomes and PEGylated liposomes have long been widely used as drug delivery systems in the pharmaceutical field. Traditionally, however, polyethylene glycol (PEG) was attached to the phospholipid after the liposomes had already been prepared. In this paper, we study the possibility of using phospholipids already conjugated with PEG to prepare the liposomes. The model drug resveratrol was used because it can be applied against different diseases, and cholesterol was applied to stabilize the liposome membrane. The thin-film technique at laboratory scale was used as the preparation method. The liposomes were then characterized by nanoparticle tracking analysis (NTA), photon correlation spectroscopy (PCS), and light microscopy. Stable liposomes could be produced, and the particle sizes after filtration were in the nanometer range. The 2- and 3-chain PEG-phospholipid (PL) resulted in smaller particle sizes than the 4-chain PEG-PL. Liposomes made from PL 90G and cholesterol were stable during storage at 8 °C for 56 days, as the particle sizes measured by PCS remained almost unchanged. There was almost no leakage of resveratrol from the PL 90G/cholesterol liposomes after a 28-day diffusion test in a dialysis tube. All liposomes showed sustained release over the measuring time of 270 min. A maximum release of 16-20% was detected for liposomes from the 2- and 3-chain PEG-PL; the other liposomes released only about 10% of the resveratrol. The release kinetics can be described by the Korsmeyer-Peppas equation.
Keywords: Liposome, NTA, resveratrol, pegylation, cholesterol.
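The Korsmeyer-Peppas model mentioned in the abstract, Mt/M∞ = k·t^n, is commonly fitted by linear regression on a log-log scale; the sketch below shows such a fit on made-up release data (the times and release fractions are illustrative, not taken from the paper).

```python
import numpy as np

# Hypothetical cumulative-release data: time in minutes, fraction of drug released.
t = np.array([15, 30, 60, 90, 120, 180, 270], dtype=float)
released = np.array([0.03, 0.05, 0.08, 0.10, 0.12, 0.15, 0.18])

# Linearize: log(Mt/Minf) = log(k) + n*log(t), then fit by least squares.
mask = released <= 0.60                 # the model is usually applied to the first ~60% of release
n_exp, log_k = np.polyfit(np.log(t[mask]), np.log(released[mask]), 1)
k = np.exp(log_k)
print(f"release exponent n = {n_exp:.2f}, rate constant k = {k:.3f}")
# n near 0.5 is commonly read as Fickian diffusion; larger n suggests anomalous transport
# (the exact thresholds depend on the carrier geometry).
```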
704 Software Effort Estimation Using Soft Computing Techniques
Authors: Parvinder S. Sandhu, Porush Bassi, Amanpreet Singh Brar
Abstract:
Various models have been derived by studying a large number of completed software projects from various organizations and applications, to explore how project size maps to project effort. However, there is still a need to improve the prediction accuracy of these models. Since a neuro-fuzzy system is able to approximate non-linear functions with greater precision, it is used here as a soft computing approach to generate a model by formulating the relationship based on its training. In this paper, the neuro-fuzzy technique is used for software effort estimation modeling on NASA software project data, and the performance of the developed models is compared with the Halstead, Walston-Felix, Bailey-Basili, and Doty models mentioned in the literature.
Keywords: Effort Estimation, Neural-Fuzzy Model, Halstead Model, Walston-Felix Model, Bailey-Basili Model, Doty Model.
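For reference, the classical effort models named in the abstract are usually quoted in the following closed forms (effort in person-months, size in KLOC); the coefficients below are those commonly cited in the effort-estimation literature and should be checked against the original sources.

```python
# Hedged sketch of the classical size-to-effort models (coefficients as commonly quoted).
def halstead(kloc):
    return 0.7 * kloc ** 1.50

def walston_felix(kloc):
    return 5.2 * kloc ** 0.91

def bailey_basili(kloc):
    return 5.5 + 0.73 * kloc ** 1.16

def doty(kloc):
    return 5.288 * kloc ** 1.047   # form usually cited for projects larger than ~9 KLOC

for size in (10, 50, 100):
    print(size, round(halstead(size), 1), round(walston_felix(size), 1),
          round(bailey_basili(size), 1), round(doty(size), 1))
```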
703 Minimum-Fuel Optimal Trajectory for Reusable First-Stage Rocket Landing Using Particle Swarm Optimization
Authors: Kevin Spencer G. Anglim, Zhenyu Zhang, Qingbin Gao
Abstract:
Reusable launch vehicles (RLVs) present a more environmentally-friendly approach to accessing space when compared to traditional launch vehicles that are discarded after each flight. This paper studies the recyclable nature of RLVs by presenting a solution method for determining minimum-fuel optimal trajectories using principles from optimal control theory and particle swarm optimization (PSO). This problem is formulated as a minimum-landing error powered descent problem where it is desired to move the RLV from a fixed set of initial conditions to three different sets of terminal conditions. However, unlike other powered descent studies, this paper considers the highly nonlinear effects caused by atmospheric drag, which are often ignored for studies on the Moon or on Mars. Rather than optimizing the controls directly, the throttle control is assumed to be bang-off-bang with a predetermined thrust direction for each phase of flight. The PSO method is verified in a one-dimensional comparison study, and it is then applied to the two-dimensional cases, the results of which are illustrated.
Keywords: Minimum-fuel optimal trajectory, particle swarm optimization, reusable rocket, SpaceX.
702 Effect of Changing Iron Content and Excitation Frequency on Magnetic Particle Imaging Signal: A Comparative Study of Synomag® Nanoparticles
Authors: Kalthoum Riahi, Max T. Rietberg, Javier Perez y Perez, Corné Dijkstra, Bennie ten Haken, Lejla Alic
Abstract:
Magnetic nanoparticles (MNPs) are widely used to facilitate magnetic particle imaging (MPI), which has the potential to become a leading diagnostic instrument for biomedical imaging. This comparative study assesses the effects of changing iron content and excitation frequency on the point-spread function (PSF), which represents the effect of magnetization reversal. The PSF is quantified by features of interest for MPI, i.e., drive field amplitude and full-width-at-half-maximum (FWHM). A superparamagnetic quantifier (SPaQ) is used to assess the differential magnetic susceptibility of two commercially available MNPs: Synomag®-D50 and Synomag®-D70. For both MNPs, the signal output depends on the increase in drive field frequency and the amount of iron oxide, which might hamper the sensitivity of MPI systems operating at higher frequencies. Nevertheless, Synomag®-D shows clear potential for a stable MPI resolution, especially in the case of the 70 nm version, that is independent of either the drive field frequency or the amount of iron oxide.
Keywords: Magnetic nanoparticles, MNPs, Differential magnetic susceptibility, DMS, Magnetic particle imaging, MPI, magnetic relaxation, Synomag®-D.
701 A Decision Boundary based Discretization Technique using Resampling
Authors: Taimur Qureshi, Djamel A Zighed
Abstract:
Many supervised induction algorithms require discrete data, even though real data often comes in both discrete and continuous formats. Quality discretization of continuous attributes is an important problem that affects the speed, accuracy, and understandability of induction models. Usually, discretization and other types of statistical processing are applied to subsets of the population, as the entire population is practically inaccessible. For this reason, we argue that a discretization performed on a sample of the population is only an estimate of that of the entire population. Most existing discretization methods partition the attribute range into two or several intervals using a single cut point or a set of cut points. In this paper, we introduce a technique that uses resampling (such as the bootstrap) to generate a set of candidate discretization points, thus improving the discretization quality by providing a better estimate with respect to the entire population. The goal of this paper is to observe whether this resampling technique can lead to better discretization points, which opens up a new paradigm for the construction of soft decision trees.
Keywords: Bootstrap, discretization, resampling, soft decision trees.
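A minimal sketch of the resampling idea, assuming a simple two-class setting: each bootstrap replicate yields one candidate cut point, and the collection of candidates approximates the sampling distribution of the discretization point. The midpoint-between-class-means rule used here is only a stand-in for a decision-boundary estimate, not the paper's method.

```python
import numpy as np

def bootstrap_cut_points(values, labels, n_boot=200, seed=0):
    """Resample the training sample and collect one candidate cut point per replicate."""
    rng = np.random.default_rng(seed)
    values, labels = np.asarray(values, float), np.asarray(labels)
    candidates = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(values), len(values))   # sample with replacement
        v, y = values[idx], labels[idx]
        if len(np.unique(y)) < 2:
            continue                                      # skip degenerate replicates
        m0, m1 = v[y == 0].mean(), v[y == 1].mean()
        candidates.append(0.5 * (m0 + m1))                # placeholder boundary estimate
    return np.asarray(candidates)

# Synthetic example: two overlapping classes on one continuous attribute.
x = np.concatenate([np.random.default_rng(1).normal(0, 1, 100),
                    np.random.default_rng(2).normal(2, 1, 100)])
y = np.array([0] * 100 + [1] * 100)
cuts = bootstrap_cut_points(x, y)
# The spread of `cuts` reflects sampling variability; the full set (or its mean)
# can serve as the discretization point(s), e.g. for a soft split.
```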
700 On Discretization of Second-order Derivatives in Smoothed Particle Hydrodynamics
Authors: R. Fatehi, M.A. Fayazbakhsh, M.T. Manzari
Abstract:
Discretization of spatial derivatives is an important issue in meshfree methods, especially when the derivative terms contain non-linear coefficients. In this paper, various methods used for the discretization of second-order spatial derivatives are investigated in the context of Smoothed Particle Hydrodynamics. Three popular forms (i.e. "double summation", "second-order kernel derivation", and "difference scheme") are studied using the one-dimensional unsteady heat conduction equation. To assess these schemes, the transient response to a step-function initial condition is considered. Due to the parabolic nature of the heat equation, one can expect smooth and monotone solutions. It is shown in this paper, however, that regardless of the type of kernel function used and the size of the smoothing radius, the double summation discretization form leads to non-physical oscillations that persist in the solution. Results also show that when a second-order kernel derivative is used, a high-order kernel function should be employed in such a way that the distance of the kernel's inflection point from the origin is less than the nearest particle distance; otherwise, solutions may exhibit oscillations near discontinuities, unlike the "difference scheme", which unconditionally produces monotone results.
Keywords: Heat conduction, Meshfree methods, Smoothed Particle Hydrodynamics (SPH), Second-order derivatives.
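For reference, the "difference scheme" form is commonly written in 1-D as d²T/dx²|i ≈ Σj 2 (mj/ρj)(Ti − Tj) W′(rij)/rij. The sketch below applies it to explicit unsteady heat conduction with a cubic-spline kernel; the particle spacing, smoothing length, and boundary treatment are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def cubic_spline_dWdr(r, h):
    """Derivative dW/dr of the 1-D cubic spline kernel (normalization 2/(3h))."""
    q = r / h
    sigma = 2.0 / (3.0 * h)
    dwdq = np.where(q < 1.0, sigma * (-3.0 * q + 2.25 * q ** 2),
             np.where(q < 2.0, -0.75 * sigma * (2.0 - q) ** 2, 0.0))
    return dwdq / h

n, L, alpha = 101, 1.0, 1.0
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]
h = 1.3 * dx                         # smoothing length
vol = dx                             # m_j / rho_j for a uniform 1-D particle distribution
T = np.zeros(n); T[0] = 1.0          # step condition: hot end particle, cold domain
dt = 0.1 * dx ** 2 / alpha           # conservative explicit time step

for _ in range(500):
    lap = np.zeros(n)
    for i in range(n):
        r = np.abs(x[i] - x)
        mask = (r > 0.0) & (r < 2.0 * h)
        # difference-scheme form: lap_i = sum_j 2 (m_j/rho_j) (T_i - T_j) W'(r_ij) / r_ij
        lap[i] = np.sum(2.0 * vol * (T[i] - T[mask]) * cubic_spline_dWdr(r[mask], h) / r[mask])
    T += dt * alpha * lap
    T[0], T[-1] = 1.0, 0.0           # hold end particles at fixed temperatures
```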
699 Particle Swarm Optimization Based Interconnected Hydro-Thermal AGC System Considering GRC and TCPS
Authors: Banaja Mohanty, Prakash Kumar Hota
Abstract:
This paper presents the performance of particle swarm optimization (PSO) based integral (I) and proportional-integral (PI) controllers for interconnected hydro-thermal automatic generation control (AGC) with generation rate constraint (GRC) and a thyristor controlled phase shifter (TCPS) in series with the tie line. The control strategy of the TCPS provides active control of the system frequency. The conventional integral square error (ISE) objective function is considered, along with another objective function based on the square of the derivative of the frequency deviations of both areas and the change in tie-line power. The aim of the objective function design is to suppress oscillations in the frequency deviations and in the tie-line power. The controller parameters are searched by the PSO algorithm by minimizing the objective functions. The dynamic performance of the I and PI controllers, for both objective functions, is compared with that of a conventionally optimized I controller.
Keywords: Automatic generation control (AGC), Generation rate constraint (GRC), Thyristor control phase shifter (TCPS), Particle swarm optimization (PSO).
698 Impact of Ship Traffic to PM2.5 and Particle Number Concentrations in Three Port-Cities of the Adriatic/Ionian Area
Authors: Daniele Contini, Antonio Donateo, Andrea Gambaro, Athanasios Argiriou, Dimitrios Melas, Daniela Cesari, Anastasia Poupkou, Athanasios Karagiannidis, Apostolos Tsakis, Eva Merico, Rita Cesari, Adelaide Dinoi
Abstract:
Emissions of atmospheric pollutants from ships and harbour activities are a growing concern at the international level, given their potential impacts on air quality and climate. These close-to-land emissions also have a potential impact on local communities in terms of air quality and health. Recent studies show that the impact of maritime traffic on atmospheric particulate matter concentrations in several coastal urban areas is comparable with the impact of the road traffic of a medium-sized town. However, several different approaches have been used for these estimates, making a direct comparison of results difficult. In this work, an integrated approach based on emission inventories and dedicated measurement campaigns has been applied to give a comparable estimate of the impact of maritime traffic on PM2.5 and particle number concentrations in three major harbours of the Adriatic/Ionian Seas. The influences of local meteorology and of the logistic layout of the harbours are discussed.
Keywords: Ship emissions, PM2.5, particle number concentrations, impact of shipping to atmospheric aerosol.
697 Solid State Fermentation of Cassava Peel with Trichoderma viride (ATCC 36316) for Protein Enrichment
Authors: Olufunke O. Ezekiel, Ogugua C. Aworh
Abstract:
Solid state fermentation of cassava peel, with emphasis on protein enrichment using Trichoderma viride, was evaluated. The effect of five variables (moisture content, pH, particle size (p), nitrogen source, and incubation temperature) on the true protein and total sugars of cassava peel was investigated. The optimum fermentation period was established to be 8 days. Total sugars were 5-fold higher at pH 6 relative to pH 4, and 7-fold higher when cassava peels were fermented at 30 °C relative to 25 °C, as well as when ammonium sulfate was used as the nitrogen source relative to urea or a combination of both. Total sugars ranged from 123.21 mg/g at 50% initial moisture content to 374 mg/g at 60%, and from 190.59 mg/g with a particle size range of 2.00>p>1.41 mm to 310.10 mg/g with 4.00>p>3.35 mm. True protein ranged from 229.70 mg/g at pH 4 to 284.05 mg/g at pH 6; from 200.87 mg/g with urea as the nitrogen source to 254.50 mg/g with ammonium sulfate; from 213.82 mg/g at 50% initial moisture content to 254.50 mg/g at 60% moisture content; from 205.75 mg/g in cassava peel with 5.6>p>4.75 mm to 268.30 mg/g with particle size 4.00>p>3.35 mm; and from 207.57 mg/g at 25 °C to 254.50 mg/g at 30 °C. Cassava peel with particle size 4.00>p>3.35 mm and an initial moisture content of 60%, at pH 6.0 and an incubation temperature of 30 °C, with ammonium sulfate (10 g N/kg substrate), was most suitable for protein enrichment with Trichoderma viride. Crude protein increased from 4.21% in unfermented cassava peel samples to 10.43% in fermented samples.
Keywords: Cassava peel, Solid state fermentation, Trichoderma viride, Total sugars, True protein.
696 Advanced Hybrid Particle Swarm Optimization for Congestion and Power Loss Reduction in Distribution Networks with High Distributed Generation Penetration through Network Reconfiguration
Authors: C. Iraklis, G. Evmiridis, A. Iraklis
Abstract:
Renewable energy sources and distributed power generation units already play an important role in electrical power generation. A mixture of different technologies penetrating the electrical grid adds complexity to the management of distribution networks. High penetration of distributed power generation units creates node over-voltages, large power losses, unreliable power management, reverse power flow, and congestion. This paper presents an optimization algorithm capable of reducing congestion and power losses, both described through a weighted-sum objective function, and two factors that describe congestion are proposed. An upgraded selective particle swarm optimization (SPSO) algorithm is used as the solution tool, focusing on the technique of network reconfiguration. The upgrade of the SPSO algorithm is achieved by adding a heuristic algorithm specializing in the reduction of power losses, and several scenarios are tested. Results show significant improvement in the minimization of losses and congestion while achieving very small calculation times.
Keywords: Congestion, distribution networks, loss reduction, particle swarm optimization, smart grid.
695 Optimal Location of Multi Type Facts Devices for Multiple Contingencies Using Particle Swarm Optimization
Authors: S. Sutha, N. Kamaraj
Abstract:
In a deregulated operating regime, power system security is an issue that needs due attention from researchers, given the unbundling of generation and transmission. Electric power systems are exposed to various contingencies, and network contingencies often contribute to the overloading of branches and the violation of voltage limits, leading to security and stability problems. To maintain the security of the system, it is desirable to estimate the effect of contingencies so that pertinent control measures can be taken to improve system security. This paper presents the application of a particle swarm optimization algorithm to find the optimal locations of multiple types of FACTS devices in a power system in order to eliminate or alleviate line overloads. The optimization is performed on the parameters, namely the locations of the devices, their types, their settings, and the installation cost of the FACTS devices, for single and multiple contingencies. TCSC, SVC, and UPFC are considered and modeled for steady-state analysis. Suitable locations for the UPFC and TCSC are selected using criteria based on improved system security. The effectiveness of the proposed method is tested on the IEEE 6-bus and IEEE 30-bus test systems.
Keywords: Contingency Severity Index, Particle Swarm Optimization, Performance Index, Static Security Assessment.
694 Chemical Characterization of Submicron Aerosol in Kanpur Region: a Source Apportionment Study
Authors: A. Chakraborty, T. Gupta
Abstract:
Several studies have shown the association between ambient particulate matter (PM) and adverse health effects and climate change, thus highlighting the need to limit the anthropogenic sources of PM. PM exposure is commonly monitored as the mass concentration of PM10 (particle aerodynamic diameter < 10 μm) or PM2.5 (particle aerodynamic diameter < 2.5 μm), although increasing toxicity with decreasing aerodynamic diameter has been reported, due to increased surface area and enhanced chemical reactivity with other species. Additionally, the light-scattering properties of PM increase with decreasing size. Hence, it is important to chemically characterize the finer fraction of the particulate matter and to identify its sources, so that it can be controlled, to a large extent, at the sources before reaching the receptors.
Keywords: PM1, PCA, source apportionment.
693 Certain Data Dimension Reduction Techniques for application with ANN based MCS for Study of High Energy Shower
Authors: Gitanjali Devi, Kandarpa Kumar Sarma, Pranayee Datta, Anjana Kakoti Mahanta
Abstract:
Cosmic showers, from their places of origin in space, generate secondary particles called Extensive Air Showers (EAS) after entering the Earth's atmosphere. Detection and analysis of EAS and similar high energy particle showers involve a plethora of experimental setups with certain constraints, for which soft-computational tools like Artificial Neural Networks (ANNs) can be adopted. The optimality of ANN classifiers can be further enhanced by the use of a Multiple Classifier System (MCS) and certain data dimension reduction techniques. This work describes the performance of data dimension reduction techniques such as Principal Component Analysis (PCA), Independent Component Analysis (ICA), and Self Organizing Map (SOM) approximators for application with an MCS formed using a Multi Layer Perceptron (MLP), a Recurrent Neural Network (RNN), and a Probabilistic Neural Network (PNN). The data inputs are obtained from an array of detectors placed in a circular arrangement resembling a practical detector grid; these inputs have a high dimension and a strong correlation among themselves. The PCA, ICA, and SOM blocks reduce the correlation and generate a form suitable for real-time practical applications for the prediction of the primary energy and location of the EAS from density values captured using the detectors in the circular grid.
Keywords: EAS, Shower, Core, ANN, Location.
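A minimal sketch of the PCA block alone, reducing correlated detector-density inputs to a few decorrelated components before a classifier/regressor stage; the detector count, synthetic data, and 95% variance threshold are assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_events, n_detectors = 500, 64                # e.g. 64 detectors in a circular grid (hypothetical)
latent = rng.standard_normal((n_events, 4))    # a few underlying shower parameters
mixing = rng.standard_normal((4, n_detectors))
densities = latent @ mixing + 0.1 * rng.standard_normal((n_events, n_detectors))

X = densities - densities.mean(axis=0)         # center each detector channel
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)
k = int(np.searchsorted(np.cumsum(explained), 0.95) + 1)   # keep ~95% of the variance
scores = X @ Vt[:k].T                          # reduced, decorrelated features for the ANN/MCS stage
print(f"kept {k} of {n_detectors} dimensions")
```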
692 An Erosion-based Modeling of Abrasive Waterjet Turning
Authors: I. Zohourkari, M. Zohoor
Abstract:
In this paper, an erosion-based model for the abrasive waterjet (AWJ) turning process is presented. Using a modified Hashish erosion model, the volume of material removed by abrasive particles impacting the surface of the rotating cylindrical specimen is estimated, and the radius reduction at each rotation is calculated. In contrast to previous works, the proposed model considers the continuous change in local impact angle due to the change in workpiece diameter, the axial traverse rate of the jet, and the abrasive particle roundness and density. The accuracy of the proposed model is examined by experimental tests under various traverse rates. The final diameters estimated by the proposed model are in good agreement with the experiments.
Keywords: Abrasive, Erosion, Impact, Particle, Waterjet, Turning.
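A structural sketch of the per-rotation radius update described above; the erosion-rate and impact-angle expressions used here are hypothetical placeholders (the paper's modified Hashish model is not reproduced), so only the iteration structure is meaningful.

```python
import math

# Placeholder erosion law (NOT the modified Hashish model): volume removed per rotation.
def removed_volume_per_rotation(radius, impact_angle, erosion_coeff):
    return erosion_coeff * radius * math.sin(impact_angle)

def turn_workpiece(r0, jet_offset, feed_per_rotation, length, erosion_coeff):
    """March the jet axially; at each rotation, convert the removed volume to a radius drop."""
    radius, z, profile = r0, 0.0, []
    while z < length and radius > 0.0:
        # Hypothetical geometric relation: the local impact angle changes as the diameter shrinks.
        impact_angle = math.atan2(jet_offset, radius)
        dV = removed_volume_per_rotation(radius, impact_angle, erosion_coeff)
        # Spread the removed volume over the ring cut during one rotation.
        d_radius = dV / (2.0 * math.pi * radius * feed_per_rotation)
        radius = max(radius - d_radius, 0.0)
        profile.append((z, radius))
        z += feed_per_rotation
    return profile

final_profile = turn_workpiece(r0=10.0, jet_offset=2.0, feed_per_rotation=0.2,
                               length=20.0, erosion_coeff=1e-3)
```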
691 SPH Method used for Flow Predictions at a Turgo Impulse Turbine: Comparison with Fluent
Authors: Phoevos K. Koukouvinis, John S. Anagnostopoulos, Dimitris E. Papantonis
Abstract:
This work is an attempt to use the standard Smoothed Particle Hydrodynamics methodology for the simulation of the complex, unsteady, free-surface flow in a rotating Turgo impulse water turbine. A comparison of two different geometries was conducted. The SPH method, due to its mesh-less nature, is capable of capturing the flow features appearing in the turbine without diffusion at the water/air interface. Furthermore, the results are compared with a commercial CFD package (Fluent®), and the SPH algorithm proves capable of providing similar results in much less time than the mesh-based CFD program. A parametric study was also performed regarding the turbine inlet angle.
Keywords: Smoothed Particle Hydrodynamics, Mesh-less methods, Impulse turbines, Turgo turbine.
690 Effects of Particle Size Distribution of Binders on the Performance of Slag-Limestone Ternary Cement
Authors: Zhuomin Zou, Thijs Van Landeghem, Elke Gruyaert
Abstract:
Using supplementary cementitious materials, such as ground granulated blast-furnace slag (GGBFS) and limestone, to replace Portland cement (PC) is a promising method to reduce the carbon emissions from cement production. To use GGBFS and limestone efficiently, it is necessary to carefully select the particle size distribution (PSD) of the binders. This study investigated the effects of the PSD of the binders on the performance of slag-limestone ternary cement. Based on the PSD parameters of the binders, three types of ternary cement with a similar overall PSD were designed: No. 1 with fine GGBFS, medium PC, and coarse limestone; No. 2 with fine limestone, medium PC, and coarse GGBFS; and No. 3 with fine PC, medium GGBFS, and coarse limestone. The binder contents in the ternary cements were 50% PC, 40% slag, and 10% limestone. The mortar performance of the three ternary cements was investigated in terms of flow table value, strength at 28 days, carbonation resistance, and non-steady-state chloride migration resistance at 28 days. Results show that the ternary cement with fine limestone (No. 2) has the weakest performance among the three. The ternary cement with fine slag (No. 1) shows overall comparable performance to the ternary cement with fine PC (No. 3). Moreover, the chloride migration coefficient of the ternary cement with fine slag (No. 1) is significantly lower than that of the other two ternary cements.
Keywords: Limestone, particle size distribution, slag, ternary cement.
689 Constrained Particle Swarm Optimization of Supply Chains
Authors: András Király, Tamás Varga, János Abonyi
Abstract:
Since supply chains highly impact the financial performance of companies, it is important to optimize and analyze their Key Performance Indicators (KPIs). The synergistic combination of Particle Swarm Optimization (PSO) and Monte Carlo simulation is applied to determine the optimal reorder points of warehouses in supply chains. The goal of the optimization is the minimization of an objective function calculated as the linear combination of holding and ordering costs, while the required service levels of the warehouses represent non-linear constraints in the PSO. The results illustrate that the developed stochastic simulator and optimization tool is flexible enough to handle complex situations.
Keywords: stochastic processes, empirical distributions, Monte Carlo simulation, PSO, supply chain management.
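A minimal sketch of the Monte Carlo evaluation step, assuming a simple (r, Q) policy with Poisson daily demand: it returns the expected holding-plus-ordering cost and the achieved service level for a candidate reorder point r, which a PSO could then search over subject to the service-level constraint. All parameters and the demand model are illustrative, not taken from the paper.

```python
import numpy as np

def evaluate_reorder_point(r, Q=400, days=365, n_runs=200, lead_time=5,
                           mean_demand=50, holding_cost=0.1, order_cost=80, seed=0):
    """Monte Carlo estimate of cost and fill rate for reorder point r under an (r, Q) policy."""
    rng = np.random.default_rng(seed)
    costs, fill_rates = [], []
    for _ in range(n_runs):
        inv, pipeline, cost, met, total = r + Q, [], 0.0, 0.0, 0.0
        for _ in range(days):
            pipeline = [(t - 1, q) for t, q in pipeline]          # age outstanding orders
            inv += sum(q for t, q in pipeline if t <= 0)          # receive arrived orders
            pipeline = [(t, q) for t, q in pipeline if t > 0]
            d = rng.poisson(mean_demand)
            served = min(d, inv)
            inv -= served
            met += served; total += d
            if inv + sum(q for _, q in pipeline) <= r:            # inventory position hits r
                pipeline.append((lead_time, Q)); cost += order_cost
            cost += holding_cost * inv                            # daily holding cost
        costs.append(cost); fill_rates.append(met / total)
    return float(np.mean(costs)), float(np.mean(fill_rates))

expected_cost, service_level = evaluate_reorder_point(r=300)
```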
688 Coordinated Design of TCSC Controller and PSS Employing Particle Swarm Optimization Technique
Authors: Sidhartha Panda, N. P. Padhy
Abstract:
This paper investigates the application of the Particle Swarm Optimization (PSO) technique to the coordinated design of a Power System Stabilizer (PSS) and a Thyristor Controlled Series Compensator (TCSC)-based controller to enhance power system stability. The design of the PSS and TCSC-based controllers is formulated as a time-domain optimization problem, and the PSO algorithm is employed to search for the optimal controller parameters. By minimizing the time-domain objective function, which involves the deviation in the oscillatory rotor speed of the generator, the stability performance of the system is improved. To compare the capabilities of the PSS and the TCSC-based controller, both are first designed independently and then in a coordinated manner, for individual and coordinated application. The proposed controllers are tested on a weakly connected power system. Eigenvalue analysis and non-linear simulation results are presented to show the effectiveness of the coordinated design approach over the individual designs. The simulation results show that the proposed controllers are effective in damping low-frequency oscillations resulting from various small disturbances, such as changes in the mechanical power input and the reference voltage setting.
Keywords: Particle swarm optimization, Phillips-Heffron model, power system stability, PSS, TCSC.
687 Spread Spectrum Code Estimation by Particle Swarm Algorithm
Authors: Vahid R. Asghari, Mehrdad Ardebilipour
Abstract:
In the context of spectrum surveillance, a new method to recover the code of a spread spectrum signal is presented, where the receiver has no knowledge of the transmitter's spreading sequence. In our previous paper, we used a Genetic Algorithm (GA) to recover the spreading code. Although genetic algorithms (GAs) are well known for their robustness in solving complex optimization problems, increasing the length of the code often leads to an unacceptably slow convergence speed. To solve this problem, we introduce Particle Swarm Optimization (PSO) into code estimation in spread spectrum communication systems. In the search process for code estimation, the PSO algorithm has the merits of rapid convergence to the global optimum without being trapped in local suboptima, and good robustness to noise. In this paper, we describe how to implement PSO as a component of a search algorithm for code estimation. Swarm intelligence offers a number of advantages due to the use of mobile agents, among them scalability, fault tolerance, adaptation, speed, modularity, autonomy, and parallelism. These properties make swarm intelligence very attractive for spread spectrum code estimation, and they also make it suitable for a variety of other kinds of channels. Our results compare swarm-based algorithms with genetic algorithms, and also show the performance of the PSO algorithm in the code estimation process.
Keywords: Code estimation, Particle Swarm Optimization (PSO), Spread spectrum.
686 Schrödinger Equation with Position-Dependent Mass: Staggered Mass Distributions
Authors: J. J. Peña, J. Morales, J. García-Ravelo, L. Arcos-Díaz
Abstract:
The point canonical transformation method is applied to solve the Schrödinger equation with a position-dependent mass. This class of problems has been solved for continuous mass distributions. In this work, a staggered mass distribution is proposed for the case of a free particle in an infinite square well potential. The continuity conditions as well as the normalization of the wave function are also considered. The proposal can be used for dealing with other kinds of staggered mass distributions in the Schrödinger equation with different quantum potentials.
Keywords: Free particle, point canonical transformation method, position-dependent mass, staggered mass distribution.
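For readers unfamiliar with the position-dependent-mass form, one common ordering of the kinetic term (the BenDaniel-Duke form) and the matching conditions relevant to a staggered (piecewise-constant) mass are shown below; the paper may use a different operator ordering.

```latex
% BenDaniel--Duke ordering of the position-dependent-mass kinetic term; inside each
% region of a staggered (piecewise-constant) mass it reduces to the constant-mass
% equation, with matching conditions imposed at the mass steps.
\[
  -\frac{\hbar^{2}}{2}\,\frac{d}{dx}\!\left(\frac{1}{m(x)}\,\frac{d\psi(x)}{dx}\right)
  + V(x)\,\psi(x) = E\,\psi(x),
\]
\[
  \text{continuity at a mass step } x_{0}:\qquad
  \psi(x_{0}^{-}) = \psi(x_{0}^{+}),\qquad
  \frac{1}{m_{-}}\,\psi'(x_{0}^{-}) = \frac{1}{m_{+}}\,\psi'(x_{0}^{+}).
\]
```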