Search results for: LMTD method
16643 Development and Total Error Concept Validation of Common Analytical Method for Quantification of All Residual Solvents Present in Amino Acids by Gas Chromatography-Head Space
Authors: A. Ramachandra Reddy, V. Murugan, Prema Kumari
Abstract:
Residual solvents in pharmaceutical samples are monitored using gas chromatography with headspace (GC-HS). Current regulatory and compendial requirements make the measurement of residual solvents mandatory for all release testing of active pharmaceutical ingredients (API). Generally, isopropyl alcohol is used as the residual solvent in proline and tryptophan; methanol in cysteine monohydrate hydrochloride, glycine, methionine and serine; ethanol in glycine and lysine monohydrate; and acetic acid in methionine. In order to have a single method for determining these residual solvents (isopropyl alcohol, ethanol, methanol and acetic acid) in all seven amino acids, a sensitive and simple method was developed using the gas chromatography headspace technique with flame ionization detection. During development, poor reproducibility, retention time variation and bad peak shape were observed for acetic acid peaks, caused by the reaction of acetic acid with the stationary phase (cyanopropyl dimethyl polysiloxane) of the column and by the dissociation of acetic acid in water (when used as diluent) during the temperature gradient. Dimethyl sulfoxide was therefore used as diluent to avoid these issues, whereas most of the methods published for acetic acid quantification by GC-HS use a derivatisation technique to protect acetic acid. As per compendia, a risk-based approach was selected as appropriate to determine the degree and extent of the validation process and to assure the fitness of the procedure. The total error concept was therefore selected to validate the analytical procedure. An accuracy profile of ±40% was selected for the lower level (quantitation limit level) and ±30% for the other levels, with a 95% confidence interval (5% risk profile). The method was developed using a DB-Waxetr column manufactured by Agilent (530 µm internal diameter, 2.0 µm film thickness, 30 m length). Helium was selected as carrier gas at a constant flow of 6.0 mL/min in constant make-up mode.
The present method is simple, rapid, and accurate, and is suitable for rapid analysis of isopropyl alcohol, ethanol, methanol and acetic acid in amino acids. The range of the method is 50 ppm to 200 ppm for isopropyl alcohol, 50 ppm to 3000 ppm for ethanol, 50 ppm to 400 ppm for methanol and 100 ppm to 400 ppm for acetic acid, which covers the specification limits provided in the European Pharmacopoeia. The accuracy profile and risk profile generated as part of validation were found to be satisfactory. Therefore, this method can be used for testing of residual solvents in amino acid drug substances.
Keywords: amino acid, head space, gas chromatography, total error
Procedia PDF Downloads 147
16642 Stochastic Simulation of Random Numbers Using Linear Congruential Method
Authors: Melvin Ballera, Aldrich Olivar, Mary Soriano
Abstract:
Digital computers nowadays must have a utility capable of generating random numbers. Usually, computer-generated random numbers are not truly random: given predefined values such as a starting point and end points, the sequence is almost predictable. Random numbers have many applications, such as business simulation, manufacturing, the services domain, the entertainment sector and other areas, making it worthwhile to design a unique method that yields unpredictable random numbers. Applying stochastic simulation using the linear congruential algorithm shows that as the seed and range increase, the numbers randomly produced or selected by the computer become unique. If this is implemented in an environment where random numbers are very much needed, the reliability of the random numbers is guaranteed.
Keywords: stochastic simulation, random numbers, linear congruential algorithm, pseudorandomness
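The recurrence behind the linear congruential algorithm can be sketched in a few lines. This is a generic illustration, not the authors' implementation: the abstract does not give the paper's parameters, so the multiplier, increment and modulus below are the well-known Numerical Recipes constants, chosen only as an example.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    # Linear congruential generator: x_{n+1} = (a*x_n + c) mod m.
    # Constants are the Numerical Recipes parameters, used here for illustration.
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # scale to [0, 1)

gen = lcg(seed=42)
sample = [next(gen) for _ in range(5)]
```

The same seed always reproduces the same sequence, which is exactly the "predictability" the abstract refers to; varying the seed and modulus changes the stream.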
Procedia PDF Downloads 315
16641 Environmental Life Cycle Assessment of Two Technologic Scenario of Wind Turbine Blades Composition for an Optimized Wind Turbine Design Using the Impact 2002+ Method and Using 15 Environmental Impact Indicators
Authors: A. Jarrou, A. Iranzo, C. Nana
Abstract:
The rapid development of the onshore/offshore wind industry and the continuous, strong, and long-term support from governments have made it possible to create factories specializing in the manufacture of the different parts of wind turbines. In the literature, however, Life Cycle Assessment (LCA) analyses consider the wind turbine as a whole and do not allow the allocation of impacts to its different components. Here we propose to treat each part of the wind turbine as a system in its own right, which is more in line with the current production system. This article aims to assess the environmental impacts associated with 1 kg of wind turbine blades. In order to carry out a realistic and precise study, the different stages of the life cycle of a wind turbine installation are included in the study (manufacture, installation, use, maintenance, dismantling, and waste treatment). The Impact 2002+ method used makes it possible to assess 15 impact indicators (human toxicity, terrestrial and aquatic ecotoxicity, climate change, land use, etc.). Finally, a sensitivity study is carried out to analyze the different types of uncertainties in the data collected.
Keywords: life cycle assessment, wind turbine, turbine blade, environmental impact
Procedia PDF Downloads 175
16640 A Novel PSO Based Decision Tree Classification
Authors: Ali Farzan
Abstract:
Classification of data objects or patterns is a major part of most decision making systems. One of the popular and commonly used classification methods is the Decision Tree (DT). It is a hierarchical decision making system in which a binary tree is constructed and, starting from the root, some of the classes are rejected at each node until a leaf node is reached. Each leaf node represents one specific class. Finding the splitting criterion at each node for constructing or training the tree is a major problem. Particle Swarm Optimization (PSO) has been adopted as a metaheuristic search method for finding the best splitting criterion. Results of evaluating the proposed method over benchmark datasets indicate the higher accuracy of the new PSO based decision tree.
Keywords: decision tree, particle swarm optimization, splitting criteria, metaheuristic
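The idea of searching for a splitting criterion with PSO can be illustrated on the simplest case: a single numeric feature with binary labels. This is a hypothetical one-dimensional sketch, not the authors' algorithm; a standard global-best PSO searches for the threshold that minimizes the weighted Gini impurity of the two child nodes, and all parameter values are illustrative.

```python
import random

def gini_after_split(xs, ys, t):
    # Weighted Gini impurity of the two children produced by threshold t
    # (binary labels 0/1 assumed for brevity).
    left = [y for x, y in zip(xs, ys) if x <= t]
    right = [y for x, y in zip(xs, ys) if x > t]
    def gini(labels):
        if not labels:
            return 0.0
        p = sum(labels) / len(labels)
        return 2.0 * p * (1.0 - p)
    n = len(ys)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

def pso_threshold(xs, ys, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    # Standard global-best PSO over a single scalar decision variable.
    lo, hi = min(xs), max(xs)
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]
    pbest_val = [gini_after_split(xs, ys, p) for p in pos]
    gbest = min(zip(pbest_val, pbest))[1]
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = random.random(), random.random()
            vel[i] = (w * vel[i] + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)  # keep inside feature range
            val = gini_after_split(xs, ys, pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], val
        gbest = min(zip(pbest_val, pbest))[1]
    return gbest
```

In a full decision tree this search would run once per node, over all features; the sketch shows only the PSO-as-splitting-criterion-search idea.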
Procedia PDF Downloads 405
16639 Exact Solutions of a Nonlinear Schrodinger Equation with Kerr Law Nonlinearity
Authors: Muna Alghabshi, Edmana Krishnan
Abstract:
A nonlinear Schrodinger equation has been considered for solving by mapping methods in terms of Jacobi elliptic functions (JEFs). The equation under consideration has a linear evolution term, linear and nonlinear dispersion terms, the Kerr law nonlinearity term and three terms representing the contribution of metamaterials. This equation, which has applications in optical fibers, is found to have soliton solutions, shock wave solutions, and singular wave solutions when the modulus of the JEFs approaches 1, which is the infinite-period limit. The equation with special values of the parameters has also been solved using the tanh method.
Keywords: Jacobi elliptic function, mapping methods, nonlinear Schrodinger equation, tanh method
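For readers unfamiliar with the tanh method mentioned above, a sketch of its standard ansatz follows. The generic Kerr-law NLS form and the coefficient names are assumptions for illustration, since the abstract does not reproduce the paper's equation (which also carries metamaterial terms).

```latex
% Generic Kerr-law NLS assumed for illustration:
%   i q_t + a q_{xx} + b |q|^2 q = 0
% Traveling-wave reduction:
q(x,t) = U(\xi)\, e^{i(-\kappa x + \omega t)}, \qquad \xi = k(x - vt)
% Finite tanh expansion: Y = \tanh(\xi) satisfies dY/d\xi = 1 - Y^2,
% so every derivative of U remains a polynomial in Y:
U(\xi) = \sum_{j=0}^{N} a_j Y^j, \qquad Y = \tanh(\xi)
% Balancing the highest derivative U'' (degree N+2 in Y) against the
% nonlinearity |U|^2 U (degree 3N) gives N = 1; collecting powers of Y
% yields algebraic equations for a_0, a_1, k, v, leading to the
% kink/shock-type profile
q(x,t) = a_1 \tanh\!\big(k(x - vt)\big)\, e^{i(-\kappa x + \omega t)}
```

The JEF mapping method generalizes this by expanding in cn, sn or dn instead of tanh; the tanh solutions are recovered as the modulus approaches 1.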
Procedia PDF Downloads 312
16638 Fast Algorithm to Determine Initial Tsunami Wave Shape at Source
Authors: Alexander P. Vazhenin, Mikhail M. Lavrentiev, Alexey A. Romanenko, Pavel V. Tatarintsev
Abstract:
One of the problems obstructing effective tsunami modelling is the lack of information about the initial wave shape at the source. The existing methods (geological, sea radars, satellite images) contain a significant degree of uncertainty. Therefore, direct measurements of tsunami waves obtained by deep-water bottom pressure recorders are also used. In this paper we propose a new method to reconstruct the initial sea surface displacement at the tsunami source by approximating the measured signal (marigram) with a linear combination of synthetic marigrams from a selected set of unit sources, calculated in advance. This method has demonstrated good precision and very high performance. The mathematical model and results of numerical tests are described here.
Keywords: numerical tests, orthogonal decomposition, tsunami initial sea surface displacement
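Approximating a measured marigram by a linear combination of precomputed unit-source marigrams is, at its core, a linear least-squares problem. The following is a toy sketch with synthetic data; the matrix G of unit-source marigrams and the source weights are invented for illustration and have no connection to real bathymetry or recorder data.

```python
import numpy as np

# Toy sketch: each column of G is a synthetic marigram computed in advance
# for one unit source; the measured marigram is approximated as G @ c, and
# the coefficient vector c describes the initial sea surface displacement.
rng = np.random.default_rng(0)
n_samples, n_sources = 200, 5
G = rng.standard_normal((n_samples, n_sources))  # precomputed unit marigrams
c_true = np.array([0.5, 0.0, 1.2, -0.3, 0.8])    # hypothetical source weights
measured = G @ c_true                            # noiseless "measured" signal

# Least-squares recovery of the source weights from the measured marigram.
c_hat, *_ = np.linalg.lstsq(G, measured, rcond=None)
```

With precomputed unit marigrams the reconstruction reduces to one small least-squares solve, which is consistent with the very high performance the abstract reports.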
Procedia PDF Downloads 467
16637 Improved Accuracy of Ratio Multiple Valuation
Authors: Julianto Agung Saputro, Jogiyanto Hartono
Abstract:
Multiple valuation is widely used by investors and practitioners, but its accuracy is questionable. Multiple valuation inaccuracies are due to the unreliability of information used in valuation, inaccurate comparison group selection, and the use of individual multiple values. This study investigated the accuracy of valuation to examine factors that can increase the accuracy of multiple ratio valuation: discretionary accruals, the comparison group, and the composite of multiple valuations. The results indicate that the multiple value adjustment method with discretionary accruals provides better accuracy, and that the industry comparator group method combined with company size and growth also provides better accuracy. A composite of individual multiple valuations gives the best accuracy. If all of these factors are combined, the accuracy of multiple ratio valuation gives the best results.
Keywords: multiple, valuation, composite, accuracy
Procedia PDF Downloads 279
16636 Understanding Chances and Challenges of Family Planning: Qualitative Study in Indonesia's Banyumas District
Authors: Utsamani Cintyamena, Sandra Frans Olivia, Shita Lisyadewi, Ariane Utomo
Abstract:
Family planning is one of the fundamental aspects of preventing maternal morbidity and mortality. However, the prevalence rate of contraception use among Indonesia's married women is low. The purpose of this study is to assess opportunities and challenges in family planning. Methodology: We conducted a qualitative study in Banyumas District, which saw a large reduction in its maternal mortality rate from 2013 to 2015. Four focus group discussions and four small group discussions were conducted to assess the knowledge and attitudes of women in using contraceptives and their method of choice, with in-depth interviews of four health workers and two family planning field officers for triangulation. Thematic content analysis was done manually. Results: Key themes emerged across interviews, including (1) the first choice of contraception was the one they had previously used, provided that they had not encountered problems with it, (2) rumor and fear of side effects affected their method of choice, (3) the selection of contraceptive method was influenced by the husband's approval, beliefs, and role models in the community. Conclusion: Collaboration between health workers, family planning field officers, and the community, as well as support from stakeholders, must be increased to promote family planning.
Keywords: attitude, challenge, chance, family planning, knowledge
Procedia PDF Downloads 146
16635 Stability Design by Geometrical Nonlinear Analysis Using Equivalent Geometric Imperfections
Authors: S. Fominow, C. Dobert
Abstract:
The present article describes research dealing with the development of equivalent geometric imperfections for the stability design of steel members prone to lateral-torsional buckling. The application of these equivalent imperfections takes into account the stiffness-reducing effects of inelasticity and residual stresses, which lead to a reduction of the load carrying capacity of slender members and structures. This allows the application of a simplified design method performed in three steps: application of equivalent geometric imperfections, determination of internal forces using geometrical non-linear analysis (GNIA), and verification of the cross-section resistance at the most unfavourable location. All three verification steps are closely related and influence the results. The derivation of the equivalent imperfections was carried out in several steps. First, reference lateral-torsional buckling resistances for various rolled I-sections, slenderness grades, load shapes and steel grades were determined, either with geometric and material non-linear analysis with geometrical imperfections and residual stresses (GMNIA) or, for standard cases, based on the equivalent member method. With the aim of obtaining, from the design method, lateral-torsional buckling resistances identical to these reference resistances, the required sizes of the equivalent imperfections were derived. For this purpose, an FEM-based program was developed. Based on these results, several proposals for the specification of equivalent geometric imperfections have been developed. These differ in the shape of the applied equivalent geometric imperfection, the model of the cross-sectional resistance and the steel grade.
The proposed design methods allow a wide range of applications and a reliable calculation of lateral-torsional buckling resistances, as comparisons between the calculated resistances and the reference resistances have shown.
Keywords: equivalent geometric imperfections, GMNIA, lateral-torsional buckling, non-linear finite element analysis
Procedia PDF Downloads 154
16634 Contrasting The Water Consumption Estimation Methods
Authors: Etienne Alain Feukeu, L. W. Snyman
Abstract:
Water scarcity is becoming a real issue nowadays, and most countries in the world are facing it in their own way based on their own geographical location and conditions. Many countries face a growing water demand as a result not only of an increased population and economic growth, but also of the pressure of population dynamics and urbanization. To mitigate some of these related problems, an accurate method of water estimation and future prediction (forecast) is essential to guarantee not only a sufficient quantity but also good water distribution and management. Despite the fact that several works have been undertaken to address this concern, there is still considerable disparity between the different methods and standards used for water prediction and estimation. Hence this work contrasts and compares two well-defined and established methods from two countries (USA and South Africa) to demonstrate the inconsistency that arises when different methods and standards are used interchangeably.
Keywords: water scarcity, water estimation, water prediction, water forecast
Procedia PDF Downloads 200
16633 Ultra-Fast Growth of ZnO Nanorods from Aqueous Solution: Technology and Applications
Authors: Bartlomiej S. Witkowski, Lukasz Wachnicki, Sylwia Gieraltowska, Rafal Pietruszka, Marek Godlewski
Abstract:
Zinc oxide is an extensively studied II-VI semiconductor with a direct energy gap of about 3.37 eV at room temperature and high transparency in the visible spectral region. Due to these properties, ZnO is an attractive material for applications in photovoltaic, electronic and optoelectronic devices. ZnO nanorods, owing to their well-developed surface, have potential applications in sensor technology and photovoltaics. In this work we present a new, inexpensive method for the ultra-fast growth of ZnO nanorods from aqueous solution. This environmentally friendly and fully reproducible method allows growth of nanorods within a few minutes on various substrates, without any catalyst or complexing agent. The growth temperature does not exceed 50ºC and growth can be performed at atmospheric pressure. The method is characterized by simplicity and allows the size of the ZnO nanorods to be regulated over a wide range. Moreover, the method is very safe, requiring organic, non-toxic and low-price precursors. Growth can be performed on almost any type of substrate through homo-nucleation as well as hetero-nucleation. The resulting nanorods are of very high quality: they are monocrystalline, as confirmed by XRD and transmission electron microscopy, and, importantly, oxygen vacancies are not found in photoluminescence measurements. Our first results for ZnO nanorods in sensor applications are very promising. A resistive UV sensor based on ZnO nanorods grown on a quartz substrate shows a high sensitivity of 20 mW/m² (2 μW/cm²) for point contacts, especially as the results were obtained for a nanorod array, not for a single nanorod. UV light (wavelength below 400 nm) generates electron-hole pairs, which removes water vapor and hydroxyl groups from the surfaces. This reduces the depletion layer in the nanorods and thus lowers the resistance of the structure.
The so-obtained sensor works at room temperature and does not need annealing to reset to its initial state. Details of the technology and the first sensor results will be presented. The obtained ZnO nanorods are also applied in simple-architecture photovoltaic cells (efficiency over 12%) in conjunction with low-price Si substrates, and in high-sensitivity photoresistors. Detailed information about the technology and applications will be presented.
Keywords: hydrothermal method, photoresistor, photovoltaic cells, ZnO nanorods
Procedia PDF Downloads 431
16632 Metropolis-Hastings Sampling Approach for High Dimensional Testing Methods of Autonomous Vehicles
Authors: Nacer Eddine Chelbi, Ayet Bagane, Annie Saleh, Claude Sauvageau, Denis Gingras
Abstract:
As recently stated by the National Highway Traffic Safety Administration (NHTSA), to demonstrate the expected performance of a highly automated vehicle system, test approaches should include a combination of simulation, test track, and on-road testing. In this paper, we propose a new validation method for autonomous vehicles involving on-road tests (Field Operational Tests), test track (Test Matrix) and simulation (Worst Case Scenarios). We concentrate our discussion on the simulation aspects; in particular, we extend recent work based on importance sampling by using a Metropolis-Hastings algorithm (MHS) to sample collected data from the Safety Pilot Model Deployment (SPMD) in lane-change scenarios. Our proposed MH sampling method will be compared to the importance sampling method, which does not perform well in high-dimensional problems. The importance of this study is to obtain a sampler that could be applied to high-dimensional simulation problems in order to reduce and optimize the number of test scenarios necessary for the validation and certification of autonomous vehicles.
Keywords: automated driving, autonomous emergency braking (AEB), autonomous vehicles, certification, evaluation, importance sampling, Metropolis-Hastings sampling, tests
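A minimal random-walk Metropolis-Hastings sampler illustrates the mechanism referred to above. This is a generic one-dimensional sketch targeting a standard normal density, not the authors' sampler over SPMD lane-change data; the step size and seed are arbitrary choices.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=0.5, seed=0):
    # Random-walk Metropolis-Hastings: propose x' = x + N(0, step) and
    # accept with probability min(1, target(x') / target(x)).
    rng = random.Random(seed)
    x = x0
    lp = log_target(x)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_target(prop)
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):  # acceptance test
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Illustrative target: standard normal, log-density up to an additive constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
```

Because only density ratios appear in the acceptance test, the target needs to be known only up to a normalizing constant, which is what makes MH usable for sampling empirical scenario distributions in high dimensions.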
Procedia PDF Downloads 286
16631 Particle Filter Supported with the Neural Network for Aircraft Tracking Based on Kernel and Active Contour
Authors: Mohammad Izadkhah, Mojtaba Hoseini, Alireza Khalili Tehrani
Abstract:
In this paper we present a new method for tracking flying targets in color video sequences based on contour and kernel information. The aim of this work is to overcome the problem of losing the target under changing light, large displacement, changing speed, and occlusion. The proposed method consists of three steps: estimating the target location by a particle filter, segmenting the target region using a neural network, and finding the exact contours by the greedy snake algorithm. We use both region and contour information to create the target candidate model, and this model is dynamically updated during tracking. To avoid the accumulation of errors when updating, the target region is given to a perceptron neural network to separate the target from the background. Its output is then used for the exact calculation of the size and center of the target, and as the initial contour for the greedy snake algorithm to find the target's exact edge. The proposed algorithm has been tested on a database containing many challenges, such as the high speed and agility of aircraft, background clutter, occlusions, and camera movement. The experimental results show that the use of the neural network increases the accuracy of tracking and segmentation.
Keywords: video tracking, particle filter, greedy snake, neural network
Procedia PDF Downloads 340
16630 Development of a Multi-Locus DNA Metabarcoding Method for Endangered Animal Species Identification
Authors: Meimei Shi
Abstract:
Objectives: The identification of endangered species, especially the simultaneous detection of multiple species in complex samples, plays a critical role in alleged wildlife crime incidents and prevents illegal trade. The aim of this study was to develop a multi-locus DNA metabarcoding method for endangered animal species identification. Methods: Several pairs of universal primers were designed according to conserved mitochondrial gene regions. Experimental mixtures were artificially prepared by mixing well-defined species, including endangered species, e.g., forest musk deer, bear, tiger, pangolin, and sika deer. The artificial samples were prepared with 1-16 well-characterized species at 1% to 100% DNA concentration. After multiplex-PCR amplification and parameter modification, the amplified products were analyzed by capillary electrophoresis and used for NGS library preparation. The DNA metabarcoding was carried out based on Illumina MiSeq amplicon sequencing. The data were processed with quality trimming, read filtering, and OTU clustering; representative sequences were blasted using BLASTn. Results: According to the parameter modification and multiplex-PCR amplification results, five primer sets targeting COI, Cytb, 12S, and 16S, respectively, were selected as the NGS library amplification primer panel. High-throughput sequencing data analysis showed that the established multi-locus DNA metabarcoding method was sensitive and could accurately identify all species in the artificial mixtures, including the endangered animal species Moschus berezovskii, Ursus thibetanus, Panthera tigris, Manis pentadactyla and Cervus nippon at 1% DNA concentration. In conclusion, the established species identification method provides technical support for customs and forensic scientists to prevent the illegal trade of endangered animals and their products.
Keywords: DNA metabarcoding, endangered animal species, mitochondrial nucleic acid, multi-locus
Procedia PDF Downloads 136
16629 A New Approach to Retrofit Steel Moment Resisting Frame Structures after Mainshock
Authors: Amir H. Farivarrad, Kiarash M. Dolatshahi
Abstract:
During earthquake events, aftershocks can significantly increase the probability of collapse of buildings, especially those damaged during the mainshock. In this paper, a practical approach is proposed for the seismic rehabilitation of mainshock-damaged buildings that can be easily implemented within a few days after the mainshock. To show the efficacy of the proposed method, a nine-story steel moment frame building designed to pre-Northridge codes is chosen as a case study. The collapse fragility curve under aftershocks is presented for both the retrofitted and non-retrofitted structures. Comparison of the collapse fragility curves shows that the proposed method is indeed applicable for reducing the seismic collapse risk.
Keywords: aftershock, collapse fragility curve, seismic rehabilitation, seismic retrofitting
Procedia PDF Downloads 431
16628 Improvement of Piezoresistive Pressure Sensor Accuracy by Means of Current Loop Circuit Using Optimal Digital Signal Processing
Authors: Peter A. L’vov, Roman S. Konovalov, Alexey A. L’vov
Abstract:
The paper presents an advanced digital modification of the conventional current loop circuit for piezoelectric pressure transducers. Optimal DSP algorithms, which process the current loop responses by the maximum likelihood method, are applied to diminish measurement errors. The loop circuit has additional advantages, such as the ability to operate with any type of resistance or reactance sensor, and a considerable increase in the accuracy and quality of measurements compared with AC bridges. The results obtained make it possible to replace high-accuracy and expensive measuring bridges with current loop circuits.
Keywords: current loop, maximum likelihood method, optimal digital signal processing, precise pressure measurement
Procedia PDF Downloads 527
16627 Biologically Inspired Small Infrared Target Detection Using Local Contrast Mechanisms
Authors: Tian Xia, Yuan Yan Tang
Abstract:
In order to obtain higher small target detection accuracy, this paper presents an effective algorithm inspired by the local contrast mechanism. The proposed method can enhance the target signal and suppress background clutter simultaneously. In the first stage, an enhanced image is obtained using the proposed Weighted Laplacian of Gaussian. In the second stage, an adaptive threshold is adopted to segment the target. Experimental results on two challenging image sequences show that the proposed method can detect bright and dark targets simultaneously and is not sensitive to the sea-sky line of the infrared image, so it is well suited for small infrared target detection.
Keywords: small target detection, local contrast, human vision system, Laplacian of Gaussian
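The two-stage structure described above (filter-based enhancement followed by an adaptive threshold) can be sketched as follows. This uses a plain Laplacian-of-Gaussian kernel and a mean-plus-k-sigma threshold as stand-ins; the paper's specific weighting scheme is not reproduced here, and the kernel size, sigma and k are illustrative assumptions.

```python
import numpy as np

def log_kernel(size=9, sigma=1.5):
    # Plain Laplacian-of-Gaussian kernel; made zero-mean so that a flat
    # background yields exactly zero response.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx**2 + yy**2
    k = (r2 - 2 * sigma**2) / sigma**4 * np.exp(-r2 / (2 * sigma**2))
    return k - k.mean()

def detect_small_targets(image, k_thresh=4.0):
    # Stage 1: LoG filtering enhances blob-like targets against clutter.
    ker = log_kernel()
    pad = ker.shape[0] // 2
    padded = np.pad(image, pad, mode="reflect")
    h, w = image.shape
    resp = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # Negated so that a bright blob gives a positive response.
            resp[i, j] = -np.sum(padded[i:i + 2 * pad + 1, j:j + 2 * pad + 1] * ker)
    # Stage 2: adaptive threshold derived from the response statistics.
    thr = resp.mean() + k_thresh * resp.std()
    return resp > thr
```

Detecting dark targets as in the paper would additionally threshold the negative tail of the response; only the bright-target path is sketched here.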
Procedia PDF Downloads 467
16626 New HCI Design Process Education
Authors: Jongwan Kim
Abstract:
Human Computer Interaction (HCI) is a subject covering the study, planning, and design of interactions between humans and computers. The prevalent use of digital mobile devices is increasing the need for education and research on HCI. This work focuses on a new education method that incorporates role-changing brainstorming techniques during the HCI design process, geared towards reducing errors while developing application programs. The proposed method was applied to a capstone design course in the last spring semester. Students discovered examples of UI design improvements, and their ability to discover and reduce errors was promoted. A UI design improvement, PC voice control for people with disabilities as an assistive technology exemplar, will be presented. The improvement of these students' design ability will be helpful in real field work.
Keywords: HCI, design process, error reducing education, role-changing brainstorming, assistive technology
Procedia PDF Downloads 489
16625 Cyclic Etching Process Using Inductively Coupled Plasma for Polycrystalline Diamond on AlGaN/GaN Heterostructure
Authors: Haolun Sun, Ping Wang, Mei Wu, Meng Zhang, Bin Hou, Ling Yang, Xiaohua Ma, Yue Hao
Abstract:
Gallium nitride (GaN) is an attractive material for next-generation power devices, but the performance of GaN-based high electron mobility transistors (HEMTs) is often limited by the self-heating effect. In response to this problem, integrating devices with polycrystalline diamond (PCD) has been demonstrated to be an efficient way to alleviate the self-heating issue of GaN-based HEMTs. Among all the heat-spreading schemes, using PCD to cap the epitaxial layer before the HEMT process is one of the most effective. Currently, the mainstream method of fabricating PCD-capped HEMTs is to deposit the diamond heat-spreading layer on the AlGaN surface, which is covered by a thin nucleation dielectric/passivation layer. To achieve patterned etching of the diamond heat spreader and device preparation, we selected SiN, deposited by plasma-enhanced chemical vapor deposition (PECVD), as the hard mask for diamond etching. The conventional diamond etching method first uses F-based etching to remove the SiN from the window region, followed by O₂/Ar plasma etching of the diamond. However, scanning electron microscopy (SEM) and focused ion beam microscopy (FIB) results show numerous diamond pillars on the etched diamond surface. Through our study, we found that this is caused by the high roughness of the diamond surface and the overlap between diamond grains, which makes the etching of the SiN hard mask insufficient and leaves micro-masks on the diamond surface. Thus, a cyclic etching method was proposed to solve the problem of the residual SiN left by the F-based etching. The first step of each cycle uses F-based etching to remove the SiN hard mask in the specified region; then O₂/Ar plasma is introduced to etch the diamond in the corresponding region. These two etching steps constitute one cycle.
After the first cycle, we continued the cyclic etching to clear the pillars, with F-based etching removing the residual SiN and O₂/Ar plasma etching the diamond. Whether to perform another etching cycle depends on whether SiN micro-masks remain. Using this method, we eventually achieved self-terminating etching of the diamond and a smooth surface after etching. These results demonstrate that the cyclic etching method can be successfully applied to the integrated preparation of polycrystalline diamond thin films and GaN HEMTs.
Keywords: AlGaN/GaN heterojunction, O₂/Ar plasma, cyclic etching, polycrystalline diamond
Procedia PDF Downloads 133
16624 Elicitation Methods of Requirements Gathering in Shopping Mobile Application Development
Authors: Xiao Yihong, Li Zhixuan, Wong Kah Seng, Shen Xingcang
Abstract:
Requirement elicitation is one of the important factors in developing any new application, and many systems fail simply because of wrong elicitation practice. As a result, developers choose different methods in different fields to achieve optimal results. This paper analyses four cases to understand the effectiveness of different requirement elicitation methods in the field of mobile shopping applications. The elicitation methods we studied included interviews, questionnaires, prototypes, analysis of existing systems, focus groups, and brainstorming. The research and analysis results confirmed the need for a mixture of elicitation methods. Meanwhile, the methods adopted should be determined according to the scale of the project and be applied in a reasonable order to ensure highly efficient requirement elicitation.
Keywords: requirements elicitation method, shopping, mobile application, software requirement engineering
Procedia PDF Downloads 123
16623 Wind Turbine Control Performance Evaluation Based on Minimum-Variance Principles
Authors: Zheming Cao
Abstract:
Control loops are among the most important components in the wind turbine system: product quality, operational safety, and economic performance are directly or indirectly connected to the performance of the control systems. This paper proposes a performance evaluation method based on minimum-variance principles for wind turbine control systems. The method can be applied to the PID controller of the pitch control system in the wind turbine. The good performance result demonstrated in the paper was achieved by retuning and optimizing the controller settings based on the evaluation result. The concepts presented in this paper are illustrated with actual data from an industrial wind farm.
Keywords: control performance, evaluation, minimum-variance, wind turbine
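A common way to apply the minimum-variance principle is the Harris index: fit a linear d-step-ahead predictor to closed-loop output data, take the unpredictable residual variance as the minimum-variance bound, and compare it with the actual output variance. The sketch below is a generic illustration on synthetic data, not the paper's method; the lag count and delay value are assumptions.

```python
import numpy as np

def harris_index(y, delay, n_lags=10):
    # Fit a linear d-step-ahead predictor of y[t] from y[t-delay], ...,
    # y[t-delay-n_lags+1]; the residual variance is the minimum-variance
    # bound that no feedback controller with that delay can beat.
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    rows, targets = [], []
    for t in range(delay + n_lags, len(y)):
        rows.append(y[t - delay - n_lags + 1 : t - delay + 1])
        targets.append(y[t])
    X, z = np.array(rows), np.array(targets)
    coef, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ coef
    return float(np.var(resid) / np.var(y))  # 1.0 = minimum-variance control
```

An index close to 1 means the loop output is already unpredictable white noise (nothing left for retuning to recover); an index well below 1 flags the kind of improvement potential the paper exploits when retuning the pitch PID controller.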
Procedia PDF Downloads 369
16622 Iterative Dynamic Programming for 4D Flight Trajectory Optimization
Authors: Kawser Ahmed, K. Bousson, Milca F. Coelho
Abstract:
4D flight trajectory optimization is one of the key ingredients for improving flight efficiency and enhancing air traffic capacity in current air traffic management (ATM). The present paper explores iterative dynamic programming (IDP) as a potential numerical optimization method for 4D flight trajectory optimization. IDP is an iterative version of the dynamic programming (DP) method. Due to its numerical framework, DP is very well suited to nonlinear discrete dynamic systems. The 4D waypoint representation of the flight trajectory is similar to discretization by a grid system, so DP is a natural method for 4D flight trajectory optimization. However, the computational time and space complexity demanded by DP are enormous due to the immense number of grid points required to find the optimum, which prevents the use of DP in many practical high-dimension problems. On the other hand, IDP has shown potential to deal successfully with high-dimension optimal control problems even with a small number of grid points at each stage, which reduces the computational effort over the traditional DP approach. Although IDP has been applied successfully to chemical engineering problems, it is yet to be validated in 4D flight trajectory optimization. In this paper, IDP has been successfully used to generate a minimum-length 4D optimal trajectory that avoids any obstacle in its path, such as a no-fly zone, or residential areas when flying at low altitude to reduce noise pollution.
Keywords: 4D waypoint navigation, iterative dynamic programming, obstacle avoidance, trajectory optimization
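The grid-contraction idea at the heart of IDP can be shown on a toy control problem. The sketch below keeps only that idea: it enumerates candidate control sequences on a grid that contracts around the best sequence on each pass, whereas a full IDP would replace the exhaustive enumeration with a backward DP sweep over a state grid. The dynamics, cost function and contraction factor are invented for illustration.

```python
import itertools

def idp_like(n_stages=5, n_cand=5, iters=30, gamma=0.8):
    # Toy problem: scalar dynamics x_{k+1} = x_k + u_k from x_0 = 0,
    # cost = sum(u_k^2) + 100 * (x_N - 1)^2  (reach x = 1 with little effort).
    def cost(us):
        x = sum(us)
        return sum(u * u for u in us) + 100.0 * (x - 1.0) ** 2

    best = [0.0] * n_stages
    radius = 1.0
    for _ in range(iters):
        # Candidate grid for each stage, centred on the current best control
        # (with n_cand odd, the current best is always among the candidates,
        # so the cost never increases between passes).
        grids = [[b + radius * (2.0 * i / (n_cand - 1) - 1.0)
                  for i in range(n_cand)] for b in best]
        best = list(min(itertools.product(*grids), key=cost))
        radius *= gamma  # contract the search region each pass
    return best, cost(best)
```

The point IDP exploits is visible even here: a few candidate controls per stage suffice because the contracting region progressively refines the solution, instead of requiring one enormous grid as in classical DP.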
Procedia PDF Downloads 160
16621 EMI Radiation Prediction and Final Measurement Process Optimization by Neural Network
Authors: Hussam Elias, Ninovic Perez, Holger Hirsch
Abstract:
The adoption of EMC regulations worldwide is growing steadily as the use of electronics in our daily lives increases more than ever. In this paper, we introduce a novel method to perform the final phase of electromagnetic compatibility (EMC) measurement and to reduce the required test time according to the EN 55032 standard, using a developed tool and a conventional neural network (CNN). The neural network was trained using real EMC measurements, which were performed in the Semi-Anechoic Chamber (SAC) by CETECOM GmbH in Essen, Germany. To implement our proposed method, we wrote software that performs the radiated electromagnetic interference (EMI) measurements and uses the CNN to predict the turntable position that yields the maximum radiation value.
Keywords: conventional neural network, electromagnetic compatibility measurement, mean absolute error, position error
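The time-saving idea, sweeping the turntable coarsely and predicting where the worst-case emission lies before measuring finely there, can be sketched without the neural network. As a deliberately simpler stand-in for the CNN (all angles and level values below are illustrative, not CETECOM data), this snippet estimates the angle of maximum radiation by parabolic interpolation around the coarse maximum:

```python
def worst_case_angle(angles, levels):
    """Estimate the turntable angle of maximum emission from a coarse sweep
    by fitting a parabola through the highest sample and its neighbours."""
    i = max(range(len(levels)), key=levels.__getitem__)
    if i == 0 or i == len(levels) - 1:
        return angles[i]                      # peak at the sweep boundary
    y0, y1, y2 = levels[i - 1], levels[i], levels[i + 1]
    step = angles[1] - angles[0]              # assumes a uniform angular grid
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)   # vertex offset in steps
    return angles[i] + delta * step

angles = list(range(0, 360, 15))                        # coarse 15-degree sweep
levels = [40 - 0.002 * (a - 132) ** 2 for a in angles]  # synthetic dBuV/m data
estimate = worst_case_angle(angles, levels)
print(estimate)   # recovers 132.0 although no sample was taken exactly there
```

A learned model can replace the parabola when the emission pattern is not smooth, but the measurement-time saving comes from the same coarse-sweep-then-predict structure.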
Procedia PDF Downloads 199
16620 Controlled Shock Response Spectrum Test on Spacecraft Subsystem Using Electrodynamic Shaker
Authors: M. Madheswaran, A. R. Prashant, S. Ramakrishna, V. Ramesh Naidu, P. Govindan, P. Aravindakshan
Abstract:
Shock response spectrum (SRS) tests are among the tests conducted on some critical systems of a spacecraft as part of environmental testing. SRS tests are conducted to simulate the pyro shocks that occur during the launch phases as well as during deployment of spacecraft appendages. Methods for carrying out SRS tests include the pyrotechnic method, the impact hammer method, the drop shock method and the use of electrodynamic shakers. The pyrotechnic, impact hammer and drop shock methods are open-loop tests, whereas SRS testing using an electrodynamic shaker is a controlled, closed-loop test. SRS testing using an electrodynamic shaker offers various advantages, such as a simple test setup, better controllability and repeatability. However, it is important to devise a proper test methodology so that the safety of the electrodynamic shaker and that of the test specimen are not compromised. This paper discusses the challenges involved in conducting SRS tests, shaker validation and the necessary precautions to be taken. The approach involved in choosing various test parameters, such as the synthesis waveform and the spectrum convergence level, is discussed. A case study of an SRS test conducted on an optical payload of an Indian geostationary spacecraft is presented.
Keywords: maxi-max spectrum, SRS (shock response spectrum), SDOF (single degree of freedom), wavelet synthesis
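As background for how an SRS is defined: the maximax SRS value at each natural frequency is the peak absolute-acceleration response of a damped single-degree-of-freedom oscillator (typically Q = 10) driven through its base by the shock. A minimal numerical sketch, using a synthetic half-sine pulse rather than real pyro data:

```python
import math

def srs_point(fn, base_accel, dt, zeta=0.05):
    """Peak absolute acceleration of one SDOF oscillator (natural frequency
    fn in Hz, damping ratio zeta, i.e. Q = 10) excited through its base by
    the sampled acceleration history base_accel with time step dt."""
    wn = 2 * math.pi * fn
    z = v = peak = 0.0                 # relative displacement / velocity
    for a in base_accel:
        # semi-implicit Euler on  z'' + 2*zeta*wn*z' + wn^2*z = -a(t)
        v += (-a - 2 * zeta * wn * v - wn * wn * z) * dt
        z += v * dt
        abs_acc = -(2 * zeta * wn * v + wn * wn * z)   # mass acceleration
        peak = max(peak, abs(abs_acc))
    return peak

# 100 g, 11 ms half-sine pulse, padded with zeros for the residual response
dt, T, A = 1e-5, 0.011, 100.0
pulse = [A * math.sin(math.pi * i * dt / T) if i * dt <= T else 0.0
         for i in range(int(1.0 / dt))]
srs_low = srs_point(2.0, pulse, dt)      # well below the knee: attenuated
srs_mid = srs_point(0.8 / T, pulse, dt)  # near the knee: amplified
srs_high = srs_point(2000.0, pulse, dt)  # far above: follows the input
print(srs_low, srs_mid, srs_high)
```

Repeating `srs_point` over a log-spaced set of natural frequencies yields the full spectrum; the maximax character comes from taking the peak over both the pulse and the residual free vibration.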
Procedia PDF Downloads 358
16619 Development and Validation of a Rapid Turbidimetric Assay to Determine the Potency of Cefepime Hydrochloride in Powder Injectable Solution
Authors: Danilo F. Rodrigues, Hérida Regina N. Salgado
Abstract:
Introduction: The emergence of microorganisms resistant to a large number of clinically approved antimicrobials has been increasing, which restricts the options for the treatment of bacterial infections. As a strategy, drugs with high antimicrobial activity are in evidence. One such class of antimicrobials, the cephalosporins, stands out; its fourth-generation member cefepime (CEF) is a semi-synthetic product with activity against various aerobic Gram-positive bacteria (e.g. oxacillin-resistant Staphylococcus aureus) and Gram-negative bacteria (e.g. Pseudomonas aeruginosa). There are few studies in the literature on the development of microbiological methodologies for the analysis of this antimicrobial, so research in this area is highly relevant to optimize the analysis of this drug in industry and ensure the quality of the marketed product. The development of microbiological methods for the analysis of antimicrobials has gained strength in recent years and has been highlighted in relation to physicochemical methods, especially because they make it possible to determine the bioactivity of the drug against a microorganism. In this context, the aim of this work was the development and validation of a microbiological method for the quantitative analysis of CEF in lyophilized powder for injectable solution by turbidimetric assay. Method: Staphylococcus aureus ATCC 6538 IAL 2082 was used as the test microorganism, and the culture medium chosen was Casoy broth. The test was performed under temperature control (35.0 °C ± 2.0 °C) and incubated for 4 hours in a shaker. Readings were taken at a wavelength of 530 nm with a spectrophotometer. The turbidimetric microbiological method was validated by determining the following parameters: linearity, precision (repeatability and intermediate precision), accuracy and robustness, according to ICH guidelines.
Results and discussion: Among the parameters evaluated for method validation, linearity showed suitable results in both statistical analyses, with correlation coefficients (r) of 0.9990 for the CEF reference standard and 0.9997 for the CEF sample. Precision presented relative standard deviations of 1.86% (intraday), 0.84% (interday) and 0.71% (between analysts). The accuracy of the method was proven through the recovery test, in which the mean value obtained was 99.92%. Robustness was verified by varying the volume of culture medium, the brand of culture medium, the incubation time in the shaker and the wavelength. The potency of CEF in the samples of lyophilized powder for injectable solution was 102.46%. Conclusion: The proposed turbidimetric microbiological method for the quantification of CEF in lyophilized powder for injectable solution proved to be fast, linear, precise, accurate and robust, complying with all requirements, and can be used in the routine quality control of the pharmaceutical industry as an option for microbiological analysis.
Keywords: cefepime hydrochloride, quality control, turbidimetric assay, validation
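The validation figures above (r, relative standard deviation, mean recovery) follow from standard formulas that are easy to reproduce. A minimal sketch with hypothetical calibration and replicate data, not the study's raw measurements:

```python
import math

def pearson_r(x, y):
    """Correlation coefficient of a calibration line (linearity)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def rsd_percent(values):
    """Relative standard deviation in percent (precision)."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return 100 * sd / mean

def mean_recovery(found, added):
    """Mean recovery in percent (accuracy)."""
    return 100 * sum(f / a for f, a in zip(found, added)) / len(found)

# hypothetical data: concentration (% of working level) vs. turbidity reading
conc = [50, 75, 100, 125, 150]
reading = [0.251, 0.374, 0.502, 0.626, 0.749]
repeats = [100.2, 99.5, 100.8, 99.9, 100.4, 100.1]   # replicate potencies, %
r = pearson_r(conc, reading)
rsd = rsd_percent(repeats)
rec = mean_recovery(found=[80.1, 100.2, 119.8], added=[80.0, 100.0, 120.0])
print(round(r, 4), round(rsd, 2), round(rec, 2))
```

Acceptance is then a matter of comparing these statistics with the limits set in the validation protocol (e.g. r close to 1, RSD within the stated precision limit, recovery near 100%).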
Procedia PDF Downloads 360
16618 A Method for Quantifying Arsenolipids in Sea Water by HPLC-High Resolution Mass Spectrometry
Authors: Muslim Khan, Kenneth B. Jensen, Kevin A. Francesconi
Abstract:
Trace amounts (ca 1 µg/L, 13 nM) of arsenic are present in sea water, mostly as the oxyanion arsenate. In contrast, arsenic is present in marine biota (animals and algae) at very high levels (up to 100,000 µg/kg), a significant portion of which occurs as lipid-soluble compounds collectively termed arsenolipids. The complex nature of sea water presents an analytical challenge for detecting trace compounds and monitoring their environmental path. We developed a simple method using liquid-liquid extraction combined with HPLC-high resolution mass spectrometry capable of detecting traces of arsenolipids; the procedure removed > 99 % of the sample matrix while recovering > 80 % of the six target arsenolipids, with a limit of detection of 0.003 µg/L.
Keywords: arsenolipids, sea water, HPLC-high resolution mass spectrometry
Procedia PDF Downloads 364
16617 Determining Optimal Number of Trees in Random Forests
Authors: Songul Cinaroglu
Abstract:
Background: Random forest is an efficient, multi-class machine learning method used for classification, regression and other tasks. The method operates by constructing each tree from a different bootstrap sample of the data. Determining the number of trees in a random forest is an open question in the literature on improving the classification performance of random forests. Aim: The aim of this study is to analyze whether there is an optimal number of trees in random forests and how the performance of random forests differs as the number of trees increases, using sample health data sets in R. Method: We analyzed the performance of random forests as the number of trees grows, doubling the number of trees at every iteration, using the "randomForest" package in R. To determine the minimum and optimal numbers of trees, we performed the McNemar test and computed the area under the ROC curve, respectively. Results: The analysis found that as the number of trees grows, the forest does not always perform better than forests with fewer trees. In other words, a larger number of trees may only increase computational cost without improving performance. Conclusion: Although the general practice with random forests is to generate a large number of trees to obtain high performance, this study shows that increasing the number of trees does not always improve performance. Future studies can compare different kinds of data sets and different performance measures to test whether random forest performance changes as the number of trees increases.
Keywords: classification methods, decision trees, number of trees, random forest
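The McNemar test used above compares two classifiers on the same test set by counting the cases on which exactly one of them is correct. A minimal sketch with made-up predictions (imagine a 10-tree versus a 100-tree forest), not the study's data:

```python
def mcnemar_statistic(y_true, pred_a, pred_b):
    """Chi-square statistic with continuity correction, 1 degree of freedom.

    b counts cases where only model A is correct; c counts cases where
    only model B is correct; concordant cases do not enter the test."""
    b = sum(1 for t, a, p in zip(y_true, pred_a, pred_b) if a == t and p != t)
    c = sum(1 for t, a, p in zip(y_true, pred_a, pred_b) if a != t and p == t)
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)

# made-up predictions: the small forest errs on 9 cases the large one gets
# right, while the large forest errs on 1 case the small one gets right
y_true = [1] * 20
small = [1] * 20
large = [1] * 20
for i in range(1, 10):
    small[i] = 0
large[0] = 0
stat = mcnemar_statistic(y_true, small, large)
print(stat)   # 4.9 > 3.84, so the difference is significant at the 5% level
```

If the statistic stays below the 3.84 critical value as the tree count doubles, the larger forest is not demonstrably better, which is exactly the stopping criterion motivated in the abstract.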
Procedia PDF Downloads 394
16616 Method for Selecting and Prioritising Smart Services in Manufacturing Companies
Authors: Till Gramberg, Max Kellner, Erwin Gross
Abstract:
This paper presents a comprehensive investigation into the topic of smart services and IIoT platforms, focusing on their selection and prioritization in manufacturing organizations. First, a literature review is conducted to provide a basic understanding of the current state of research on smart services. Based on the definitions discussed and established in the literature, a working definition for this paper is developed. In addition, value propositions for smart services are identified from the literature and expert interviews. Furthermore, the general requirements for the provision of smart services are presented. Subsequently, existing approaches for the selection and development of smart services are identified and described. To determine the requirements for the selection of smart services, expert opinions are collected through semi-structured interviews with successful companies that have already implemented smart services. Based on the results, criteria for the evaluation of existing methods are derived, and the existing methods are then evaluated against these criteria. Furthermore, a novel method for the selection of smart services in manufacturing companies is developed, taking into account the identified criteria and the existing approaches. The concept of the method is verified in expert interviews. The method includes a collection of relevant smart services identified in the literature; the actual relevance of these use cases in the industrial environment was validated in an online survey. The required data and sensors are assigned to the smart service use cases, and the value proposition of the use cases is evaluated in an expert workshop using different indicators. On this basis, a comparison is made between the identified value proposition and the required data, leading to a prioritization process. The prioritization process follows an established procedure for evaluating technical decision-making processes.
In addition to the technical requirements, the prioritization process includes other evaluation criteria, such as the economic benefit, the conformity of the new service offering with the company strategy, or the customer retention enabled by the smart service. Finally, the method is applied and validated in an industrial environment. The results of these experiments are critically reflected upon, and an outlook on future developments in the area of smart services is given. This research contributes to a deeper understanding of the selection and prioritization process as well as the technical considerations associated with smart service implementation in manufacturing organizations. The proposed method serves as a valuable guide for decision makers, helping them effectively select the most appropriate smart services for their specific organizational needs.
Keywords: smart services, IIoT, Industrie 4.0, IIoT platform, big data
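The kind of weighted, criteria-based prioritization described above can be sketched in a few lines. The criterion names, weights and 1-5 scores below are purely illustrative assumptions, not the evaluation scheme of the developed method:

```python
def prioritise(candidates, weights):
    """Return smart-service candidates sorted by weighted score, best first."""
    scored = [(name, sum(weights[c] * s for c, s in scores.items()))
              for name, scores in candidates.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)

weights = {                      # relative importance, summing to 1.0
    "economic_benefit": 0.40,
    "strategy_fit": 0.25,
    "customer_retention": 0.20,
    "data_availability": 0.15,
}
candidates = {                   # expert scores, 1 (poor) .. 5 (excellent)
    "predictive maintenance": {
        "economic_benefit": 5, "strategy_fit": 4,
        "customer_retention": 4, "data_availability": 3},
    "remote condition monitoring": {
        "economic_benefit": 3, "strategy_fit": 4,
        "customer_retention": 5, "data_availability": 5},
    "energy usage analytics": {
        "economic_benefit": 3, "strategy_fit": 3,
        "customer_retention": 2, "data_availability": 4},
}
ranking = prioritise(candidates, weights)
for name, score in ranking:
    print(f"{score:.2f}  {name}")
```

In practice the scores would come from the expert workshop and the weights from the company strategy, but the ranking mechanics stay this simple.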
Procedia PDF Downloads 86
16615 Application of Support Vector Machines in Fault Detection and Diagnosis of Power Transmission Lines
Authors: I. A. Farhat, M. Bin Hasan
Abstract:
An approach developed for the protection of power transmission lines using the support vector machine (SVM) technique is presented. In this paper, the SVM technique is utilized for the classification and isolation of faults in power transmission lines. Accurate fault classification and location results are obtained for all possible types of short-circuit faults. As in distance protection, the approach utilizes the post-fault voltage and current samples as inputs. The main advantage of the method introduced here is that it can easily be extended to any power transmission line.
Keywords: fault detection, classification, diagnosis, power transmission line protection, support vector machines (SVM)
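The abstract gives no implementation details, so the following is only a toy illustration of the idea: a linear SVM trained with the Pegasos sub-gradient method (standing in for a full SVM solver) that separates hypothetical post-fault feature vectors, per-unit voltage and current magnitudes, into healthy (-1) and faulted (+1) classes:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Linear SVM via the Pegasos sub-gradient method: minimizes hinge
    loss plus L2 regularization; labels y must be in {-1, +1}."""
    rng = random.Random(seed)
    w, b, t = [0.0] * len(X[0]), 0.0, 0
    for _ in range(epochs):
        for i in rng.sample(range(len(X)), len(X)):   # shuffled passes
            t += 1
            eta = 1.0 / (lam * t)                     # decaying step size
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            w = [wj * (1 - eta * lam) for wj in w]    # regularization shrink
            if margin < 1:                            # inside the margin
                w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
                b += eta * y[i]
    return w, b

def classify(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# hypothetical post-fault samples: (|V| in p.u., |I| in p.u.)
healthy = [(1.00, 0.40), (0.98, 0.45), (1.02, 0.38), (0.97, 0.42)]
faulted = [(0.30, 3.00), (0.25, 3.20), (0.35, 2.80), (0.28, 3.10)]
X = healthy + faulted
y = [-1] * len(healthy) + [1] * len(faulted)
w, b = train_linear_svm(X, y)
preds = [classify(w, b, x) for x in X]
print(preds)
```

A real protection scheme would use kernel SVMs over windows of sampled waveforms and separate classifiers per fault type; this sketch only shows the margin-based decision that makes the classification robust.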
Procedia PDF Downloads 557
16614 Development of a Program for the Evaluation of Thermal Performance Applying the Centre Scientifique et Technique du Bâtiment Method Case Study: Classroom
Authors: Iara Rezende, Djalma Silva, Alcino Costa Neto
Abstract:
Considering the transformations of the contemporary world linked to globalization and the climate changes caused by global warming, environmental and energy issues have become increasingly present in decisions on the world stage. Thus, with the aim of reducing the impacts caused by human activities, energy efficiency measures have emerged, which are also applicable in the scope of civil engineering. Since a large part of the energy demand of buildings is related to the need to adapt the internal environment to users' comfort and productivity, measures capable of reducing this need can minimize both the impacts of climate change and the energy consumption of the building. However, these important measures are currently little used by civil engineers, whether because of the interdisciplinarity of the subject, the time required to apply certain methods, or the difficult interpretation of the results given by computational programs, which often have a complex and seldom applied approach. Therefore, the development of a Java application with a simpler and more applied approach was proposed to evaluate the thermal performance of a building, in order to obtain results capable of assisting civil engineers in decision-making related to users' thermal comfort. The program was built in the Java programming language, and the evaluation method used was that of the Centre Scientifique et Technique du Bâtiment (CSTB). The program was used to evaluate the thermal performance of a university classroom. The analysis was carried out through simulations considering the worst climatic situation during the building's occupation. At the end of the process, a favorable result was obtained regarding the classroom comfort zone and the feasibility of using the program, thus achieving the proposed objectives.
Keywords: building occupation, CSTB method, energy efficiency measures, Java application, thermal comfort
Procedia PDF Downloads 130