Search results for: Analytical methodology
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7184


7004 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)

Authors: Gule Teri

Abstract:

The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach measures variability with explicit reference to tolerance or design margins, providing a systematic, statistically grounded validation technique that improves the precision and trustworthiness of the results. It offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this method, pharmaceutical manufacturers can substantially advance their validation processes, improving the overall quality and safety of their products. This paper delves deeper into the development, application, and advantages of this tolerance-based DoE approach and demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification. This paper also discusses the potential implications and future applications of this method in enhancing pharmaceutical manufacturing practices and outcomes.
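
The abstract does not give the underlying computation, but a minimal sketch of the core idea, comparing the variability estimated from a designed experiment against a tolerance (design margin), might look as follows; the factors, specification limits and data below are hypothetical and only illustrate the kind of check described.

```python
# Minimal sketch (hypothetical data): relate method variability estimated from a
# small designed experiment to the tolerance (design margin) of an HPLC assay.
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-factor, 3-level full factorial with 3 replicates per run:
# factor A = flow-rate deviation, factor B = column-temperature deviation (coded).
levels = [-1, 0, 1]
runs = list(itertools.product(levels, levels))
replicates = 3

# Simulated assay recoveries (% of nominal) for a low-concentration component.
y = np.array([100.0 + 0.4 * a - 0.3 * b + rng.normal(0, 0.35)
              for (a, b) in runs for _ in range(replicates)])
X = np.array([[1, a, b] for (a, b) in runs for _ in range(replicates)])

# Least-squares fit of a first-order model and the residual standard deviation.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid_sd = np.sqrt(np.sum((y - X @ beta) ** 2) / (len(y) - X.shape[1]))

# Tolerance-based check: does the +/- 3*sigma spread of the method fit inside
# the design margin (here a hypothetical 98-102 % recovery specification)?
lower_spec, upper_spec = 98.0, 102.0
tolerance_ratio = 6 * resid_sd / (upper_spec - lower_spec)
print(f"residual sd = {resid_sd:.3f} %, 6-sigma/tolerance = {tolerance_ratio:.2f}")
print("method variability fits within tolerance" if tolerance_ratio < 1
      else "method variability exceeds tolerance")
```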

Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing

Procedia PDF Downloads 40
7003 Optimization of Wear during Dry Sliding Wear of AISI 1042 Steel Using Response Surface Methodology

Authors: Sukant Mehra, Parth Gupta, Varun Arora, Sarvoday Singh, Amit Kohli

Abstract:

The study focused on the dry sliding wear behavior of AISI 1042 steel. Dry sliding wear tests were performed using a pin-on-disk apparatus under normal loads of 5, 7.5 and 10 kgf and at speeds of 600, 750 and 900 rpm. Response surface methodology (RSM) was utilized to find the optimal values of the process parameters, and the experiment was based on a rotatable central composite design (CCD). It was found that the wear followed a linear pattern with load and speed. The optimal process parameters were predicted and verified by confirmation experiments.
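
The experimental data are not reproduced in the abstract, but the workflow it describes, a two-factor rotatable CCD fitted with a second-order response surface and searched for an optimum, can be sketched as below; the factor levels, wear values and coefficients are hypothetical.

```python
# Minimal sketch (hypothetical wear data): fit the second-order RSM model used
# with a rotatable central composite design (CCD) and locate the coded factor
# combination giving minimum predicted wear.
import itertools
import numpy as np

axial = np.sqrt(2)                                   # rotatable axial distance for 2 factors
factorial = list(itertools.product([-1, 1], [-1, 1]))
axial_pts = [(-axial, 0), (axial, 0), (0, -axial), (0, axial)]
center = [(0, 0)] * 5
design = np.array(factorial + axial_pts + center)    # 13-run CCD in coded units

rng = np.random.default_rng(0)
# Hypothetical wear response, roughly linear in load (x1) and speed (x2), plus noise.
wear = 2.0 + 0.6 * design[:, 0] + 0.4 * design[:, 1] + rng.normal(0, 0.05, len(design))

# Second-order model: 1, x1, x2, x1*x2, x1^2, x2^2
def quad_terms(pts):
    return np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                            pts[:, 0] * pts[:, 1], pts[:, 0] ** 2, pts[:, 1] ** 2])

coef, *_ = np.linalg.lstsq(quad_terms(design), wear, rcond=None)

# Evaluate the fitted surface on a grid and report the predicted minimum-wear point.
grid = np.array(list(itertools.product(np.linspace(-1, 1, 41), repeat=2)))
best = grid[np.argmin(quad_terms(grid) @ coef)]
print("fitted coefficients:", np.round(coef, 3))
print("predicted minimum wear at coded (load, speed) =", best)
```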

Keywords: central composite design (CCD), optimization, response surface methodology (RSM), wear

Procedia PDF Downloads 545
7002 An Analytical Wall Function for 2-D Shock Wave/Turbulent Boundary Layer Interactions

Authors: X. Wang, T. J. Craft, H. Iacovides

Abstract:

When handling the near-wall regions of turbulent flows, it is necessary to account for the viscous effects which are important over the thin near-wall layers. Low-Reynolds-number turbulence models do this by including explicit viscous and damping terms which become active in the near-wall regions, and by using very fine near-wall grids to properly resolve the steep gradients present. In order to overcome the cost associated with low-Re turbulence models, a more advanced wall function approach has been implemented within OpenFOAM and tested, together with a standard log-law based wall function, in the prediction of flows which involve 2-D shock wave/turbulent boundary layer interactions (SWTBLIs). On the whole, for the calculation of the impinging shock interaction, the three turbulence modelling strategies, namely the Launder-Sharma k-ε model with Yap correction (LS) and the high-Re k-ε model with either a standard wall function (SWF) or the analytical wall function (AWF), display good predictions of wall pressure. However, the SWF approach tends to underestimate the tendency of the flow to separate as a result of the SWTBLI. The analytical wall function, on the other hand, is able to reproduce the shock-induced flow separation and returns predictions similar to those of the low-Re model, using a much coarser mesh.
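
For reference (textbook background rather than material from the paper), the standard log-law based wall function mentioned above assumes the near-wall velocity follows

u⁺ = U/u_τ = (1/κ) ln(E y⁺),   with   y⁺ = ρ u_τ y / μ,

where κ ≈ 0.41 is the von Kármán constant and E ≈ 9.0 for smooth walls. The analytical wall function instead integrates simplified near-wall transport equations across the wall-adjacent cell, which is what allows it to respond to the strong pressure gradients present in SWTBLIs rather than assuming the equilibrium log-law profile.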

Keywords: SWTBLIs, skin-friction, turbulence modeling, wall function

Procedia PDF Downloads 317
7001 Defining Methodology for Multi Model Software Process Improvement Framework

Authors: Aedah Abd Rahman

Abstract:

Software organisations may implement single or multiple frameworks in order to remain competitive. There is a wide selection of generic Software Process Improvement (SPI) frameworks, best practices and standards, implemented with different focuses and goals. Issues and difficulties emerge in SPI practice in the context of software development and IT Service Management (ITSM). This research looks into the integration of multiple frameworks from the perspective of software development and ITSM. The research question of this study is how to define the steps of a methodology to solve the multi model software process improvement problem. The objective of this study is to define the research approach and methodologies to produce a more integrated and efficient Multi Model Process Improvement (MMPI) solution. A multi-step methodology is used, comprising a case study, framework mapping and a Delphi study. The research outcome has proven the usefulness and appropriateness of the proposed framework in SPI and quality practice in the Malaysian software industry. This mixed method research approach is used to tackle problems from every angle in the context of software development and services. The methodology facilitates the implementation and management of a multi model environment of SPI frameworks across multiple domains.

Keywords: Delphi study, methodology, multi model software process improvement, service management

Procedia PDF Downloads 236
7000 An Analytical Method for Solving General Riccati Equation

Authors: Y. Pala, M. O. Ertas

Abstract:

In this paper, the general Riccati equation is analytically solved by a new transformation. By the method developed, looking at the transformed equation, whether or not an explicit solution can be obtained is readily determined. Since the present method does not require a proper solution in order to construct the general solution, it is especially suitable for equations whose proper solutions cannot be seen at first glance. Since the transformed second-order linear equation obtained by the present transformation has the simplest form it can have, it is immediately seen whether or not the original equation can be solved analytically. The present method is illustrated by several examples.
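
The new transformation itself is not reproduced in the abstract, but for context (a standard result, not the authors' method), the general Riccati equation y' = q_0(x) + q_1(x) y + q_2(x) y^2 is classically reduced to a second-order linear equation by the substitution

y = -u'/(q_2 u),   which gives   u'' - (q_1 + q_2'/q_2) u' + q_0 q_2 u = 0,

so that any criterion for the solvability of the transformed linear equation translates directly into a criterion for the original Riccati equation, which is the spirit of the approach described above.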

Keywords: Riccati equation, analytical solution, proper solution, nonlinear

Procedia PDF Downloads 324
6999 Fully Coupled Porous Media Model

Authors: Nia Mair Fry, Matthew Profit, Chenfeng Li

Abstract:

This work focuses on the development and implementation of a fully implicit-implicit, coupled mechanical deformation and porous flow, finite element software tool. The fully implicit software accurately predicts classical fundamental analytical solutions such as the Terzaghi consolidation problem. Furthermore, it can capture other analytical solutions less well known in the literature, such as Gibson's sedimentation rate problem and Coussy's problems investigating wellbore stability for poroelastic rocks. The mechanical volume strains are transferred to the porous flow governing equation in an implicit framework. This overcomes some of the issues common in current industrial practice, where explicit solvers are used for the mechanical governing equations and implicit solvers only on the porous flow side; that arrangement can potentially lead to instability and non-convergence issues in the coupled system, and to results whose error must be carefully accounted for. The specification of a fully monolithic implicit-implicit coupled porous media code sees the solution of both the seepage and mechanical equations in one matrix system, under a unified time-stepping scheme, which makes the problem definition much easier. When using an explicit solver, additional inputs such as the damping coefficient and mass scaling factor are required; these are circumvented with a fully implicit solution. Further, improved accuracy is achieved as the solution is not dependent on predictor-corrector methods for the pore fluid pressure solution, but at the potential cost of reduced stability. In testing this fully monolithic porous media code, the fully implicit coupled scheme is compared against an existing staggered explicit-implicit coupled scheme across a range of geotechnical problems. These cases include 1) Biot coefficient calculation, 2) consolidation theory with the Terzaghi analytical solution, 3) sedimentation theory with the Gibson analytical solution, and 4) Coussy's wellbore poroelastic analytical solutions.
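
For reference, the Terzaghi benchmark mentioned above is governed by the one-dimensional consolidation equation for the excess pore pressure p(z, t),

∂p/∂t = c_v ∂²p/∂z²,   with   c_v = k / (m_v γ_w),

where c_v is the coefficient of consolidation, k the permeability, m_v the coefficient of volume compressibility and γ_w the unit weight of water. Its classical series solution for a uniformly loaded, doubly drained layer is the standard yardstick against which the pore-pressure dissipation predicted by a coupled code is checked.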

Keywords: coupled, implicit, monolithic, porous media

Procedia PDF Downloads 101
6998 Comprehensive Validation of High-Performance Liquid Chromatography-Diode Array Detection (HPLC-DAD) for Quantitative Assessment of Caffeic Acid in Phenolic Extracts from Olive Mill Wastewater

Authors: Layla El Gaini, Majdouline Belaqziz, Meriem Outaki, Mariam Minhaj

Abstract:

In this study, we introduce and validate a high-performance liquid chromatography method with diode-array detection (HPLC-DAD) specifically designed for the accurate quantification of caffeic acid in phenolic extracts obtained from olive mill wastewater. The separation of caffeic acid was effectively achieved using an Acclaim Polar Advantage column (5 µm, 250 x 4.6 mm). A multi-step gradient mobile phase was employed, comprising water acidified with phosphoric acid (pH 2.3) and acetonitrile, to ensure optimal separation. Diode-array detection was conducted within the UV-VIS spectrum, spanning a range of 200-800 nm, which facilitated precise analytical results. The method underwent comprehensive validation, addressing several essential analytical parameters, including specificity, repeatability and linearity, as well as the limits of detection and quantification and measurement uncertainty. The linear standard curves displayed high correlation coefficients, underscoring the method's efficacy and consistency. This validated approach is not only robust but also demonstrates exceptional reliability for the analysis of caffeic acid within the intricate matrices of wastewater, thus offering significant potential for applications in environmental and analytical chemistry.
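
The acceptance calculations are not listed in the abstract, but the standard calibration-based figures of merit used in this kind of validation (ICH-style LOD = 3.3 σ/S and LOQ = 10 σ/S) can be sketched as follows; the calibration concentrations and peak areas below are hypothetical.

```python
# Minimal sketch (hypothetical data): linearity, LOD and LOQ for a caffeic acid
# calibration curve, using the ICH formulas LOD = 3.3*sigma/S and LOQ = 10*sigma/S.
import numpy as np

conc = np.array([1.0, 2.5, 5.0, 10.0, 25.0, 50.0])        # mg/L (hypothetical)
area = np.array([12.1, 30.8, 61.5, 122.0, 305.4, 610.9])  # peak area (hypothetical)

# Ordinary least-squares regression of peak area on concentration.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
residual_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
r = np.corrcoef(conc, area)[0, 1]

lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}, r^2 = {r**2:.5f}")
print(f"LOD = {lod:.3f} mg/L, LOQ = {loq:.3f} mg/L")
```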

Keywords: high-performance liquid chromatography (HPLC-DAD), caffeic acid analysis, olive mill wastewater phenolics, analytical method validation

Procedia PDF Downloads 28
6997 A Multi-Family Offline SPE LC-MS/MS Analytical Method for Anionic, Cationic and Non-ionic Surfactants in Surface Water

Authors: Laure Wiest, Barbara Giroud, Azziz Assoumani, Francois Lestremau, Emmanuelle Vulliet

Abstract:

Due to their production at high tonnages and their extensive use, surfactants are among the contaminants determined at the highest concentrations in wastewater. However, analytical methods and data regarding their occurrence in river water are scarce and concern only a few families, mainly anionic surfactants. The objective of this study was to develop an analytical method to extract and analyze a wide variety of surfactants in a minimum of steps, with a sensitivity compatible with the detection of ultra-traces in surface waters. 27 substances from 12 families of surfactants (anionic, cationic and non-ionic) were selected for method optimization. Different retention mechanisms for extraction by solid phase extraction (SPE) were tested and compared in order to improve their detection by liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). The best results were finally obtained with a C18-grafted silica LC column and a polymer cartridge with hydrophilic-lipophilic balance (HLB), and the method developed allows the extraction of the three types of surfactants with satisfactory recoveries. The final analytical method comprises only one extraction and two LC injections. It was validated and applied for the quantification of surfactants in 36 river samples. The method's limits of quantification (LQ), intra- and inter-day precision and accuracy were evaluated, and good performance was obtained for the 27 substances. As these compounds have many areas of application, contamination of instrument and method blanks was observed and taken into account in the determination of the LQ. Nevertheless, with LQ between 15 and 485 ng/L and accuracy over 80%, the method is suitable for monitoring surfactants in surface waters. Application to French river samples revealed the presence of anionic, cationic and non-ionic surfactants with median concentrations ranging from 24 ng/L for octylphenol ethoxylates (OPEO) to 4.6 µg/L for linear alkylbenzenesulfonates (LAS). The analytical method developed in this work will therefore be useful for future monitoring of surfactants in waters. Moreover, as it shows good performance for anionic, non-ionic and cationic surfactants, it may easily be adapted to other surfactants.

Keywords: anionic surfactant, cationic surfactant, LC-MS/MS, non-ionic surfactant, SPE, surface water

Procedia PDF Downloads 107
6996 Miniaturizing the Volumetric Titration of Free Nitric Acid in U(VI) Solutions: On the Lookout for a More Sustainable Process Radioanalytical Chemistry through Titration-On-A-Chip

Authors: Jose Neri, Fabrice Canto, Alastair Magnaldo, Laurent Guillerme, Vincent Dugas

Abstract:

A miniaturized and automated approach for the volumetric titration of free nitric acid in U(VI) solutions is presented. Free acidity measurement refers to the quantification of acidity in solutions containing hydrolysable heavy metal ions such as U(VI), U(IV) or Pu(IV), without taking into account the acidity contribution from the hydrolysis of such metal ions. It is, in fact, an operation with an essential role in the control of the nuclear fuel recycling process. The main objectives behind the technical optimization of the current 'beaker' method were to reduce the amount of radioactive substance handled by the laboratory personnel, to ease instrument adjustability within a glove-box environment and to allow high-throughput analysis for more cost-effective operations. The measurement technique is based on the concept of Taylor-Aris dispersion, in order to create a linear concentration gradient inside a 200 μm x 5 cm circular cylindrical micro-channel in less than a second. The proposed analytical methodology relies on actinide complexation using a pH 5.6 sodium oxalate solution and subsequent alkalimetric titration of nitric acid with sodium hydroxide. The titration process is followed with a CCD camera for fluorescence detection; the neutralization boundary can be visualized in a detection range of 500-600 nm thanks to the addition of a pH-sensitive fluorophore. The operating principle of the developed device allows the active generation of linear concentration gradients using a single cylindrical micro-channel. This feature simplifies the fabrication and ease of use of the micro-device, as it does not need a complex micro-channel network or passive mixers to generate the chemical gradient. Moreover, since the linear gradient is determined by the input pressure of the liquid reagents, it can be generated in under one second, making the process more time-efficient than other source-sink passive diffusion devices. The resulting linear gradient generator was therefore adapted to perform, for the first time, a volumetric titration on a chip in which the amount of reagents used is fixed by the total volume of the micro-channel, avoiding the substantial waste generated by other flow-based titration techniques. The associated analytical method is automated, and its linearity has been proven for the free acidity determination of U(VI) samples containing up to 0.5 M of actinide ion and nitric acid in a concentration range of 0.5 M to 3 M. In addition to automation, the developed analytical methodology and technique greatly improve on the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousand-fold, the nuclear waste per analysis forty-fold and the analysis time eight-fold. The developed device therefore represents a great step towards an easy-to-handle nuclear-related application, which in the short term could be used to improve laboratory safety as much as to reduce the environmental impact of the radioanalytical chain.
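
For context (a standard result rather than a detail from the paper), the Taylor-Aris mechanism exploited here describes how pressure-driven laminar flow in a circular channel of radius a, with mean velocity U and molecular diffusivity D, disperses a solute axially with an effective coefficient

D_eff = D + a² U² / (48 D),

so that a sharp front injected at the channel inlet is rapidly smeared into a smooth concentration gradient along the channel, which is what the device exploits in place of passive mixers or channel networks to generate its linear gradient.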

Keywords: free acidity, lab-on-a-chip, linear concentration gradient, Taylor-Aris dispersion, volumetric titration

Procedia PDF Downloads 362
6995 Uneven Habitat Characterisation by Using Geo-Gebra Software in the Lacewings (Insecta: Neuroptera), Knowing When to Calculate the Habitat: Creating More Informative Ecological Experiments

Authors: Hakan Bozdoğan

Abstract:

A wide variety of traditional methodologies has been developed for characterising smooth habitats in order to address different environmental objectives. In this study, the habitats were characterised based on size and shape using GeoGebra software, as an innovative approach to habitat characterisation in lacewing species. The approach is demonstrated using the example of 'surface area' as an analytical concept, the goal being to increase clarity for researchers and to improve the quality of research in the survey area. In conclusion, habitat characterisation using this mathematical programme offers a unique potential for collecting more comprehensible and analytical information about irregularly shaped areas beyond the range of direct observation methods. This research contributes a new perspective for assessing habitat structure, providing a novel mathematical tool for the research and management of such habitats and environments. Further surveys should be undertaken at additional sites within the Amanos Mountains for a comprehensive assessment of lacewing habitat characterisation in an analytical plane. This paper is supported by the Ahi Evran University Scientific Research Projects Coordination Unit, Projects No. TBY.E2.17.001 and TBY.A4.16.001.

Keywords: uneven habitat shape, habitat assessment, lacewings, Geo-Gebra Software

Procedia PDF Downloads 251
6994 Analytical Solution for Multi-Segmented Toroidal Shells under Uniform Pressure

Authors: Nosakhare Enoma, Alphose Zingoni

Abstract:

The requirements for various toroidal shell forms are increasing due to new applications, available storage space and the consideration of appearance. Because of the complexity of some of these structural forms, the finite element method is nowadays mainly used for their analysis, even for simple static studies. This paper presents an easy-to-use analytical algorithm for pressurized multi-segmented toroidal shells of revolution. The membrane solution, which acts as a particular solution of the bending-theory equations, is developed based on the membrane theory of shells, and a general approach is formulated for quantifying discontinuity effects at the shell junctions using the well-known Geckeler approximation. On superimposing these effects and applying the ensuing solution to the problem of a pressurized toroid with four segments, closed-form stress results are obtained for the entire toroid. A numerical example is carried out using the developed method. The analytical results obtained show excellent agreement with those from the finite element method, indicating that the proposed method can also be used for complementing and verifying FEM results, and for providing insight into other related problems.

Keywords: bending theory of shells, membrane hypothesis, pressurized toroid, segmented toroidal vessel, shell analysis

Procedia PDF Downloads 288
6993 Analytical Study of Cobalt(II) and Nickel(II) Extraction with Salicylidene O-, M-, and P-Toluidine in Chloroform

Authors: Sana Almi, Djamel Barkat

Abstract:

The solvent extraction of cobalt(II) and nickel(II) from aqueous sulfate solutions was investigated by the analytical method of slope analysis, using salicylidene aniline and the three isomers o-, m- and p-salicylidene toluidine diluted in chloroform at 25°C. By statistical analysis of the extraction data, it was concluded that the extracted species are CoL2 (together with CoL2(HL)) and NiL2 (HL denotes HSA, HSOT, HSMT and HSPT). The extraction efficiency of Co(II) was higher than that of Ni(II). This tendency is confirmed by the numerical extraction constants obtained for each metal cation. The extraction efficiency followed the order HSMT > HSPT > HSOT > HSA for both Co2+ and Ni2+.

Keywords: solvent extraction, nickel(II), cobalt(II), salicylidene aniline, o-, m-, and p-salicylidene toluidine

Procedia PDF Downloads 454
6992 Flexural Analysis of Symmetric Laminated Composite Timoshenko Beams under Harmonic Forces: An Analytical Solution

Authors: Mohammed Ali Hjaji, A.K. El-Senussi, Said H. Eshtewi

Abstract:

The flexural dynamic response of symmetric laminated composite beams subjected to general transverse harmonic forces is investigated. The dynamic equations of motion and associated boundary conditions based on the first order shear deformation are derived through the use of Hamilton’s principle. The influences of shear deformation, rotary inertia, Poisson’s ratio and fibre orientation are incorporated in the present formulation. The resulting governing flexural equations for symmetric composite Timoshenko beams are exactly solved and the closed form solutions for steady state flexural response are then obtained for cantilever and simply supported boundary conditions. The applicability of the analytical closed-form solution is demonstrated via several examples with various transverse harmonic loads and symmetric cross-ply and angle-ply laminates. Results based on the present solution are assessed and validated against other well established finite element solutions and exact solutions available in the literature.
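
For reference (standard first-order shear deformation theory, not the paper's closed-form results), the coupled flexural equations of motion solved for such beams have the Timoshenko form

ρA ∂²w/∂t² = ∂/∂x [ κGA (∂w/∂x - φ) ] + q(x, t),
ρI ∂²φ/∂t² = ∂/∂x ( EI ∂φ/∂x ) + κGA (∂w/∂x - φ),

where w is the transverse deflection, φ the cross-section rotation and κ the shear correction factor. For a symmetric laminate the bending stiffness EI and shear stiffness κGA are replaced by the equivalent laminate stiffnesses obtained from first-order shear deformation (lamination) theory, and a harmonic load q(x, t) = Q(x) e^{iωt} leads to the steady-state closed-form solutions discussed above.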

Keywords: analytical solution, flexural response, harmonic forces, symmetric laminated beams, steady state response

Procedia PDF Downloads 460
6991 Screening Methodology for Seismic Risk Assessment of Aging Structures in Oil and Gas Plants

Authors: Mohammad Nazri Mustafa, Pedram Hatami Abdullah, M. Fakhrur Razi Ahmad Faizul

Abstract:

With the issuance of the Malaysian National Annex 2017 as a part of MS EN 1998-1:2015, the seismic mapping of Peninsular Malaysia as well as Sabah and Sarawak has undergone some changes in terms of the Peak Ground Acceleration (PGA) values. The revision to the PGA has raised concerns about the safety of onshore oil and gas structures, as these structures were not designed to accommodate the new PGA values, which are much higher than the values used in the original designs. In view of the high number of structures and buildings to be re-assessed, a risk assessment methodology has been developed to prioritize and rank the assets in terms of their criticality against the new seismic loading. To date, such a risk assessment method for onshore oil and gas structures has been lacking, and the main intention of this technical paper is to share the risk assessment methodology and the risk element scoring finalized via the Delphi method. The finalized methodology and the values used to rank the risk elements have been established based on years of relevant experience with the subject matter and on a series of rigorous discussions with professionals in the industry. The risk scoring is mapped against the risk matrix (i.e., likelihood of failure, LOF, versus consequence of failure, COF), and hence the overall risk for the assets can be obtained. The overall risk can be used to prioritize and optimize integrity assessment, repair and strengthening work against the new seismic mapping of the country.
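
As an illustration only (the actual scoring weights in the paper were fixed by Delphi consensus and are not reproduced in the abstract), the final prioritisation step of mapping LOF and COF scores onto a risk matrix and ranking assets can be sketched as follows; the asset names and scores are hypothetical.

```python
# Minimal sketch (hypothetical scores): rank plant structures by mapping
# likelihood-of-failure (LOF) and consequence-of-failure (COF) scores onto a
# simple 5x5 risk matrix.
assets = {
    "pipe rack A":      {"LOF": 4, "COF": 5},
    "flare structure":  {"LOF": 2, "COF": 4},
    "control building": {"LOF": 3, "COF": 5},
    "tank T-101":       {"LOF": 1, "COF": 3},
}

def risk_category(lof: int, cof: int) -> str:
    """Classify the LOF x COF product into low / medium / high bands."""
    score = lof * cof
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

ranked = sorted(assets.items(), key=lambda kv: kv[1]["LOF"] * kv[1]["COF"], reverse=True)
for name, s in ranked:
    print(f"{name:16s} LOF={s['LOF']} COF={s['COF']} "
          f"risk={s['LOF'] * s['COF']:2d} ({risk_category(s['LOF'], s['COF'])})")
```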

Keywords: methodology, PGA, risk, seismic

Procedia PDF Downloads 122
6990 Strategic Tools for Entrepreneurship: Model Proposal for Manufacturing Companies

Authors: Chiara Mansanta, Daniela Sani

Abstract:

This paper presents a further development of the application of a standard methodology to boost innovation, based on real case studies of manufacturing companies. The proposed methodology provides a viable solution for manufacturing companies that have to evaluate new business ideas. The study underlines the concept of entrepreneurship and how a manager can use it to promote innovation inside a company. Starting from a literature study on entrepreneurship, this paper examines the role of the manager in supporting a company's development. The empirical part of the study is based on two manufacturing companies that used the proposed methodology to foster entrepreneurship through an alternative approach. The research demonstrated the need for companies to have a structured and well-defined methodology to achieve their goals. The purpose of this article is to understand the significance of business models inside companies and to explore how they affect business strategy and innovation management. The idea is to use business models to support entrepreneurs in their decision-making processes, reducing risks and avoiding errors.

Keywords: entrepreneurship, manufacturing companies, solution validation, strategic management

Procedia PDF Downloads 60
6989 Establishment of the Regression Uncertainty of the Critical Heat Flux Power Correlation for an Advanced Fuel Bundle

Authors: L. Q. Yuan, J. Yang, A. Siddiqui

Abstract:

A new regression uncertainty analysis methodology was applied to determine the uncertainties of the critical heat flux (CHF) power correlation for an advanced 43-element bundle design, which was developed by Canadian Nuclear Laboratories (CNL) to achieve improved economics, resource utilization and energy sustainability. The new methodology is considered more appropriate than the traditional methodology in the assessment of the experimental uncertainty associated with regressions. The methodology was first assessed using both the Monte Carlo Method (MCM) and the Taylor Series Method (TSM) for a simple linear regression model, and then extended successfully to a non-linear CHF power regression model (CHF power as a function of inlet temperature, outlet pressure and mass flow rate). The regression uncertainty assessed by MCM agrees well with that by TSM. An equation to evaluate the CHF power regression uncertainty was developed and expressed as a function of independent variables that determine the CHF power.
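
The CHF correlation itself is specific to the study, but the two uncertainty-propagation tools it compares can be illustrated on a simple linear regression with hypothetical data: the Monte Carlo Method resamples the noisy observations and refits, while the Taylor Series Method propagates the input variance through first-order (analytical) sensitivities. The sketch below is illustrative only.

```python
# Minimal sketch (hypothetical data): uncertainty of a regression prediction,
# estimated by (a) the Monte Carlo Method (MCM) and (b) the first-order
# Taylor Series Method (TSM), for a simple linear fit y = b0 + b1*x.
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(250.0, 310.0, 12)        # e.g. inlet temperature (hypothetical)
sigma_y = 15.0                           # known measurement scatter on y
y = 50.0 + 3.0 * (x - 250.0) + rng.normal(0.0, sigma_y, x.size)
x0 = 290.0                               # prediction point of interest

# (a) Monte Carlo: refit the regression on many perturbed data sets.
preds = []
for _ in range(20000):
    y_mc = y + rng.normal(0.0, sigma_y, x.size)
    b1, b0 = np.polyfit(x, y_mc, 1)      # slope, intercept
    preds.append(b0 + b1 * x0)
u_mcm = np.std(preds, ddof=1)

# (b) Taylor series (analytical) propagation for an OLS prediction at x0.
n = x.size
sxx = np.sum((x - x.mean()) ** 2)
u_tsm = sigma_y * np.sqrt(1.0 / n + (x0 - x.mean()) ** 2 / sxx)
print(f"MCM uncertainty = {u_mcm:.2f},  TSM uncertainty = {u_tsm:.2f}")
```

With a linear model the two estimates agree closely, which mirrors the agreement between MCM and TSM reported above before the method is extended to the non-linear CHF power correlation.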

Keywords: CHF experiment, CHF correlation, regression uncertainty, Monte Carlo Method, Taylor Series Method

Procedia PDF Downloads 388
6988 Smart Demand Response: A South African Pragmatic, Non-Destructive and Alternative Advanced Metering Infrastructure-Based Maximum Demand Reduction Methodology

Authors: Christo Nicholls

Abstract:

The National Electricity Grid (NEG) in South Africa has been under strain for the last five years. This overburdening of the NEG led Eskom (the state-owned entity responsible for the NEG) to implement a blunt methodology, called load shedding, to reduce the maximum demand (MD) on the NEG when required. The challenge of this methodology is that it not only leads to immense technical issues with distribution network equipment (e.g., transformers) due to the frequent abrupt off and on switching, but also has a broader negative fiscal impact on the distributors, as their key consumers (commercial and industrial) are now defecting from the grid due to the lack of Electricity Security Provision (ESP). This paper provides a pragmatic alternative methodology utilizing specific functionalities embedded within direct-connect single- and three-phase Advanced Metering Infrastructure (AMI) solutions deployed within the distribution network, in conjunction with a multi-agent-systems-based AI implementation focused on automated-negotiation peer-to-peer trading. The results of this research clearly illustrate that the methodology not only provides a quantifiable percentage contribution towards the NEG MD at the point of consideration, but also allows the distributor to leverage real-time MD data from key consumers to activate complex, yet impact-measurable, demand response (DR) programs.

Keywords: AI, AMI, demand response, multi-agent

Procedia PDF Downloads 85
6987 Colloquialism in Audiovisual Translation: English Subtitling of the Lebanese Film Capernaum as a Case Study

Authors: Fatima Saab

Abstract:

This paper attempts to study colloquialism in audio-visual translation, with particular emphasis on investigating the difficulties and challenges encountered by subtitlers in translating Lebanese colloquial into English. To achieve the main objectives of this study, an ample and thorough cultural and translational analysis of examples drawn from the subtitled movie Capernaum is presented in order to identify the strategies used to overcome cultural barriers and differences and to show the process of decision-making by the translator. Special attention is also given to explaining the technicalities of translating subtitles and how they affect the translation process. The research is a descriptive analytical study whereby the writer sets out empirical observations, consisting of a descriptive and analytical examination of the difficulties and problems associated with translating Arabic colloquialisms, specifically Lebanese, into English in the subtitled film Capernaum. The research methodology utilizes a qualitative approach to group the selected data into the subtitling strategies presented by Gottlieb, under the domesticating or foreignizing strategies according to Venuti's model. It is shown that producing the same meanings for a foreign audience is not an easy task. The background of cultural elements and the stories that make up the history and mindset of the Lebanese and Arab peoples leads to the use of the transfer and paraphrase methodologies most of the time (81% of the sample used for analysis). The research shows that translating and subtitling colloquialism requires special skills from translators to overcome the challenges imposed by the limited presentation space as well as by cultural differences. Translation of colloquial Arabic/Lebanese can be achieved to a certain extent, and the meaning and effect of the source language culture are delivered insofar as the translator investigates and relates to the target culture.

Keywords: Lebanese colloquial, audio-visual translation, subtitling, Capernaum

Procedia PDF Downloads 120
6986 Analytical Solutions for Geodesic Acoustic Eigenmodes in Tokamak Plasmas

Authors: Victor I. Ilgisonis, Ludmila V. Konovaltseva, Vladimir P. Lakhin, Ekaterina A. Sorokina

Abstract:

The analytical solutions for geodesic acoustic eigenmodes in tokamak plasmas with circular concentric magnetic surfaces are found. In the frame of ideal magnetohydrodynamics the dispersion relation taking into account the toroidal coupling between electrostatic perturbations and electromagnetic perturbations with poloidal mode number |m| = 2 is derived. In the absence of such a coupling the dispersion relation gives the standard continuous spectrum of geodesic acoustic modes. The analysis of the existence of global eigenmodes for plasma equilibria with both off-axis and on-axis maximum of the local geodesic acoustic frequency is performed.

Keywords: tokamak, MHD, geodesic acoustic mode, eigenmode

Procedia PDF Downloads 705
6985 Analysis of Effects of Magnetic Slot Wedges on Characteristics of Permanent Magnet Synchronous Machine

Authors: B. Ladghem Chikouche

Abstract:

The influence of slot wedge permeability on the electromagnetic performance of a three-phase permanent magnet synchronous machine is investigated in this paper. It is shown that the back-EMF waveform, the electromagnetic torque and the electromagnetic torque ripple are all significantly affected by the slot wedge permeability. The paper presents an accurate analytical subdomain model, which is confirmed by finite-element analyses.

Keywords: exact analytical calculation, finite-element method, magnetic field distribution, permanent magnet machines performance, stator slot wedges permeability

Procedia PDF Downloads 301
6984 A Prediction Model Using the Price Cyclicality Function Optimized for Algorithmic Trading in Financial Market

Authors: Cristian Păuna

Abstract:

Since the widespread adoption of electronic trading, automated trading systems have become a significant part of the business intelligence system of any modern financial investment company. A large share of trades is now made completely automatically by computers using mathematical algorithms. The trading decisions are taken almost instantly by logical models, and the orders are sent by low-latency automatic systems. This paper presents a real-time price prediction methodology designed especially for algorithmic trading. Based on the price cyclicality function, the methodology generates price cyclicality bands to predict the optimal levels for entries and exits. In order to automate the trading decisions, the cyclicality bands generate automated trading signals. We have found that the model can be used with good results to predict changes in market behavior. Using these predictions, the model can automatically adapt the trading signals in real time to maximize the trading results. The paper reveals the methodology used to optimize and implement this model in automated trading systems. Tests prove that this methodology can be applied with good efficiency in different timeframes. Real trading results are also displayed and analyzed in order to qualify the methodology and to compare it with other models. In conclusion, it was found that the price prediction model using the price cyclicality function is a reliable trading methodology for algorithmic trading in the financial market.

Keywords: algorithmic trading, automated trading systems, financial markets, high-frequency trading, price prediction

Procedia PDF Downloads 151
6983 Generalization of Clustering Coefficient on Lattice Networks Applied to Criminal Networks

Authors: Christian H. Sanabria-Montaña, Rodrigo Huerta-Quintanilla

Abstract:

A lattice network is a special type of network in which all nodes have the same number of links and the boundary conditions are periodic. The most basic lattice network is the ring, a one-dimensional network with periodic boundary conditions. In contrast, the Cartesian product of d rings forms a d-dimensional lattice network. An analytical expression currently exists for the clustering coefficient in this type of network, but the theoretical value is valid only up to a certain connectivity value; in other words, the analytical expression is incomplete. Here we obtain analytically the clustering coefficient expression in d-dimensional lattice networks for any link density. Our analytical results show that the clustering coefficient of a lattice network with a link density tending to 1 approaches the value of the clustering coefficient of a fully connected network. We developed a model in criminology in which the generalized clustering coefficient expression is applied. The model states that delinquents learn the know-how of the crime business by sharing knowledge, directly or indirectly, with their friends in the gang. This generalization sheds light on network properties, which is important for developing new models in different fields where network structure plays an important role in the system dynamics, such as criminology, evolutionary game theory and econophysics, among others.
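
As a small check of the known one-dimensional base case that the paper generalises (the d-dimensional expression itself is not given in the abstract), the clustering coefficient of a ring lattice in which every node is linked to its K nearest neighbours matches the classical formula 3(K-2)/(4(K-1)); a short sketch using networkx:

```python
# Minimal sketch: verify the classical clustering coefficient of a 1-D ring
# lattice, C = 3(K-2)/(4(K-1)), where every node links to its K nearest
# neighbours (K/2 on each side). This is the base case that the paper
# generalises to d dimensions and arbitrary link density.
import networkx as nx

n = 1000                      # number of nodes
for k in (4, 6, 10, 20):      # even degrees
    ring = nx.watts_strogatz_graph(n, k, p=0.0)   # p=0 keeps the pure ring lattice
    measured = nx.average_clustering(ring)
    theory = 3 * (k - 2) / (4 * (k - 1))
    print(f"K={k:2d}: measured C={measured:.4f}, theory C={theory:.4f}")
```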

Keywords: clustering coefficient, criminology, generalized, regular network d-dimensional

Procedia PDF Downloads 377
6982 Using Soft Systems Methodology in the Healthcare Industry of Mauritius

Authors: Arun Kumar, Neelesh Haulder

Abstract:

This paper identifies and resolves some key issues relating to a specific aspect of the supply chain logistics of the public health care industry in the Republic of Mauritius. The analysis and the proposed solution are performed using soft systems methodology (SSM). Through the application of this systematic approach to problem solving, the aim is to obtain an in-depth analysis of the problem, incorporating every possible world view of the problem, and consequently to obtain a well-explored solution aimed at implementing relevant changes within the current supply chain logistics of the health care industry, with the purpose of tackling the key identified issues.

Keywords: soft systems methodology, CATWOE, healthcare, logistics

Procedia PDF Downloads 485
6981 Pricing the Risk Associated to Weather of Variable Renewable Energy Generation

Authors: Jorge M. Uribe

Abstract:

We propose a methodology for setting the price of an insurance contract targeted at managing the risk associated with the weather conditions that affect variable renewable energy generation. The methodology relies on conditional quantile regressions to estimate the weather risk of a solar panel. It is illustrated using real daily radiation and weather data for three cities in Spain (Valencia, Barcelona and Madrid) from February 2, 2004 to January 22, 2019. We also adapt the concepts of value at risk and expected shortfall from finance to this context, to provide a complete panorama of what we label weather risk. The methodology is easy to implement and can be used by insurance companies to price a contract with the aforementioned characteristics when data about similar projects and accurate cash flow projections are lacking. Our methodology assigns a higher price to an insurance product with the stated characteristics in Madrid than in Valencia and Barcelona. This is consistent with Madrid showing the largest interquartile range of operational deficits, and it is unrelated to the average deficit, which illustrates the importance of our proposal.
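
The paper's estimates rely on real radiation records, but the core computation it describes, a conditional quantile regression of the generation deficit read together with value-at-risk and expected-shortfall measures, can be sketched on synthetic data; the variable names and model form below are illustrative, not the authors' specification.

```python
# Minimal sketch (synthetic data): weather risk of a solar asset via conditional
# quantile regression, plus empirical value-at-risk (VaR) and expected shortfall (ES).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2000
cloud = rng.uniform(0.0, 1.0, n)                                   # hypothetical cloud-cover index
radiation = 25.0 * (1.0 - 0.7 * cloud) + rng.normal(0.0, 3.0, n)   # MJ/m^2/day
deficit = np.maximum(20.0 - radiation, 0.0)                        # shortfall vs. a design level

df = pd.DataFrame({"deficit": deficit, "cloud": cloud})

# Conditional 95th percentile of the deficit given cloud cover (a conditional VaR).
q95 = smf.quantreg("deficit ~ cloud", df).fit(q=0.95)
print(q95.params)

# Unconditional empirical VaR and expected shortfall at the 95 % level.
var_95 = np.quantile(deficit, 0.95)
es_95 = deficit[deficit >= var_95].mean()
print(f"VaR(95%) = {var_95:.2f}, ES(95%) = {es_95:.2f} (same units as the deficit)")
```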

Keywords: insurance, weather, vre, risk

Procedia PDF Downloads 120
6980 Experimental and Analytical Investigation of Seismic Behavior of Concrete Beam-Column Joints Strengthened by Fiber-Reinforced Polymers Jacketing

Authors: Ebrahim Zamani Beydokhti, Hashem Shariatmadar

Abstract:

This paper presents an experimental and analytical investigation of the behavior of retrofitted beam-column joints subjected to reversed cyclic loading. The experimental program comprises 8 external beam-column joint subassemblages tested in two phases: a damaging phase and a repairing phase. The beam-column joints were not seismically designed, i.e. the joint, beam and column critical zones had no special transverse stirrups. The joints had been tested under cyclic loading in previous research. The experimental results were then compared with analytical results obtained from modeling in the OpenSees software. The presence of a lateral slab and the magnitude of the axial load were investigated analytically. The results showed that increasing the axial load and the presence of a lateral slab increased the joint capacity. The presence of a lateral slab increased the dissipated energy, while the axial load had no significant effect on it.

Keywords: concrete beam-column joints, CFRP sheets, lateral slab, axial load

Procedia PDF Downloads 116
6979 Modeling of Austenitic Stainless Steel during Face Milling Using Response Surface Methodology

Authors: A. A. Selaimia, H. Bensouilah, M. A. Yallese, I. Meddour, S. Belhadi, T. Mabrouki

Abstract:

The objective of this work is to model the output responses, namely surface roughness (Ra) and cutting force (Fc), during the face milling of the austenitic stainless steel X2CrNi18-9 with coated carbide tools (GC4040). For this purpose, response surface methodology (RSM) is used to determine the influence of each technological parameter. A full factorial design (L27) is chosen for the experiments, and ANOVA is used to evaluate the influence of the technological cutting parameters, namely cutting speed (Vc), feed per tooth (fz) and depth of cut (ap), on the output responses. The results reveal that Ra is mostly influenced by fz, while Fc is considerably affected by ap.
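
The measured data are not reproduced in the abstract, but the analysis pipeline described, a three-factor, three-level full factorial (27 runs) fitted with a second-order response surface and screened by ANOVA, can be sketched as follows; the factor levels and roughness values below are hypothetical.

```python
# Minimal sketch (hypothetical data): ANOVA of a 3^3 full factorial for surface
# roughness Ra as a function of cutting speed Vc, feed per tooth fz and depth of
# cut ap, alongside a quadratic RSM fit.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
Vc_levels = [180.0, 280.0, 380.0]   # m/min    (hypothetical levels)
fz_levels = [0.08, 0.12, 0.16]      # mm/tooth
ap_levels = [0.5, 1.0, 1.5]         # mm

rows = []
for v, f, a in itertools.product(Vc_levels, fz_levels, ap_levels):
    ra = 0.4 + 6.0 * f + 0.05 * a - 0.0004 * v + rng.normal(0.0, 0.03)
    rows.append({"Vc": v, "fz": f, "ap": a, "Ra": ra})
df = pd.DataFrame(rows)

# Second-order response surface model and its ANOVA table.
model = smf.ols(
    "Ra ~ Vc + fz + ap + I(Vc**2) + I(fz**2) + I(ap**2) + Vc:fz + Vc:ap + fz:ap",
    data=df).fit()
print(anova_lm(model, typ=2))
```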

Keywords: austenitic stainless steel, ANOVA, coated carbide, response surface methodology (RSM)

Procedia PDF Downloads 342
6978 Step Method for Solving Nonlinear Two Delays Differential Equation in Parkinson’s Disease

Authors: H. N. Agiza, M. A. Sohaly, M. A. Elfouly

Abstract:

Parkinson's disease (PD) is a heterogeneous disorder with a common age of onset, symptoms and progression levels. In this paper, we solve the PD model analytically, as a nonlinear delay differential equation, using the step method (method of steps). The step method transforms a system of delay differential equations (DDEs) into a sequence of systems of ordinary differential equations (ODEs). For some numerical examples, the analytical solution is difficult to obtain, so we approximate it by applying the Picard method and the Taylor method to the resulting ODEs.
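
The PD model itself is not given in the abstract, so the sketch below applies the step-method idea to a generic single-delay equation y'(t) = -y(t - τ): on each interval of length τ the delayed term is a known function of the previous interval's solution, so the DDE reduces to an ODE that a standard integrator can handle. The equation and history are illustrative only.

```python
# Minimal sketch: the method of steps for the single-delay equation
#   y'(t) = -y(t - tau),   with history y(t) = 1 for t <= 0.
# On [k*tau, (k+1)*tau] the delayed term is known from the previous interval,
# so the DDE becomes an ordinary ODE solved with solve_ivp.
import numpy as np
from scipy.integrate import solve_ivp

tau = 1.0
history = lambda t: 1.0            # prescribed history for t <= 0
segments = [history]               # one interpolant per interval, starting with the history
t0, y0 = 0.0, history(0.0)

for k in range(5):                 # integrate over five delay intervals
    delayed = segments[-1]         # solution on the previous interval
    rhs = lambda t, y: [-float(delayed(t - tau))]
    sol = solve_ivp(rhs, (t0, t0 + tau), [y0], dense_output=True, max_step=0.01)
    # Store a callable for this interval, to be used as the delayed term next step.
    segments.append(lambda t, s=sol: s.sol(t)[0])
    t0, y0 = t0 + tau, sol.y[0, -1]
    print(f"y({t0:.1f}) = {y0:.6f}")
```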

Keywords: Parkinson's disease, step method, delay differential equation, two delays

Procedia PDF Downloads 176
6977 Effects of Magnetization Patterns on Characteristics of Permanent Magnet Linear Synchronous Generator for Wave Energy Converter Applications

Authors: Sung-Won Seo, Jang-Young Choi

Abstract:

The rare earth magnets used in synchronous generators offer many advantages, including high efficiency and greatly reduced size and weight. The permanent magnet linear synchronous generator (PMLSG) allows for direct drive without the need for a mechanical device. Therefore, the PMLSG is well suited to translational applications, such as wave energy converters and free-piston energy converters. This manuscript compares the effects of different magnetization patterns on the characteristics of double-sided PMLSGs with slotless stator structures. The Halbach array has a higher air-gap flux density than the vertical array, and its advantages in performance and efficiency are widely known. To verify the advantage of the Halbach array, we apply both a finite element method (FEM) and an analytical method. In general, FEM and analytical methods are used in electromagnetic analysis for determining model characteristics, and FEM is preferable for magnetic field analysis. However, FEM is often slow and inflexible. The analytical method, on the other hand, requires little time and produces an accurate analysis of the magnetic field. The air-gap flux density and the back-EMF can be obtained by FEM, and the results from the analytical method correspond well with the FEM results. The Halbach-array model shows less copper loss than the vertical-array model because of the Halbach array's higher output power density. The vertical-array model has lower core loss than the Halbach-array model because of its lower air-gap flux density; consequently, the current density in the vertical model is higher for identical power output. The completed manuscript will include the magnetic field characteristics and structural features of both models, compare the various results, and present a specific comparative analysis for determining the best model for application in a wave energy conversion system.

Keywords: wave energy converter, permanent magnet linear synchronous generator, finite element method, analytical method

Procedia PDF Downloads 270
6976 Detection of Flood Prone Areas Using Multi Criteria Evaluation, Geographical Information Systems and Fuzzy Logic. The Ardas Basin Case

Authors: Vasileiou Apostolos, Theodosiou Chrysa, Tsitroulis Ioannis, Maris Fotios

Abstract:

The severity of extreme phenomena lies in their ability to cause severe damage in a short amount of time. It has been observed that, of all annual natural disasters, floods affect the greatest number of people and cause the greatest damage. The detection of potential flood-prone areas constitutes one of the fundamental components of the European natural disaster management policy, directly connected to the European Directive 2007/60. The aim of the present paper is to develop a new methodology that combines geographical information, fuzzy logic and multi-criteria evaluation methods so that the most vulnerable areas can be identified. To this end, ten factors related to the geophysical, morphological, climatological/meteorological and hydrological characteristics of the basin were selected. Two models were then created to detect the areas most prone to flooding. The first model defined the weight of each factor using the Analytical Hierarchy Process (AHP), and the final map of possible flood spots was created using GIS and Boolean algebra. The second model made use of the combination of fuzzy logic and GIS, and a corresponding map was created. The application area of both methodologies was the Ardas basin, due to the frequent and significant floods that have taken place there in recent years. The results were then compared with previously observed floods. The analysis of the results shows that both models can detect possible flood spots with great precision. As the fuzzy logic model is less time-consuming, it is considered the preferable model to apply to other areas. These results can contribute to the delineation of high-risk areas and to the creation of successful flood management plans.
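
The ten factors and their pairwise judgements are given in the full paper, but the AHP weighting step used by the first model can be sketched generically: the factor weights are the normalised principal eigenvector of the pairwise comparison matrix, and the consistency ratio checks whether the judgements are acceptably coherent. The 3x3 comparison matrix and factor names below are hypothetical.

```python
# Minimal sketch (hypothetical judgements): AHP weights from a pairwise
# comparison matrix via the principal eigenvector, plus the consistency ratio.
import numpy as np

# Pairwise comparisons for three illustrative factors, e.g. slope, land use,
# distance to river (Saaty 1-9 scale, reciprocal matrix).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency: CI = (lambda_max - n) / (n - 1), CR = CI / RI (RI = 0.58 for n = 3).
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(weights, 3))
print(f"lambda_max = {lambda_max:.3f}, CI = {ci:.3f}, CR = {cr:.3f} (acceptable if < 0.10)")
```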

Keywords: analytical hierarchy process, flood prone areas, fuzzy logic, geographic information system

Procedia PDF Downloads 346
6975 Principal Component Analysis in Drug-Excipient Interactions

Authors: Farzad Khajavi

Abstract:

Studies of the interaction between active pharmaceutical ingredients (APIs) and excipients are very important in the pre-formulation stage of the development of all dosage forms. Analytical techniques such as differential scanning calorimetry (DSC), thermogravimetry (TG) and Fourier transform infrared spectroscopy (FTIR) are commonly used tools for investigating the compatibility or incompatibility of APIs with excipients. Sometimes the interpretation of the data obtained from these techniques is difficult because of severe overlapping of the API spectrum with those of the excipients in their mixtures. Principal component analysis (PCA), as a powerful factor-analytical method, is used in these situations to resolve the data matrices acquired from these analytical techniques. Binary mixtures of the API and the excipients of interest are prepared. The FTIR, DSC or TG signals of the pure API, the pure excipient and their mixtures at different mole ratios constitute the rows of the data matrix. By applying PCA to the data matrix, the number of principal components (PCs) is determined such that they capture the total variance of the data matrix. The PCs (factors obtained from the score matrix) are then plotted in two-dimensional space: if the pure API, its mixtures with a high proportion of API and the 1:1 mixture form one cluster, while the pure excipient and its blends with a high proportion of excipient form the other cluster, compatibility between the API and the excipient is confirmed. Otherwise, incompatibility between the API and the excipient is indicated.
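
The spectra themselves are instrument-specific, but the chemometric step described above, stacking the traces of the pure API, the pure excipient and their mixtures into a matrix and inspecting the PCA scores for clustering, can be sketched with synthetic spectra; the band positions and mixing ratios below are purely illustrative.

```python
# Minimal sketch (synthetic spectra): PCA scores for API-excipient compatibility
# screening. Rows are FTIR-like traces of the pure API, the pure excipient and
# binary mixtures at different mole ratios.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavenumbers = np.linspace(400, 4000, 600)
band = lambda centre, width: np.exp(-((wavenumbers - centre) / width) ** 2)

api = band(1700, 40) + 0.6 * band(1250, 60)          # synthetic API spectrum
excipient = band(1050, 80) + 0.4 * band(2900, 90)    # synthetic excipient spectrum

ratios = [1.0, 0.8, 0.5, 0.2, 0.0]                   # API mole fraction per row
spectra = np.array([r * api + (1 - r) * excipient + rng.normal(0, 0.01, api.size)
                    for r in ratios])

pca = PCA(n_components=2)
scores = pca.fit_transform(spectra)
for r, (pc1, pc2) in zip(ratios, scores):
    print(f"API fraction {r:.1f}: PC1 = {pc1:7.3f}, PC2 = {pc2:7.3f}")
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
# In a compatible system the API-rich rows and the excipient-rich rows separate
# into two clusters along PC1; new bands from an interaction would appear on PC2.
```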

Keywords: API, compatibility, DSC, TG, interactions

Procedia PDF Downloads 94