Search results for: equivalent linear approach

16726 Analysis of Energy Flows as An Approach for The Formation of Monitoring System in the Sustainable Regional Development

Authors: Inese Trusina, Elita Jermolajeva

Abstract:

Global challenges require a transition from the existing linear economic model to a model that will consider nature as a life support system for development on the way to social well-being, in the frame of the ecological economics paradigm. The article presents basic definitions for the development of a formalized description of sustainable development monitoring. It provides examples of calculating the parameters of monitoring for the Baltic Sea region countries and their primary interpretation.

Keywords: sustainability, development, power, ecological economics, regional economic, monitoring

Procedia PDF Downloads 120
16725 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement

Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova

Abstract:

One of the essential steps of avian song processing is signal filtering. Currently, the standard methods of filtering are the Mel Bank Filter or a linear filter distribution. In this article, a new type of bank filter called the Bird-Adapted Filter is introduced, whereby the signal filtering is modifiable, based upon a new mathematical description of audiograms for a particular bird species or order, which was named the Avian Audiogram Unified Equation. According to the method, filters may be deliberately distributed by frequency. The filters are more concentrated in bands of higher sensitivity, where more information is expected to be transmitted, and vice versa. Further, a comparison of various filters for automatic individual recognition of the chiffchaff (Phylloscopus collybita) is demonstrated. The average Equal Error Rate (EER) was 16.23% for the linear bank filter, 18.71% for the Mel Bank Filter, 14.29% for the Bird-Adapted Filter, and 12.95% for the Bird-Adapted Filter with the 1/3 modification. This approach is useful for practical use in automatic systems for avian species and individual identification. Since Bird-Adapted Filter filtration is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
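
As an informal illustration of sensitivity-driven filter placement, the sketch below spaces filter centre frequencies by inverting a cumulative sensitivity curve; the bell-shaped audiogram, the frequency range and the filter count are assumptions for illustration, not the paper's Avian Audiogram Unified Equation.

```python
# Sketch (assumptions): place band-pass filter centres more densely where a
# hearing-sensitivity curve is high, by inverting the cumulative sensitivity.
# The sensitivity() curve below is a placeholder audiogram.
import numpy as np

def sensitivity(f_hz):
    # illustrative bell-shaped audiogram peaking near 4 kHz
    return np.exp(-((np.log(f_hz) - np.log(4000.0)) / 0.6) ** 2)

def filter_centres(n_filters, f_min=500.0, f_max=10000.0):
    f = np.linspace(f_min, f_max, 2048)
    cdf = np.cumsum(sensitivity(f))
    cdf = cdf / cdf[-1]                              # cumulative sensitivity in [0, 1]
    targets = np.linspace(0.0, 1.0, n_filters + 2)[1:-1]
    return np.interp(targets, cdf, f)                # centres dense where sensitive

print(np.round(filter_centres(12)))
```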

Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank

Procedia PDF Downloads 387
16724 Using the Simple Fixed Rate Approach to Solve Economic Lot Scheduling Problem under the Basic Period Approach

Authors: Yu-Jen Chang, Yun Chen, Hei-Lam Wong

Abstract:

The Economic Lot Scheduling Problem (ELSP) is a valuable mathematical model that can support decision-makers in making scheduling decisions. The basic period approach is effective for solving the ELSP. The assumption for applying the basic period approach is that a product must be produced at its maximum production rate. However, a product can lower its production rate to reduce the average total cost when a facility has extra idle time. Past research has discussed how a product adjusts its production rate under the common cycle approach. To the best of our knowledge, no studies have addressed how a product lowers its production rate under the basic period approach; this research is the first paper to discuss this topic. The research develops a simple fixed rate approach that adjusts the production rate of a product under the basic period approach to solve the ELSP. Our numerical example shows that our approach can find a better solution than the traditional basic period approach. Our mathematical model that applies the fixed rate approach under the basic period approach can serve as a reference for other related research.

Keywords: economic lot, basic period, genetic algorithm, fixed rate

Procedia PDF Downloads 563
16723 Risk Factors for Defective Autoparts Products Using Bayesian Method in Poisson Generalized Linear Mixed Model

Authors: Pitsanu Tongkhow, Pichet Jiraprasertwong

Abstract:

This research investigates risk factors for defective products in autoparts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, has a Poisson distribution is adopted. Its performance is compared with the Poisson GLM under a Bayesian framework. The factors considered are production process, machine, and worker. The products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products is Process 1; for the machine factor, the highest risk is Machine 5; and for the worker factor, the highest risk is Worker 6.
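
A minimal sketch of a Bayesian Poisson GLMM of the kind described is given below, assuming the PyMC library and fabricated factor levels (five machines, six workers); it illustrates the model class only, not the authors' implementation or data.

```python
# Sketch (assumptions): Bayesian Poisson GLMM with random intercepts for
# machine and worker; the data are simulated placeholders.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n = 200
machine = rng.integers(0, 5, n)   # 5 machines
worker = rng.integers(0, 6, n)    # 6 workers
defects = rng.poisson(3, n)       # observed defect counts

with pm.Model() as glmm:
    intercept = pm.Normal("intercept", 0.0, 2.0)
    # random effects: one intercept per machine and per worker
    sigma_m = pm.HalfNormal("sigma_machine", 1.0)
    sigma_w = pm.HalfNormal("sigma_worker", 1.0)
    u_machine = pm.Normal("u_machine", 0.0, sigma_m, shape=5)
    u_worker = pm.Normal("u_worker", 0.0, sigma_w, shape=6)
    log_mu = intercept + u_machine[machine] + u_worker[worker]
    pm.Poisson("defects", mu=pm.math.exp(log_mu), observed=defects)
    idata = pm.sample(1000, tune=1000)
```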

Keywords: defective autoparts products, Bayesian framework, generalized linear mixed model (GLMM), risk factors

Procedia PDF Downloads 570
16722 Development of Graph-Theoretic Model for Ranking Top of Rail Lubricants

Authors: Subhash Chandra Sharma, Mohammad Soleimani

Abstract:

Selection of the correct lubricant for a top of rail application is a complex process. In this paper, a method for selecting the proper lubricant for a Top-Of-Rail (TOR) lubrication system based on graph theory and a matrix approach has been developed. Attributes influencing the selection process and their influence on each other have been represented through a digraph and an equivalent matrix. A matrix function, called the permanent function, is derived. By substituting the level of inherent contribution of the influencing parameters and their influence on each other qualitatively, a criterion called the Suitability Index is derived. Based on these indices, lubricants can be ranked for their suitability. The proposed model can be useful for maintenance engineers in selecting the best lubricant for a TOR application. The proposed methodology is illustrated step-by-step through an example.
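
As a rough illustration of the permanent function at the heart of the ranking, the sketch below computes the permanent of a small attribute matrix by brute force; the 3x3 values are hypothetical, not the paper's TOR attribute data.

```python
# Sketch (assumptions): permanent of an attribute matrix, computed by
# brute-force expansion over permutations (fine for small matrices).
from itertools import permutations
import numpy as np

def matrix_permanent(A):
    """Permanent: like the determinant, but with all terms added."""
    n = A.shape[0]
    return sum(
        np.prod([A[i, p[i]] for i in range(n)])
        for p in permutations(range(n))
    )

# diagonal = inherent contribution of each attribute,
# off-diagonal = relative influence of attribute i on attribute j (illustrative)
A = np.array([[3, 2, 4],
              [2, 4, 3],
              [1, 2, 5]], dtype=float)
print("Suitability index (permanent):", matrix_permanent(A))
```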

Keywords: lubricant selection, top of rail lubrication, graph theory, ranking of lubricants

Procedia PDF Downloads 295
16721 An Examination of Changes on Natural Vegetation due to Charcoal Production Using Multi Temporal Land SAT Data

Authors: T. Garba, Y. Y. Babanyara, M. Isah, A. K. Muktari, R. Y. Abdullahi

Abstract:

The increase in demand for fuel wood for heating, cooking and sometimes bakery has continued to exert an appreciable impact on natural vegetation. This study focuses on the use of multi-temporal data from Landsat TM of 1986, Landsat ETM of 1999 and Landsat ETM of 2006 to investigate the changes in natural vegetation resulting from charcoal production activities. The three images were classified into bare soil, built-up areas, cultivated land, natural vegetation, rock outcrop and water bodies. The classified Landsat TM image of 1986 shows the natural vegetation of the study area to be 308,941.48 hectares, equivalent to 50% of the area; it reduced to 278,061.21 hectares (42.92%) in 1999 and depreciated further to 199,647.81 hectares (30.83%) in 2006. Consequently, cultivated land continued increasing from 259,346.80 hectares (42%) in 1986 to 312,966.27 hectares (48.3%) in 1999 and then to 341,719.92 hectares (52.78%) in 2006. These figures show that within the span of 20 years (1986 to 2006) the natural vegetation depreciated by 119,293.81 hectares. This implies that if the menace is not controlled, the natural vegetation might be lost within another twenty years, because forest cleared for charcoal production is normally converted to farmland. The study therefore concluded that there is a need for alternative sources of domestic energy, such as biomass, which can be easily accessible and affordable to people. In addition, the study recommended strong policy enforcement for the protection of forest reserves.

Keywords: charcoal, classification, data, images, land use, natural vegetation

Procedia PDF Downloads 365
16720 Taylor’s Law and Relationship between Life Expectancy at Birth and Variance in Age at Death in Period Life Table

Authors: David A. Swanson, Lucky M. Tedrow

Abstract:

Taylor’s Law is a widely observed empirical pattern that relates variances to means in sets of non-negative measurements via an approximate power function, and it has found application to human mortality. This study adds to this research by showing that Taylor’s Law leads to a model that reasonably describes the relationship between life expectancy at birth (e0, which is also equal to mean age at death in a life table) and variance in age at death in seven World Bank regional life tables measured at two points in time, 1970 and 2000. Using as a benchmark a non-random sample of four Japanese female life tables covering the period from 1950 to 2004, the study finds that the simple linear model provides reasonably accurate estimates of variance in age at death in a life table from e0, where the latter ranges from 60.9 to 85.59 years. Employing 2017 life tables from the Human Mortality Database, the simple linear model is used to provide estimates of variance in age at death for six countries, three of which have high e0 values and three of which have lower e0 values. The paper provides a substantive interpretation of Taylor’s Law relative to e0 and concludes by arguing that reasonably accurate estimates of variance in age at death in a period life table can be calculated using this approach, which can also be used where e0 itself is estimated rather than generated through the construction of a life table, a useful feature of the model.
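
The two functional forms involved can be sketched as follows; a, b, α and β are constants fitted to the life-table data and are not reported in the abstract.

```latex
% Sketch: Taylor's Law (power relation between variance and mean) and the
% simple linear model relating variance in age at death to e_0.
\sigma^{2} = a\,\mu^{b}
\;\;\Longleftrightarrow\;\;
\log \sigma^{2} = \log a + b \log \mu,
\qquad\qquad
\widehat{\sigma}^{2}_{\text{age at death}} = \alpha + \beta\, e_{0}.
```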

Keywords: empirical pattern, mean age at death in a life table, mean age of a stationary population, stationary population

Procedia PDF Downloads 330
16719 Reallocation of Bed Capacity in a Hospital Combining Discrete Event Simulation and Integer Linear Programming

Authors: Muhammed Ordu, Eren Demir, Chris Tofallis

Abstract:

The number of inpatient admissions in the UK has been increasing significantly over the past decade. These increases cause bed occupancy rates to exceed the target level (85%) set by the Department of Health in England. Therefore, hospital service managers are struggling to better manage key resources such as beds. On the other hand, this severe demand pressure might lead to confusion in wards; for example, patients can be admitted to the ward of another inpatient specialty due to lack of resources (i.e., beds). This study aims to develop a simulation-optimization model to reallocate the available number of beds in a mid-sized hospital in the UK. A hospital simulation model was developed to capture the stochastic behaviour of the hospital by taking into account the accident and emergency department, all outpatient and inpatient services, and the interactions between them. Several outputs of the simulation model (e.g., average length of stay and revenue) were generated as inputs to be used in the optimization model. An integer linear programming model was developed under a number of constraints (financial, demand, target level of bed occupancy rate and staffing level) with the aim of maximizing the number of admitted patients. In addition, a sensitivity analysis was carried out by taking into account unexpected increases in inpatient demand over the next 12 months. As a result, the approach proposed in this study optimally reallocates the available number of beds for each inpatient specialty and reveals that 74 beds are idle. In addition, the findings of the study indicate that the hospital wards will be able to cope with at most a 14% demand increase in the projected year. In conclusion, this paper sheds new light on how best to reallocate beds in order to cope with current and future demand for healthcare services.
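
A toy integer linear programme in the spirit of the optimization step is sketched below using the PuLP package; the specialty names, coefficients, bed stock and budget are illustrative assumptions, not values from the study.

```python
# Sketch (assumptions): maximise admitted patients subject to a bed stock and a
# notional financial limit; coefficients would come from the simulation model.
import pulp

specialties = ["medicine", "surgery", "orthopaedics"]
admissions_per_bed = {"medicine": 40, "surgery": 35, "orthopaedics": 30}
cost_per_bed = {"medicine": 90, "surgery": 110, "orthopaedics": 100}
total_beds = 300
budget = 31000

prob = pulp.LpProblem("bed_reallocation", pulp.LpMaximize)
beds = {s: pulp.LpVariable(f"beds_{s}", lowBound=10, cat="Integer") for s in specialties}

# objective: maximise total admitted patients
prob += pulp.lpSum(admissions_per_bed[s] * beds[s] for s in specialties)
# constraints: bed stock and budget
prob += pulp.lpSum(beds.values()) <= total_beds
prob += pulp.lpSum(cost_per_bed[s] * beds[s] for s in specialties) <= budget

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for s in specialties:
    print(s, int(beds[s].value()))
```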

Keywords: bed occupancy rate, bed reallocation, discrete event simulation, inpatient admissions, integer linear programming, projected usage

Procedia PDF Downloads 144
16718 Risk Assessment of Radiation Hazard for a Typical WWER1000: Cancer Risk Analysis during a Hypothetical Accident

Authors: R. Gharari, N. Kojouri, R. Hosseini Aghdam, E. Alibeigi, B. Salmasian

Abstract:

In this research, the WWER1000/V446 (a Russian-type PWR reactor) is chosen as the case study. It is assumed that the radioactive materials released into the environment exceed the allowable limit due to a complete failure of the ventilation system (reactor stack). The HOTSPOT and RASCAL computational codes have been used and coupled with a program developed in MATLAB to evaluate the total effective dose equivalent (TEDE) and the cancer risk according to the BEIR equations for various human organs. In addition, the effects of the containment spray system and climate conditions on the TEDE have been investigated. According to the obtained results, there is an inverse correlation between the received dose and the wind speed; the TEDE for a wind speed of 2 m/s is greater than that for 14 m/s during climate class A (2.168 and 0.444 mSv, respectively). Also, the containment spray system can reduce the amount of fission products and the TEDE. Furthermore, the probability of cancer risk is higher for women than for men, and higher for children than for adults. In addition, a specific emergency zonal planning is proposed. The results are promising, in that the site selected for the WWER1000/V446 is considered safe for the public in this situation.
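
For context, the inverse dependence of dose on wind speed follows from the Gaussian plume relation on which dispersion codes of this kind are based; the textbook form below is given as an illustration, not the exact formulation implemented in HOTSPOT or RASCAL.

```latex
% Sketch: Gaussian plume concentration with ground reflection; Q is the release
% rate, u the wind speed, sigma_y and sigma_z the stability-class dispersion
% coefficients, H the effective release height. Note that chi scales as 1/u.
\chi(x,y,z) \;=\; \frac{Q}{2\pi\, u\, \sigma_y \sigma_z}\,
\exp\!\left(-\frac{y^{2}}{2\sigma_y^{2}}\right)
\left[\exp\!\left(-\frac{(z-H)^{2}}{2\sigma_z^{2}}\right)
    + \exp\!\left(-\frac{(z+H)^{2}}{2\sigma_z^{2}}\right)\right]
```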

Keywords: TEDE, total effective dose equivalent, RASCAL and HOTSPOT codes, BEIR equations, cancer risk

Procedia PDF Downloads 164
16717 3D Linear and Cyclic Homo-Peptide Crystals Forged by Supramolecular Swelling Self-Assembly

Authors: Wenliang Song, Yu Zhang, Hua Jin, Il Kim

Abstract:

The self-assembly of polypeptides (PP) into well-defined structures at different length scales is both biomimetically relevant and fundamentally interesting. Although there are various reports of nanostructures fabricated by the self-assembly of various PPs, directed self-assembly of PP into three-dimensional (3D) hierarchical structures has proven to be difficult, despite their importance for biological applications. Herein, an efficient method has been developed through living polymerization of phenylalanine N-carboxyanhydride (NCA) towards linear and cyclic polyphenylalanine, and the newly invented swelling methodology can form diverse hierarchical polypeptide crystals. The solvent-dependent self-assembly behaviors of these homopolymers were characterized by high-resolution imaging tools such as atomic force microscopy, transmission electron microscopy, and scanning electron microscopy. The linear and cyclic polypeptides formed 3D hierarchical shapes, such as spheres, cubes, stratiform and hexagonal stars, in different solvents. Notably, a crystalline packing model is proposed to explain the formation of the 3D nanostructures based on the various diffraction patterns, with the aim of giving insight into their dissimilar shape evolution during the self-assembly process.

Keywords: self-assembly, polypeptide, bio-polymer, crystalline polymer

Procedia PDF Downloads 240
16716 Fast and Efficient Algorithms for Evaluating Uniform and Nonuniform Lagrange and Newton Curves

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Newton-Lagrange interpolations are widely used in numerical analysis. However, they require quadratic computational time for their construction. In computer aided geometric design (CAGD), there are some polynomial curves, Wang-Ball, DP and Dejdumrong curves, which have linear time complexity algorithms. Thus, the computational time for Newton-Lagrange interpolations can be reduced by applying the algorithms of Wang-Ball, DP and Dejdumrong curves. In order to use the Wang-Ball, DP and Dejdumrong algorithms, it is first necessary to convert Newton-Lagrange polynomials into Wang-Ball, DP or Dejdumrong polynomials. In this work, the algorithms for converting both uniform and non-uniform Newton-Lagrange polynomials into Wang-Ball, DP and Dejdumrong polynomials are investigated. Thus, the computational time for representing Newton-Lagrange polynomials can be reduced to linear complexity. In addition, other uses of CAGD curves to modify Newton-Lagrange curves become available.
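
For reference, the baseline costs mentioned above can be seen in a plain Newton form: a quadratic-time divided-difference construction and a linear-time nested evaluation per point. The sketch below shows only this baseline, not the Wang-Ball, DP or Dejdumrong conversions.

```python
# Sketch: O(n^2) Newton divided-difference construction and O(n) nested
# (Horner-like) evaluation per point.
import numpy as np

def newton_coefficients(x, y):
    x = np.asarray(x, dtype=float)
    coef = np.asarray(y, dtype=float).copy()
    n = len(x)
    for j in range(1, n):                     # O(n^2) divided differences
        coef[j:] = (coef[j:] - coef[j - 1:-1]) / (x[j:] - x[:-j])
    return coef

def newton_eval(coef, x_nodes, t):
    result = coef[-1]
    for k in range(len(coef) - 2, -1, -1):    # nested form, O(n) per point
        result = result * (t - x_nodes[k]) + coef[k]
    return result

x_nodes = [0.0, 1.0, 2.0, 3.0]
y_vals = [1.0, 2.0, 0.0, 5.0]
c = newton_coefficients(x_nodes, y_vals)
print(newton_eval(c, x_nodes, 1.5))
```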

Keywords: Lagrange interpolation, linear complexity, monomial matrix, Newton interpolation

Procedia PDF Downloads 234
16715 Q-Efficient Solutions of Vector Optimization via Algebraic Concepts

Authors: Elham Kiyani

Abstract:

In this paper, we first introduce the concept of Q-efficient solutions in a real linear space not necessarily endowed with a topology, where Q is some nonempty (not necessarily convex) set. We also use the scalarization technique, including the Gerstewitz function generated by a nonconvex set, to characterize these Q-efficient solutions. The algebraic concepts of interior and closure are useful for studying optimization problems without topology. Studying nonconvex vector optimization is valuable since the topological interior is equal to the algebraic interior for a convex cone. So, we use the algebraic concepts of interior and closure to define Q-weak efficient solutions and Q-Henig proper efficient solutions of set-valued optimization problems, where Q is not a convex cone. Optimization problems with set-valued maps have a wide range of applications, so a useful analytical tool for set-valued maps is expected in optimization theory. These kinds of optimization problems are closely related to stochastic programming, control theory, and economic theory. The paper focuses on nonconvex problems; the results are obtained under generalized non-convexity assumptions on the data of the problem. In convex problems, the main mathematical tools are convex separation theorems, alternative theorems, and algebraic counterparts of some usual topological concepts, while in nonconvex problems we need a nonconvex separation function. Thus, we consider the Gerstewitz function generated by a general set in a real linear space and re-examine its properties in this more general setting. A useful approach for solving a vector problem is to reduce it to a scalar problem. In general, scalarization means the replacement of a vector optimization problem by a suitable scalar problem, which tends to be an optimization problem with a real-valued objective function. The Gerstewitz function is well known and widely used in optimization as the basis of scalarization. The essential properties of the Gerstewitz function, which are well known in the topological framework, are studied by using algebraic counterparts rather than the topological concepts of interior and closure. Therefore, the properties of the Gerstewitz function when it takes values just in a real linear space are studied, and we use it to characterize Q-efficient solutions of vector problems whose image space is not endowed with any particular topology. Therefore, we deal with a constrained vector optimization problem in a real linear space without assuming any topology, and Q-weak efficient and Q-proper efficient solutions in the sense of Henig are defined. Moreover, by means of the Gerstewitz function, we provide some necessary and sufficient optimality conditions for set-valued vector optimization problems.
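
For orientation, one commonly used form of the Gerstewitz (Tammer) scalarization function generated by a set Q and a direction k is recalled below; the paper's purely algebraic variant, built on vector closure and algebraic interior, may differ in detail.

```latex
% Sketch: a common form of the Gerstewitz scalarization function generated by a
% nonempty set Q and a fixed direction k (often taken in the core of Q).
\varphi_{Q,k}(y) \;=\; \inf\{\, t \in \mathbb{R} \;:\; y \in t\,k - Q \,\}
```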

Keywords: algebraic interior, Gerstewitz function, vector closure, vector optimization

Procedia PDF Downloads 216
16714 Approach on Conceptual Design and Dimensional Synthesis of the Linear Delta Robot for Additive Manufacturing

Authors: Efrain Rodriguez, Cristhian Riano, Alberto Alvares

Abstract:

In recent years, robot manipulators with parallel architectures have been used in additive manufacturing processes (3D printing). These robots have advantages such as speed and lightness that make them suitable for improving the efficiency and productivity of these processes. Consequently, interest in the development of parallel robots for additive manufacturing applications has increased. This article deals with the conceptual design and dimensional synthesis of the linear delta robot for additive manufacturing. Firstly, a methodology based on structured processes for the development of products through the phases of informational design, conceptual design and detailed design is adopted: a) In the informational design phase, the Mudge diagram and the QFD matrix are used to elicit a set of technical requirements and to define the form, functions and features of the robot. b) In the conceptual design phase, functional modeling of the system through an IDEF0 diagram is performed, and the solution principles for the requirements are formulated using a morphological matrix. This phase includes the description of the mechanical, electro-electronic and computational subsystems that constitute the general architecture of the robot. c) In the detailed design phase, a digital model of the robot is drawn in CAD software. A list of commercial and manufactured parts is detailed. Tolerances and adjustments are defined for some parts of the robot structure. The necessary manufacturing processes and tools are also listed, including milling, turning and 3D printing. Secondly, a dimensional synthesis method applied to the design of the linear delta robot is presented. One of the most important key factors in the design of a parallel robot is the useful workspace, which strongly depends on the joint space, the dimensions of the mechanism bodies and the possible interferences between these bodies. The objective function is based on the verification of the kinematic model for a prescribed cylindrical workspace, considering geometric constraints that may lead to singularities of the mechanism. The aim is to determine the minimum dimensional parameters of the mechanism bodies for the proposed workspace. A method based on genetic algorithms was used to solve this problem. The method uses a cloud of points with the cylindrical shape of the workspace and checks the kinematic model for each of the points within the cloud. The evolution of the population (point cloud) provides the optimal parameters for the design of the delta robot. The development process of the linear delta robot with optimal dimensions for additive manufacturing is presented. The dimensional synthesis enabled the mechanism of the delta robot to be designed as a function of the prescribed workspace. Finally, the implementation of the robotic platform developed, based on a linear delta robot, in an additive manufacturing application using the Fused Deposition Modeling (FDM) technique is presented.

Keywords: additive manufacturing, delta parallel robot, dimensional synthesis, genetic algorithms

Procedia PDF Downloads 190
16713 Hierarchical Piecewise Linear Representation of Time Series Data

Authors: Vineetha Bettaiah, Heggere S. Ranganath

Abstract:

This paper presents a Hierarchical Piecewise Linear Approximation (HPLA) for the representation of time series data, in which the time series is treated as a curve in the time-amplitude image space. The curve is partitioned into segments by choosing perceptually important points as break points. Each segment between adjacent break points is recursively partitioned into two segments at the best point or midpoint until the error between the approximating line and the original curve becomes less than a pre-specified threshold. The HPLA representation achieves dimensionality reduction while preserving prominent local features and the general shape of the time series. The representation permits coarse-fine processing at different levels of detail, allows flexible definition of similarity based on mathematical measures or general time series shape, and supports time series data mining operations including query by content, clustering and classification based on whole or subsequence similarity.
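
A minimal sketch of the recursive split step is given below; it always splits at the sample of maximum deviation from the chord (the "best point") against a plain error threshold, and omits the perceptually important point selection and multi-level bookkeeping of the full HPLA.

```python
# Sketch: recursive piecewise-linear splitting of a time series at the point of
# maximum deviation from the chord, until the deviation falls below a threshold.
import numpy as np

def hpla_breakpoints(t, y, tol, lo=0, hi=None, out=None):
    if hi is None:
        hi = len(t) - 1
    if out is None:
        out = {lo, hi}
    # deviation of every interior sample from the straight line (chord)
    # joining the segment's end points
    slope = (y[hi] - y[lo]) / (t[hi] - t[lo])
    approx = y[lo] + slope * (t[lo + 1:hi] - t[lo])
    err = np.abs(y[lo + 1:hi] - approx)
    if err.size and err.max() > tol:
        split = lo + 1 + int(err.argmax())     # best point = max-error sample
        out.add(split)
        hpla_breakpoints(t, y, tol, lo, split, out)
        hpla_breakpoints(t, y, tol, split, hi, out)
    return sorted(out)

t = np.linspace(0, 10, 200)
y = np.sin(t) + 0.05 * np.random.default_rng(1).standard_normal(200)
print(hpla_breakpoints(t, y, tol=0.15))
```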

Keywords: data mining, dimensionality reduction, piecewise linear representation, time series representation

Procedia PDF Downloads 275
16712 Phytochemical Study and Biological Activity of Sage (Salvia officinalis L.)

Authors: Mekhaldi Abdelkader, Bouzned Ahcen, Djibaoui Rachid, Hamoum Hakim

Abstract:

This study presents an attempt to evaluate the antioxidant and antimicrobial activity of the methanolic extract and essential oil prepared from the leaves of sage (Salvia officinalis L.). The content of polyphenols in the methanolic extract of Salvia officinalis leaves was determined spectrophotometrically, calculated as gallic acid and catechin equivalents. Antioxidant activity was evaluated by free radical scavenging activity using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) assay. Both the plant essential oil and the methanol extract were subjected to screening for the evaluation of their antioxidant activities using the DPPH test. While the plant essential oil showed only weak antioxidant activity, its methanol extract was considerably active in the DPPH test (IC50 = 37.29 µg/ml). An appreciable total phenolic content (31.25 mg/g) was also detected for the plant methanol extract, as gallic acid equivalent, in the Folin–Ciocalteu test. The plant was also screened for its antimicrobial activity, and good to moderate inhibition was recorded for its essential oil and methanol extract against most of the tested microorganisms. The present investigation revealed that this plant is a rich source of antioxidant compounds. It is for this reason that sage has found increasing application in food formulations.

Keywords: antibacterial activity, antioxidant activity, flavonoid, polyphenol, salvia officinalis

Procedia PDF Downloads 409
16711 Competence-Based Human Resources Selection and Training: Making Decisions

Authors: O. Starineca, I. Voronchuk

Abstract:

Human Resources (HR) selection and training have various implementation possibilities depending on an organization's abilities and peculiarities. We propose to base HR selection and training decisions on a competence-based approach. HR selection and training of employees are topical as there is room for improvement in this field; therefore, the aim of the research is to propose rational decision-making approaches for an organization's HR selection and training choices. Our proposals are based on the training development and competence-based selection approaches created within previous research, i.e., the Analytic Hierarchy Process (AHP) and Linear Programming. A literature review on non-formal education, competence-based selection, and AHP forms our theoretical background. Some educational service providers in Latvia offer employee training, e.g., motivation, computer skills, accounting, law, ethics, stress management, etc., that is topical for Public Administration. A competence-based approach is a rational basis for decision-making in both HR selection and HR training.

Keywords: competence-based selection, human resource, training, decision-making

Procedia PDF Downloads 337
16710 Study of Landslide Behavior with Topographic Monitoring and Numerical Modeling

Authors: Zerarka Hizia, Akchiche Mustapha, Prunier Florent

Abstract:

The Ain El Hammam (AEH) landslide is an old slip dating back to 1969; it was reactivated after an intense rainfall period in 2008, presents a complex shape and affects broad areas. The schist of AEH is more or less altered; the alteration is facilitated by the fracturing of the rock in its upper part, the presence of flowing water, as well as physical and chemical mechanisms of disaggregation in the joints of the altered schist. The factors behind these instabilities are mostly related to the geological formation, the hydro-climatic conditions and the topography of the region. The city of AEH is located on the top of a steep slope, 50 km from the city of Tizi Ouzou (Algeria). Topographic monitoring of the unstable slope at AEH allows analysis of the structure, the different deformation mechanisms, the gradual change in geometry and the direction of the slip. It also allows us to delimit the area affected by the movement. This work aims to study the behavior of the AEH landslide with topographic monitoring and to validate the results with numerical modeling of the slip site, since the hydraulic factors are identified as the most important factors for the reactivation of this landslide. With the help of the numerical codes PLAXIS 2D and PlaxFlow, precipitation and the steady-state flow are modeled. To identify the mechanism of deformation and to predict the spread of the AEH landslide numerically, we used the equivalent deviatoric strain, and these results were visualized with MATLAB software.

Keywords: equivalent deviatoric strain, landslide, numerical modeling, topographic monitoring

Procedia PDF Downloads 292
16709 Unveiling Special Policy Regime, Judgment, and Taylor Rules in Tunisia

Authors: Yosra Baaziz, Moez Labidi

Abstract:

Given limited research on monetary policy rules in revolutionary countries, this paper challenges the suitability of the Taylor rule in characterizing the monetary policy behavior of the Tunisian Central Bank (BCT), especially in turbulent times. More specifically, we investigate the possibility that the Taylor rule should be formulated as a threshold process and examine the validity of such a nonlinear Taylor rule as a robust rule for conducting monetary policy in Tunisia. Using quarterly data from 1998:Q4 to 2013:Q4 to analyze the movement of the BCT's nominal short-term interest rate, we find that the nonlinear Taylor rule improves its performance with the advent of special events, thus providing a better description of the Tunisian interest rate setting. In particular, our results show that the adoption of an appropriate nonlinear approach leads to a reduction in the errors of 150 basis points in 1999 and 2009, and 60 basis points in 2011, relative to the linear approach.
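
A generic two-regime threshold specification of the kind examined here can be written as below; the regressors, the threshold variable q with delay d, and the threshold value γ are placeholders rather than the exact BCT specification.

```latex
% Sketch: a two-regime threshold Taylor rule; i_t is the policy rate, pi_t
% inflation, y_t the output gap, q_{t-d} the threshold variable.
i_t \;=\;
\begin{cases}
  \alpha_1 + \beta_1 \pi_t + \delta_1 y_t + \rho_1 i_{t-1} + \varepsilon_t,
     & q_{t-d} \le \gamma,\\[4pt]
  \alpha_2 + \beta_2 \pi_t + \delta_2 y_t + \rho_2 i_{t-1} + \varepsilon_t,
     & q_{t-d} > \gamma.
\end{cases}
```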

Keywords: policy rule, central bank, exchange rate, taylor rule, nonlinearity

Procedia PDF Downloads 296
16708 Supervised-Component-Based Generalised Linear Regression with Multiple Explanatory Blocks: THEME-SCGLR

Authors: Bry X., Trottier C., Mortier F., Cornu G., Verron T.

Abstract:

We address component-based regularization of a Multivariate Generalized Linear Model (MGLM). A set of random responses Y is assumed to depend, through a GLM, on a set X of explanatory variables, as well as on a set T of additional covariates. X is partitioned into R conceptually homogeneous blocks X1, ..., XR, viewed as explanatory themes. Variables in each Xr are assumed to be many and redundant. Thus, Generalised Linear Regression (GLR) demands regularization with respect to each Xr. By contrast, variables in T are assumed to be selected so as to demand no regularization. Regularization is performed by searching each Xr for an appropriate number of orthogonal components that both contribute to modeling Y and capture relevant structural information in Xr. We propose a very general criterion to measure the structural relevance (SR) of a component in a block, and show how to take SR into account within a Fisher-scoring-type algorithm in order to estimate the model. We show how to deal with mixed-type explanatory variables. The method, named THEME-SCGLR, is tested on simulated data.

Keywords: Component-Model, Fisher Scoring Algorithm, GLM, PLS Regression, SCGLR, SEER, THEME

Procedia PDF Downloads 396
16707 Computationally Efficient Electrochemical-Thermal Li-Ion Cell Model for Battery Management System

Authors: Sangwoo Han, Saeed Khaleghi Rahimian, Ying Liu

Abstract:

Vehicle electrification is gaining momentum, and many car manufacturers promise to deliver more electric vehicle (EV) models to consumers in the coming years. In controlling the battery pack, the battery management system (BMS) must maintain optimal battery performance while ensuring the safety of the pack. Tasks related to battery performance include determining state-of-charge (SOC), state-of-power (SOP), state-of-health (SOH), cell balancing, and battery charging. Safety-related functions include making sure cells operate within specified static and dynamic voltage windows and temperature ranges, derating power, detecting faulty cells, and warning the user if necessary. The BMS often utilizes an RC circuit model to model a Li-ion cell because of its robustness and low computation cost, among other benefits. Because an equivalent circuit model such as the RC model is not a physics-based model, it can never be a prognostic model able to predict battery state-of-health and avoid a safety risk before it occurs. A physics-based Li-ion cell model, on the other hand, is more capable at the expense of computation cost. To avoid the high computation cost associated with a full-order model, many researchers have demonstrated the use of a single particle model (SPM) for BMS applications. One drawback associated with the single particle modeling approach is that it forces the use of the average current density in the calculation. The SPM would be appropriate for simulating drive cycles where there is insufficient time to develop a significant current distribution within an electrode. However, under a continuous or high-pulse electrical load, the model may fail to predict cell voltage or Li⁺ plating potential. To overcome this issue, a multi-particle reduced-order model is proposed here. The use of multiple particles combined with either linear or nonlinear charge-transfer reaction kinetics makes it possible to capture the current density distribution within an electrode under any type of electrical load. To keep the computational complexity comparable to that of an SPM, the governing equations are solved sequentially to minimize iterative solving processes. Furthermore, the model is validated against a full-order model implemented in COMSOL Multiphysics.

Keywords: battery management system, physics-based li-ion cell model, reduced-order model, single-particle and multi-particle model

Procedia PDF Downloads 111
16706 A Large Dataset Imputation Approach Applied to Country Conflict Prediction Data

Authors: Benjamin Leiby, Darryl Ahner

Abstract:

This study demonstrates an alternative stochastic imputation approach for large datasets when preferred commercial packages struggle to iterate due to numerical problems. A large country conflict dataset motivates the search for a way to impute missing values well above a common threshold of 20% missingness. The methodology capitalizes on correlation while using model residuals to provide the uncertainty in estimating unknown values. Examination of the methodology provides insight toward choosing linear or nonlinear modeling terms. Static tolerances common in most packages are replaced with tailorable tolerances that exploit residuals to fit each data element. The methodology evaluation includes observing computation time, model fit, and comparing known values to replaced values created through imputation. Overall, the country conflict dataset illustrates promise with modeling first-order interactions while presenting a need for further refinement that mimics predictive mean matching.
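
A minimal sketch of the core idea, regression prediction plus noise resampled from the model residuals, is given below using scikit-learn; the tailorable tolerances and nonlinear terms of the full methodology are not reproduced.

```python
# Sketch: stochastic regression imputation, i.e. a regression prediction plus
# noise drawn from the model residuals.
import numpy as np
from sklearn.linear_model import LinearRegression

def stochastic_impute(X, y_observed):
    """Impute missing y values from X, adding residual-based noise."""
    rng = np.random.default_rng(42)
    mask = ~np.isnan(y_observed)
    model = LinearRegression().fit(X[mask], y_observed[mask])
    residuals = y_observed[mask] - model.predict(X[mask])
    y_filled = y_observed.copy()
    missing = ~mask
    # prediction + resampled residual keeps the imputations suitably noisy
    y_filled[missing] = (model.predict(X[missing])
                         + rng.choice(residuals, size=missing.sum()))
    return y_filled

X = np.random.default_rng(0).normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.3 * np.random.default_rng(1).normal(size=100)
y[::7] = np.nan                                   # knock out some values
print(np.isnan(stochastic_impute(X, y)).any())    # False: all values filled
```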

Keywords: correlation, country conflict, imputation, stochastic regression

Procedia PDF Downloads 120
16705 Non-Linear Control in Positioning of PMLSM by Estimates of the Load Force by MRAS Method

Authors: Maamar Yahiaoui, Abdelrrahmene Kechich, Ismail Elkhallile Bousserhene

Abstract:

This article presents a simulation study, by means of MATLAB/Simulink software, of nonlinear position control of a permanent magnet linear synchronous machine with estimation of the load force. With the estimator, the control is effective in all tests: the desired trajectory is followed even when the load disturbance starts. The simulation results prove clearly that the proposed control can track the positioning reference, with the estimated value of the load force equal to the actual value.

Keywords: mathematical model, Matlab, PMLSM, control, linearization, estimator, force, load, current

Procedia PDF Downloads 608
16704 Effect of Linear Thermal Gradient on Steady-State Creep Behavior of Isotropic Rotating Disc

Authors: Minto Rattan, Tania Bose, Neeraj Chamoli

Abstract:

The present paper investigates the effect of a linear thermal gradient on the steady-state creep behavior of a rotating isotropic disc using threshold-stress-based Sherby's creep law. Composite discs made of an aluminum matrix reinforced with silicon carbide particulate have been taken for analysis. The stress and strain rate distributions have been calculated for discs rotating under a linear thermal gradient using von Mises' yield criterion. The material parameters have been estimated by regression fit of the available experimental data. The results are displayed graphically in a designer-friendly format for the above temperature profile and compared with a disc operating under a uniform temperature profile. It is observed that the radial and tangential stresses show minor variation, whereas the strain rates vary significantly in the presence of a thermal gradient as compared to a disc at uniform temperature.

Keywords: creep, isotropic, steady-state, thermal gradient

Procedia PDF Downloads 269
16703 Quantitative Structure–Activity Relationship Analysis of Some Benzimidazole Derivatives by Linear Multivariate Method

Authors: Strahinja Z. Kovačević, Lidija R. Jevrić, Sanja O. Podunavac Kuzmanović

Abstract:

The relationship between the antibacterial activity of eighteen different substituted benzimidazole derivatives and their molecular characteristics was studied using a chemometric QSAR (Quantitative Structure–Activity Relationships) approach. The QSAR analysis was carried out on inhibitory activity towards Staphylococcus aureus, using molecular descriptors as well as the minimal inhibitory concentration (MIC). Molecular descriptors were calculated from the optimized structures. Principal component analysis (PCA) followed by hierarchical cluster analysis (HCA) and multiple linear regression (MLR) was performed in order to select the molecular descriptors that best describe the antibacterial behavior of the compounds investigated, and to determine the similarities between molecules. The HCA grouped the molecules into separate clusters with similar inhibitory activity. PCA showed a classification of molecules very similar to the HCA, and displayed which descriptors contribute to that classification. MLR equations that represent MIC as a function of the in silico molecular descriptors were established. The statistical significance of the estimated models was confirmed by standard statistical measures and cross-validation parameters (SD = 0.0816, F = 46.27, R = 0.9791, R2CV = 0.8266, R2adj = 0.9379, PRESS = 0.1116). These parameters indicate the possibility of applying the established chemometric models in predicting the antibacterial behaviour of the studied derivatives and structurally very similar compounds.
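
A compact sketch of such a descriptor-to-MIC workflow (standardization, PCA for grouping, MLR with leave-one-out cross-validation and PRESS) is shown below on random placeholder data; the real descriptors, compounds and reported statistics come from the study itself.

```python
# Sketch (assumptions): chemometric pipeline on placeholder data.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(7)
X = rng.normal(size=(18, 6))           # 18 derivatives x 6 in-silico descriptors
y = rng.normal(loc=1.5, size=18)       # log(1/MIC), placeholder values

X_std = StandardScaler().fit_transform(X)
scores = PCA(n_components=2).fit_transform(X_std)   # PCA scores for grouping/plots

mlr = LinearRegression().fit(X_std, y)
y_cv = cross_val_predict(mlr, X_std, y, cv=LeaveOneOut())
press = np.sum((y - y_cv) ** 2)        # PRESS, as reported in the abstract
print("R2 (fit):", mlr.score(X_std, y), "PRESS (LOO):", press)
```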

Keywords: antibacterial, benzimidazole, molecular descriptors, QSAR

Procedia PDF Downloads 364
16702 Why Do We Need Hierarchical Linear Models?

Authors: Mustafa Aydın, Ali Murat Sunbul

Abstract:

Hierarchical or nested data structures are common in many research areas. Especially in the field of education, if we examine most studies, we can see nested structures. Students in classes, classes in schools, schools in cities and cities in regions are typical nested structures. In a hierarchical structure, students in the same class share the same physical conditions and similar experiences and learn from the same teachers, so they demonstrate behaviors that are more similar to one another than to those of students in other classes.
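
The standard two-level random-coefficient formulation implied by this example, with students i nested in classes j, can be written as follows.

```latex
% Sketch: two-level random-coefficient model; i indexes students, j classes.
\begin{aligned}
  \text{Level 1:}\quad & y_{ij} = \beta_{0j} + \beta_{1j} x_{ij} + e_{ij},
      && e_{ij} \sim N(0, \sigma^{2}),\\
  \text{Level 2:}\quad & \beta_{0j} = \gamma_{00} + u_{0j}, \qquad
      \beta_{1j} = \gamma_{10} + u_{1j},
      && (u_{0j}, u_{1j})^{\top} \sim N(\mathbf{0}, \mathbf{T}).
\end{aligned}
```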

Keywords: hierarchical linear modeling, nested data, hierarchical structure, data structure

Procedia PDF Downloads 652
16701 Rehabilitation Team after Brain Damages as Complex System Integrating Consciousness

Authors: Olga Maksakova

Abstract:

Work with unconscious patients after acute brain damage requires, besides the special knowledge and practical skills of all the participants, a very specific organization. A lot has been said about the team approach in neurorehabilitation, usually for the outpatient mode. Rehabilitologists deal with fixed patient problems or deficits (motion, speech, cognitive or emotional disorders). Team-building here means the superficial paradigm of management psychology, and the linear mode of teamwork fits the causal relationships there. Cases with deeply altered states of consciousness (vegetative states, coma, and confusion) require a non-linear mode of teamwork: recovery of consciousness might not be the goal due to the uncertainty of the phenomenon. The rehabilitation team, as a semi-open complex system, includes the patient as a part. The patient's response pattern is formed not only by brain deficits but also by the questions-stimuli, the context, and the inquiring person. Teamwork is a source of phenomenological knowledge of the patient's processes, as the third-person approach is replaced with second- and then first-person approaches. Here is a chance for real-time change. The patient's contacts with his own body and outward things create a basis for the restoration of consciousness. The most important condition is systematic feedback to any minimal movement or vegetative signal of the patient. Up to now, recovery work with the most severe contingent has been carried out in the mode of passive physical interventions, while an effective rehabilitation team should include specially trained psychologists and psychotherapists. It is they who are able to create a network of feedback with the patient and the inter-professional feedback building up the team. The characteristics of the 'Team-Patient' system (TPS) are energy, entropy, and complexity. Impairment of consciousness, as the absence of linear contact, appears together with a loss of essential functions (low energy), vegetative-visceral fits (excessive energy and low order), motor agitation (excessive energy and excessive order), etc. The techniques of teamwork differ in these cases so as to optimize the condition of the system. Directed regulation of the system's complexity is one of the recovery tools. Different signs of awareness appear as a result of the system's self-organization. Joint meetings are an important part of teamwork. Regular or event-related discussions form the language of inter-professional communication, as well as the patient's shared mental model. Analysis of the complex communication process in the TPS may be useful for the creation of a general theory of consciousness.

Keywords: rehabilitation team, urgent rehabilitation, severe brain damage, consciousness disorders, complex system theory

Procedia PDF Downloads 146
16700 Prevalence of Cerebral Microbleeds in Apparently Healthy, Elderly Population: A Meta-Analysis

Authors: Vidishaa Jali, Amit Sinha, Kameshwar Prasad

Abstract:

Background and Objective: Cerebral microbleeds are frequently found in healthy elderly individuals. We performed a meta-analysis to determine the prevalence of cerebral microbleeds in the apparently healthy, elderly population and to determine the effect of age, smoking and hypertension on the occurrence of cerebral microbleeds. Methods: Relevant literature was searched using electronic databases such as MEDLINE, EMBASE, PubMed, the Cochrane database and Google Scholar to identify studies on the prevalence of cerebral microbleeds in the general elderly population up to March 2016. STATA version 13 software was used for analysis. A fixed-effect model was used if heterogeneity was less than 50%; otherwise, a random-effects model was used. Meta-regression analysis was performed to check for any effect of important variables such as age, smoking and hypertension. Selection Criteria: We included cross-sectional studies performed in apparently healthy elderly populations aged more than 50 years. Results: The pooled proportion of cerebral microbleeds in the healthy population is 12% (95% CI, 0.11 to 0.13). No significant effect of age was found on the prevalence of cerebral microbleeds (p = 0.99). A linear relationship between an increase in hypertension and the prevalence of cerebral microbleeds was found; however, this relationship was not statistically significant (p = 0.16). Similarly, a linear relationship between an increase in smoking and the prevalence of cerebral microbleeds was found; however, this relationship was also not statistically significant (p = 0.21). Conclusion: The presence of cerebral microbleeds is evident in the apparently healthy, elderly population, in more than 10% of individuals.

Keywords: apparently healthy, elderly, prevalence, cerebral microbleeds

Procedia PDF Downloads 296
16699 Reducing Uncertainty of Monte Carlo Estimated Fatigue Damage in Offshore Wind Turbines Using FORM

Authors: Jan-Tore H. Horn, Jørgen Juncher Jensen

Abstract:

Uncertainties related to fatigue damage estimation of non-linear systems are highly dependent on the tail behaviour and extreme values of the stress range distribution. By using a combination of the First Order Reliability Method (FORM) and Monte Carlo simulations (MCS), the accuracy of the fatigue estimates may be improved for the same computational effort. The method is applied to a bottom-fixed, monopile-supported large offshore wind turbine, which is a non-linear and dynamically sensitive system. Different techniques for fitting curves to the fatigue damage distribution have been used depending on the sea-state-dependent response characteristics, and the effect of a bi-linear S-N curve is discussed. Finally, analyses are performed for several environmental conditions to investigate the long-term applicability of this multistep method. Wave loads are calculated using state-of-the-art theory, while wind loads are applied with a simplified model based on rotor thrust coefficients.
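
For context, the fatigue damage in question is typically accumulated with the Palmgren-Miner rule over stress ranges S using a (possibly bi-linear) S-N curve, as sketched below; the exponents, intercepts and knee stress are design-code quantities not given in the abstract.

```latex
% Sketch: Miner's sum over stress-range bins with a bi-linear S-N curve
% (knee at S_q); n_i cycles experienced at range S_i, N(S_i) cycles to failure.
D \;=\; \sum_i \frac{n_i}{N(S_i)},
\qquad
N(S) \;=\;
\begin{cases}
  K_1\, S^{-m_1}, & S > S_q,\\[2pt]
  K_2\, S^{-m_2}, & S \le S_q.
\end{cases}
```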

Keywords: fatigue damage, FORM, monopile, Monte Carlo, simulation, wind turbine

Procedia PDF Downloads 260
16698 Symbolic Partial Differential Equations Analysis Using Mathematica

Authors: Davit Shahnazaryan, Diogo Gomes, Mher Safaryan

Abstract:

Many symbolic computations and manipulations required in the analysis of partial differential equations (PDEs) or systems of PDEs are tedious and error-prone. These computations arise when determining conservation laws, entropies or integral identities, which are essential tools for the study of PDEs. Here, we discuss a new Mathematica package for the symbolic analysis of PDEs that automates multiple tasks, saving time and effort. Methodologies: During the research, we have used concepts of linear algebra and partial differential equations, and we have been working on creating algorithms based on theoretical mathematics to obtain the results mentioned below. Major Findings: Our package provides the following functionalities: finding the symmetry group of different PDE systems; generating polynomials invariant with respect to different symmetry groups; simplifying integral quantities by integration by parts and null-Lagrangian cleaning; computing general forms of expressions by integration by parts; finding equivalent forms of an integral expression that are simpler or more symmetric; and determining necessary and sufficient conditions on the coefficients for the positivity of a given symbolic expression. Conclusion: Using this package, we can simplify integral identities and find conserved and dissipated quantities of time-dependent PDEs or systems of PDEs. Some examples in the theory of mean-field games and semiconductor equations are discussed.

Keywords: partial differential equations, symbolic computation, conserved and dissipated quantities, mathematica

Procedia PDF Downloads 163
16697 Human 3D Metastatic Melanoma Models for in vitro Evaluation of Targeted Therapy Efficiency

Authors: Delphine Morales, Florian Lombart, Agathe Truchot, Pauline Maire, Pascale Vigneron, Antoine Galmiche, Catherine Lok, Muriel Vayssade

Abstract:

Targeted therapy molecules are used as a first-line treatment for metastatic melanoma with a B-Raf mutation. Nevertheless, these molecules can cause side effects to patients and are effective in 50 to 60% of them. Indeed, melanoma cell sensitivity to targeted therapy molecules depends on the tumor microenvironment (cell-cell and cell-extracellular matrix interactions). To better unravel the factors modulating cell sensitivity to a B-Raf inhibitor, we have developed and compared several melanoma models: from metastatic melanoma cells cultured as a monolayer (2D) to a co-culture in a 3D dermal equivalent. Cell response was studied in different melanoma cell lines, such as SK-MEL-28 (mutant B-Raf (V600E), sensitive to Vemurafenib), SK-MEL-3 (mutant B-Raf (V600E), resistant to Vemurafenib), and a primary culture of dermal human fibroblasts (HDFn). Assays were initially performed in a monolayer cell culture (2D), then a second time on a 3D dermal equivalent (dermal human fibroblasts embedded in a collagen gel). All cell lines were treated with Vemurafenib (a B-Raf inhibitor) for 48 hours at various concentrations. Cell sensitivity to treatment was assessed under various aspects: cell proliferation (cell counting, EdU incorporation, MTS assay), MAPK signaling pathway analysis (Western blotting), apoptosis (TUNEL), cytokine release (IL-6, IL-1α, HGF, TGF-β, TNF-α) upon Vemurafenib treatment (ELISA), and histology for the 3D models. In the 2D configuration, the inhibitory effect of Vemurafenib on cell proliferation was confirmed on SK-MEL-28 cells (IC50 = 0.5 µM), and not on the SK-MEL-3 cell line. No apoptotic signal was detected in SK-MEL-28-treated cells, suggesting a cytostatic effect of Vemurafenib rather than a cytotoxic one. The inhibition of SK-MEL-28 cell proliferation upon treatment was correlated with a strong decrease in the expression of phosphorylated proteins involved in the MAPK pathway (ERK, MEK, and AKT/PKB). Vemurafenib (from 5 µM to 10 µM) also slowed down HDFn proliferation, whatever the cell culture configuration (monolayer or 3D dermal equivalent). SK-MEL-28 cells cultured in the dermal equivalent were still sensitive to high Vemurafenib concentrations. To better characterize the impact of each cell population (melanoma cells, dermal fibroblasts) on Vemurafenib efficacy, cytokine release is being studied in the 2D and 3D models. We have successfully developed and validated a relevant 3D model mimicking cutaneous metastatic melanoma and the tumor microenvironment. This 3D melanoma model will be made more complex by adding a third cell population, keratinocytes, allowing us to characterize the influence of the epidermis on melanoma cell sensitivity to Vemurafenib. In the long run, the establishment of more relevant 3D melanoma models with patients' cells might be useful for the development of personalized therapy. The authors would like to thank the Picardie region and the European Regional Development Fund (ERDF) 2014/2020 for the funding of this work, and the Oise committee of "La ligue contre le cancer".

Keywords: 3D human skin model, melanoma, tissue engineering, vemurafenib efficiency

Procedia PDF Downloads 305