Search results for: Sequential Linear Programming (SLP)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4361

4061 Synthesis and Characterization of Cyclic PNC-28 Peptide, Residues 17–26 (ETFSDLWKLL), A Binding Domain of p53

Authors: Deepshikha Verma, V. N. Rajasekharan Pillai

Abstract:

The present study reports the synthesis of cyclic PNC-28 peptide by the solid-phase peptide synthesis method. In the first step, the linear PNC-28 peptide is synthesized, and in the second step, it is cyclized (N-to-C or head-to-tail cyclization). The molecular formula of the cyclic PNC-28 peptide is C64H88N12O16 and its m/z is ≈1233.64. Elemental analysis of cyclic PNC-28 gives C, 59.99; H, 6.92; N, 13.12; O, 19.98. Characterization by LC-MS, CD, FT-IR, and 1H NMR confirms the successful synthesis and cyclization of the linear PNC-28 peptide.

Keywords: CD, FTIR, 1HNMR, cyclic peptide

Procedia PDF Downloads 97
4060 Seismic Performance Point of RC Frame Buildings Using ATC-40, FEMA 356 and FEMA 440 Guidelines

Authors: Gram Y. Rivas Sanchez

Abstract:

Seismic design codes worldwide allow structures to be analyzed assuming linear-elastic behavior; under earthquakes, however, structures exhibit non-linear behavior that induces damage in their elements. For this reason, non-linear methods are needed to analyze these structures: non-linear dynamic methods provide the most reliable results but are computationally expensive, whereas non-linear static methods do not have this disadvantage and are being used more and more. In the present work, non-linear static (pushover) analyses of RC frame buildings of three, five, and seven stories are carried out using concentrated-plasticity models with plastic hinges, and the seismic performance points are determined using the ATC-40, FEMA 356, and FEMA 440 guidelines. FEMA 440 yields the largest inelastic displacements and base shears, leading to more conservative designs.
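Since FEMA 356 determines the performance (target) displacement with the coefficient method, a minimal sketch of that calculation is given below. It assumes the standard form δt = C0·C1·C2·C3·Sa·(Te²/4π²)·g; the coefficient values used here are placeholders, not values from the buildings studied.

```python
import math

def fema356_target_displacement(Te, Sa, C0=1.3, C1=1.0, C2=1.0, C3=1.0, g=9.81):
    """Coefficient-method target displacement (FEMA 356 form, illustrative coefficients).

    Te : effective fundamental period of the building [s]
    Sa : spectral acceleration at Te, as a fraction of g
    C0..C3 : modification coefficients (MDOF-to-SDOF, inelastic displacement,
             hysteretic degradation, P-Delta); the defaults here are placeholders.
    """
    return C0 * C1 * C2 * C3 * Sa * g * Te**2 / (4.0 * math.pi**2)

# Example: a frame with Te = 0.8 s and Sa = 0.6 g at that period
print(fema356_target_displacement(Te=0.8, Sa=0.6))  # target roof displacement [m]
```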

Keywords: pushover, nonlinear, RC building, FEMA 440, ATC 40

Procedia PDF Downloads 114
4059 Vibration Control of Two Adjacent Structures Using a Non-Linear Damping System

Authors: Soltani Amir, Wang Xuan

Abstract:

The advantage of using a non-linear passive damping system for vibration control of two adjacent structures is investigated under base excitation. The base excitation is the El Centro earthquake acceleration record. The damping system is an optimal and effective non-linear viscous damper connected between the two adjacent structures. A MATLAB program is developed to build the stiffness and damping matrices and to carry out a time-history analysis of the dynamic motion of the system. One structure is assumed to be flexible, while the other acts as a laterally supporting structure with rigid frames. The response of the structure is calculated, and the non-linear damping coefficient is determined using the optimal LQR algorithm within an optimal vibration-control system. The non-linear parameter of the damping system is estimated, and the results show a significant advantage in applying this device for vibration control of two adjacent tall buildings.
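As a rough illustration of the LQR step, the sketch below solves the continuous-time Riccati equation for a simplified two-mass model of the adjacent structures coupled through a single control force; all numerical values are assumptions, not the paper's data.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Simplified 2-DOF model: flexible structure (m1) and stiff supporting structure (m2)
m1, m2 = 2.0e5, 8.0e5          # masses [kg] (assumed)
k1, k2 = 4.0e6, 6.0e7          # lateral stiffnesses [N/m] (assumed)
M = np.diag([m1, m2])
K = np.array([[k1, 0.0], [0.0, k2]])

# State x = [u1, u2, v1, v2]; control = force of the connecting device (+/- on each mass)
A = np.block([[np.zeros((2, 2)), np.eye(2)],
              [-np.linalg.solve(M, K), np.zeros((2, 2))]])
B = np.vstack([np.zeros((2, 1)),
               np.linalg.solve(M, np.array([[-1.0], [1.0]]))])

Q = np.diag([1e8, 1e8, 1e3, 1e3])   # weights on displacements and velocities (assumed)
R = np.array([[1e-6]])              # weight on the control force (assumed)

P = solve_continuous_are(A, B, Q, R)
Kgain = np.linalg.solve(R, B.T @ P)  # optimal state-feedback gain, u = -Kgain @ x
print(Kgain)
```

The gain entries multiplying the relative velocity give an equivalent linear damping coefficient that can serve as a starting point for fitting the non-linear viscous damper.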

Keywords: active control, passive control, viscous dampers, structural control, vibration control, tall building

Procedia PDF Downloads 480
4058 Coupling Time-Domain Analysis for Dynamic Positioning during S-Lay Installation

Authors: Sun Li-Ping, Zhu Jian-Xun, Liu Sheng-Nan

Abstract:

In order to study the performance of a dynamic positioning system during S-lay operations, the dynamic positioning system is simulated with the hull-stinger-pipe coupling effect included. The stinger rollers are modeled with the generalized elastic contact theory, the stinger is composed of Morison members, and the force on the pipe is calculated with the lumped-mass method. A time-domain analysis of the fully coupled barge model is carried out, combining a PID controller, a Kalman filter, and thrust allocation by the Sequential Quadratic Programming method. The effect of hull wave-frequency motion on the pipe-stinger coupling force and on the dynamic positioning system is analyzed, as is the effect of S-lay operations on dynamic positioning accuracy. The simulation results are shown to be valid by checking the pipe stress against the API criterion. The effect of heave and yaw motion on the hull-stinger-pipe coupling force and the dynamic positioning system cannot be ignored. To improve the safety of S-lay installation and dynamic positioning, it is important to reduce the barge's pitch motion and to lay pipe in head seas.
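To make the thrust-allocation step concrete, here is a minimal sketch of allocating surge and sway forces and a yaw moment to a few azimuth thrusters with an SQP solver (scipy's SLSQP); the thruster layout, limits, and commanded loads are invented numbers, not the barge's data.

```python
import numpy as np
from scipy.optimize import minimize

# Thruster positions relative to the barge centre of gravity [m] (assumed layout)
pos = np.array([[40.0, -8.0], [40.0, 8.0], [-45.0, 0.0]])
tau_cmd = np.array([3.0e5, 1.0e5, 2.0e6])   # commanded surge [N], sway [N], yaw [N*m]

def forces(u):
    """u = [Fx1, Fy1, Fx2, Fy2, Fx3, Fy3] -> total surge force, sway force, yaw moment."""
    F = u.reshape(-1, 2)
    Fx, Fy = F.sum(axis=0)
    Mz = np.sum(pos[:, 0] * F[:, 1] - pos[:, 1] * F[:, 0])
    return np.array([Fx, Fy, Mz])

def power(u):
    """Approximate power consumption ~ |T|^(3/2) summed over thrusters."""
    F = u.reshape(-1, 2)
    return np.sum(np.hypot(F[:, 0], F[:, 1]) ** 1.5)

cons = [{"type": "eq", "fun": lambda u: forces(u) - tau_cmd}]
bnds = [(-5.0e5, 5.0e5)] * 6                 # thrust component limits [N] (assumed)
res = minimize(power, x0=np.zeros(6), method="SLSQP", bounds=bnds, constraints=cons)
print(res.x.reshape(-1, 2))                  # allocated thrust components per thruster
```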

Keywords: S-lay operation, dynamic positioning, coupling motion, time domain, allocation of thrust

Procedia PDF Downloads 431
4057 A User-Directed Approach to Optimization via Metaprogramming

Authors: Eashan Hatti

Abstract:

In software development, programmers often must make a choice between high-level programming and high-performance programs. High-level programming encourages the use of complex, pervasive abstractions. However, the use of these abstractions degrades performance; high performance demands that programs be low-level. In a compiler, the optimizer attempts to let the user have both. The optimizer takes high-level, abstract code as an input and produces low-level, performant code as an output. However, there is a problem with having the optimizer be a built-in part of the compiler. Domain-specific abstractions implemented as libraries are common in high-level languages. As a language’s library ecosystem grows, so does the number of abstractions that programmers will use. If these abstractions are to be performant, the optimizer must be extended with new optimizations to target them, or these abstractions must rely on existing general-purpose optimizations. The latter is often not as effective as needed. The former presents too significant an effort for the compiler developers, as they are the only ones who can extend the language with new optimizations. Thus, the language becomes more high-level, yet the optimizer – and, in turn, program performance – falls behind. Programmers are again confronted with a choice between high-level programming and high-performance programs. To investigate a potential solution to this problem, we developed Peridot, a prototype programming language. Peridot’s main contribution is that it enables library developers to easily extend the language with new optimizations themselves. This allows the optimization workload to be taken off the compiler developers’ hands and given to a much larger set of people who can specialize in each problem domain. Because of this, optimizations can be much more effective while also being much more numerous. To enable this, Peridot supports metaprogramming designed for implementing program transformations. The language is split into two fragments or “levels”, one for metaprogramming, the other for high-level general-purpose programming. The metaprogramming level supports logic programming. Peridot’s key idea is that optimizations are simply implemented as metaprograms. The meta level supports several specific features which make it particularly suited to implementing optimizers. For instance, metaprograms can automatically deduce equalities between the programs they are optimizing via unification, deal with variable binding declaratively via higher-order abstract syntax, and avoid the phase-ordering problem via non-determinism. We have found that this design centered around logic programming makes optimizers concise and easy to write compared to their equivalents in functional or imperative languages. Overall, implementing Peridot has shown that its design is a viable solution to the problem of writing code which is both high-level and performant.
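Peridot's own syntax is not shown in the abstract, but the core idea, optimizations written by library authors as rewrite rules over program terms, can be illustrated with a generic sketch. The Python below is purely illustrative and unrelated to Peridot's actual syntax, semantics, or logic-programming machinery.

```python
# Terms are nested tuples such as ("add", x, y) or ("mul", x, y), plus ints and names.
# A library author registers rewrite rules; the "optimizer" simply applies them
# bottom-up to a fixed point, mimicking "optimizations as metaprograms" in spirit.

RULES = []

def rule(fn):
    RULES.append(fn)
    return fn

@rule
def add_zero(t):                      # x + 0  ->  x
    if isinstance(t, tuple) and t[0] == "add" and t[2] == 0:
        return t[1]

@rule
def mul_by_two(t):                    # x * 2  ->  x + x  (strength reduction)
    if isinstance(t, tuple) and t[0] == "mul" and t[2] == 2:
        return ("add", t[1], t[1])

def simplify(t):
    if isinstance(t, tuple):
        t = (t[0],) + tuple(simplify(a) for a in t[1:])
    changed = True
    while changed:
        changed = False
        for r in RULES:
            new = r(t)
            if new is not None:
                t, changed = simplify(new), True
    return t

print(simplify(("mul", ("add", "x", 0), 2)))   # -> ('add', 'x', 'x')
```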

Keywords: optimization, metaprogramming, logic programming, abstraction

Procedia PDF Downloads 59
4056 Behind Fuzzy Regression Approach: An Exploration Study

Authors: Lavinia B. Dulla

Abstract:

This exploratory study of the fuzzy regression approach presents fuzzy regression as a possible alternative to classical regression. It likewise assesses the differences and characteristics of simple linear regression and fuzzy regression using the width of the prediction interval, the mean absolute deviation, and the variance of residuals. Based on the simple linear regression model, the fuzzy regression approach is worth considering as an alternative to simple linear regression when the sample size is between 10 and 20. As the sample size increases, the fuzzy regression approach becomes less appropriate, since the large-sample assumptions are already operating within the framework of simple linear regression. Nonetheless, it can be suggested as a practical alternative when decisions often have to be made on the basis of small data sets.
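A minimal sketch of the minimum-fuzziness (Tanaka-type) fuzzy regression referred to in the keywords is given below, posed as a linear program with scipy: the spreads of the fuzzy coefficients are minimised subject to every observation being covered at a chosen h-level. The toy data and the h value are assumptions.

```python
import numpy as np
from scipy.optimize import linprog

# Toy small-sample data (assumed)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3, 13.9, 16.2, 18.1, 19.7])
h = 0.0                                  # h-level (degree of fit), assumed

n, ax = len(x), np.abs(x)
# Decision variables: [a0, a1, c0, c1]; minimise the total spread sum_i (c0 + c1*|x_i|)
cost = np.array([0.0, 0.0, n, ax.sum()])

A_ub, b_ub = [], []
s = 1.0 - h
for xi, yi, axi in zip(x, y, ax):
    A_ub.append([-1.0, -xi, -s, -s * axi]); b_ub.append(-yi)   # upper envelope >= y_i
    A_ub.append([ 1.0,  xi, -s, -s * axi]); b_ub.append( yi)   # lower envelope <= y_i

res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (None, None), (0, None), (0, None)])
a0, a1, c0, c1 = res.x
print(f"centre: y = {a0:.3f} + {a1:.3f} x,  spread: {c0:.3f} + {c1:.3f}|x|")
```

In this model the fuzzy output interval at any x has width 2·(1−h)·(c0 + c1|x|), which is one natural analogue of the prediction-interval width compared in the study.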

Keywords: fuzzy regression approach, minimum fuzziness criterion, interval regression, prediction interval

Procedia PDF Downloads 258
4055 A Robust Optimization Model for the Single-Depot Capacitated Location-Routing Problem

Authors: Abdolsalam Ghaderi

Abstract:

In this paper, the single-depot capacitated location-routing problem under uncertainty is presented. The problem aims to find the optimal location of a single depot and the routing of vehicles to serve the customers when the parameters may change under different circumstances. This problem has many applications, especially in supply chain management and distribution systems. To get closer to real-world situations, the travel times of vehicles, the fixed cost of vehicle usage, and customer demands are considered as sources of uncertainty. A combined approach including robust optimization and stochastic programming is presented to deal with the uncertainty in the problem at hand. For this purpose, a mixed integer programming model is developed, and a heuristic algorithm based on Variable Neighborhood Search (VNS) is presented to solve the model. Finally, the computational results are presented and future research directions are discussed.
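The Variable Neighborhood Search heuristic mentioned above follows a standard shake / local-search / move-or-not loop. A generic, problem-agnostic sketch is shown below; the neighborhood operators, local search, and objective are placeholders to be supplied for the location-routing problem.

```python
def vns(initial, objective, neighborhoods, local_search, max_iter=200):
    """Generic Variable Neighborhood Search skeleton (minimisation).

    initial       : a starting solution
    objective     : solution -> cost
    neighborhoods : list of "shake" operators, each mapping a solution to a random
                    neighbour, ordered from small to large moves
    local_search  : (solution, objective) -> locally improved solution
    """
    best, best_cost = initial, objective(initial)
    for _ in range(max_iter):
        k = 0
        while k < len(neighborhoods):
            shaken = neighborhoods[k](best)          # shaking in neighbourhood k
            candidate = local_search(shaken, objective)
            cost = objective(candidate)
            if cost < best_cost:                     # improvement: move, restart at N_1
                best, best_cost, k = candidate, cost, 0
            else:                                    # no improvement: next neighbourhood
                k += 1
    return best, best_cost
```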

Keywords: location-routing problem, robust optimization, stochastic programming, variable neighborhood search

Procedia PDF Downloads 246
4054 Chaotic Search Optimal Design and Modeling of Permanent Magnet Synchronous Linear Motor

Authors: Yang Yi-Fei, Luo Min-Zhou, Zhang Fu-Chun, He Nai-Bao, Xing Shao-Bang

Abstract:

This paper presents an electromagnetic finite element model of a permanent magnet synchronous linear motor, and the distortion rate of the air-gap flux density waveform is analyzed in detail. By designing the sample space of the parameters, non-linear regression modeling based on an orthogonal experimental design is introduced, and a possible electromagnetic scheme for a sinusoidal air-gap flux density waveform is put forward. Parameter optimization of the permanent magnet synchronous linear motor based on chaotic search and an adaptation function is also introduced. Simulation results show that, for the obtained structural parameters, pole shifting does not affect the symmetry of the motor back electromotive force; this provides a novel way for the optimum design of permanent magnet synchronous linear motors and other engineering applications.
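The chaotic-search step can be pictured as drawing candidate parameter sets from a chaotic (logistic-map) sequence rather than a uniform random one and keeping the best value of the adaptation function. The toy sketch below illustrates the mechanism only; the fitness surrogate and bounds are assumptions, not the paper's motor model.

```python
import numpy as np

def chaotic_search(fitness, lower, upper, n_iter=500, mu=4.0):
    """Logistic-map chaotic search over a box [lower, upper] (minimisation)."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    z = np.linspace(0.31, 0.73, num=lower.size)   # distinct chaotic seeds per variable
    best_x, best_f = None, np.inf
    for _ in range(n_iter):
        z = mu * z * (1.0 - z)            # logistic map iteration (chaotic for mu = 4)
        x = lower + z * (upper - lower)   # map the chaotic variables into the design box
        f = fitness(x)
        if f < best_f:
            best_x, best_f = x.copy(), f
    return best_x, best_f

# Placeholder adaptation function standing in for the FE-based distortion-rate objective
fit = lambda p: (p[0] - 12.0) ** 2 + 5.0 * (p[1] - 0.8) ** 2
print(chaotic_search(fit, lower=[5.0, 0.2], upper=[20.0, 1.5]))
```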

Keywords: permanent magnet synchronous linear motor, finite element analysis, chaotic search, optimization design

Procedia PDF Downloads 385
4053 From Linear to Nonlinear Deterrence: Deterrence for Rising Power

Authors: Farhad Ghasemi

Abstract:

Along with the transformation of the international system into a complex and chaotic system, a fundamental question arises: how can deterrence be reconstructed conceptually and theoretically in this system model? The deterrence system is much more complex today than it was seven decades ago. This article argues that the perception of deterrence as a linear system is a fundamental mistake, because it does not take into account the new dynamics of the international system, including network power dynamics. The author develops this point by focusing on complexity and chaos theories, especially their principles of nonlinearity and cascading failure, recognizes deterrence as a nonlinear system, and introduces it as a concept in strategic studies.

Keywords: complexity, international system, deterrence, linear deterrence, nonlinear deterrence

Procedia PDF Downloads 116
4052 Adaptive Programming for Indigenous Early Learning: The Early Years Model

Authors: Rachel Buchanan, Rebecca LaRiviere

Abstract:

Context: The ongoing effects of colonialism continue to be experienced through paternalistic policies and funding processes that cause disjuncture between and across Indigenous early childhood programming on-reserve and in urban and Northern settings in Canada. While various educational organizations and social service providers have risen to address these challenges in the short, medium and long term, there continues to be a lack in nation-wide cohesive, culturally grounded, and meaningful early learning programming for Indigenous children in Canada. Indigenous-centered early learning programs tend to face one of two scaling dilemmas: their program goals are too prescriptive to enable the program to be meaningfully replicated in different cultural/ community settings, or their program goals are too broad to be meaningfully adapted to the unique cultural and contextual needs and desires of Indigenous communities (the “franchise approach”). There are over 600 First Nations communities in Canada representing more than 50 Nations and languages. Consequently, Indigenous early learning programming cannot be applied with a universal or “one size fits all” approach. Sustainable and comprehensive programming must be responsive to each community context, building upon existing strengths and assets to avoid program duplication and irrelevance. Thesis: Community-driven and culturally adapted early childhood programming is critical but cannot be achieved on a large scale within traditional program models that are constrained by prescriptive overarching program goals. Principles, rather than goals, are an effective way to navigate and evaluate complex and dynamic systems. Principles guide an intervention to be adaptable, flexible and scalable. The Martin Family Initiative (MFI) ’s Early Years program engages a principles-based approach to programming. As will be discussed in this paper, this approach enables the program to catalyze existing community-based strengths and organizational assets toward bridging gaps across and disjuncture between Indigenous early learning programs, as well as to scale programming in sustainable, context-responsive and dynamic ways. This paper argues that using a principles-driven and adaptive scaling approach, the Early Years model establishes important learnings for culturally adapted Indigenous early learning programming in Canada. Methodology: The Early Years has leveraged this approach to develop an array of programming with partner organizations and communities across the country. The Early Years began as a singular pilot project in one First Nation. In just three years, it has expanded to five different regions and community organizations. In each context, the program supports the partner organization through different means and to different ends, the extent to which is determined in partnership with each community-based organization: in some cases, this means supporting the organization to build home visiting programming from the ground-up; in others, it means offering organization-specific culturally adapted early learning resources to support the programming that already exists in communities. Principles underpin but do not define the practices of the program in each of these relationships. 
This paper will explore numerous examples of principles-based adaptability within the context of the Early Years, concluding that the program model offers the adaptability and dynamism necessary to respond to the unique and ever-evolving community contexts and needs of Indigenous children today.

Keywords: culturally adapted programming, indigenous early learning, principles-based approach, program scaling

Procedia PDF Downloads 151
4051 TutorBot+: Automatic Programming Assistant with Positive Feedback based on LLMs

Authors: Claudia Martínez-Araneda, Mariella Gutiérrez, Pedro Gómez, Diego Maldonado, Alejandra Segura, Christian Vidal-Castro

Abstract:

The purpose of this document is to showcase preliminary work on developing an EduChatbot-type tool, aimed at providing effective feedback to students in programming courses, and on measuring the effects of its use. This bot, hereinafter referred to as tutorBot+, was constructed based on chatGPT and is tasked with assisting and delivering timely positive feedback to students in the field of computer science at the Universidad Católica de Concepción. The proposed working method consists of four stages: (1) immersion in the domain of Large Language Models (LLMs), (2) development of the tutorBot+ prototype and its integration, (3) experiment design, and (4) intervention. The first stage involves a literature review on the use of artificial intelligence in education and the evaluation of intelligent tutors, as well as research on types of feedback for learning and the domain of chatGPT. The second stage encompasses the development of tutorBot+, and the final stage involves a quasi-experimental study with students from the Programming and Database labs, where the learning outcome involves the development of computational thinking skills, enabling the use of the tool and the measurement of its effects. The preliminary results of this work are promising, as a functional chatbot prototype has been developed in both conversational and non-conversational versions, integrated into an open-source online judge and programming contest platform. The possibility of generating a custom model, based on a pre-trained one and tailored to the programming domain, is also being explored. Remaining work includes the integration of the created tool and the design of the experiment to measure its utility.
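As a rough illustration of the kind of prompt-driven positive feedback tutorBot+ aims to deliver, the sketch below wraps an LLM call behind a hypothetical `llm_complete()` helper. The function name, its signature, and the prompt wording are placeholders for illustration, not the project's actual code or chatGPT integration.

```python
def llm_complete(prompt: str) -> str:
    """Hypothetical wrapper around an LLM chat-completion API (placeholder only)."""
    raise NotImplementedError("plug in the LLM client of your choice here")

FEEDBACK_TEMPLATE = """You are a programming tutor. A student submitted this solution:

{code}

The automatic judge reported: {verdict}

Give short, encouraging feedback: first point out one thing the student did well,
then suggest one concrete next step. Do not reveal a full solution."""

def positive_feedback(student_code: str, judge_verdict: str) -> str:
    prompt = FEEDBACK_TEMPLATE.format(code=student_code, verdict=judge_verdict)
    return llm_complete(prompt)
```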

Keywords: assessment, chatGPT, learning strategies, LLMs, timely feedback

Procedia PDF Downloads 37
4050 Building Biodiversity Conservation Plans Robust to Human Land Use Uncertainty

Authors: Yingxiao Ye, Christopher Doehring, Angelos Georghiou, Hugh Robinson, Phebe Vayanos

Abstract:

Human development is a threat to biodiversity, and conservation organizations (COs) are purchasing land to protect areas for biodiversity preservation. However, COs have limited budgets and thus face hard prioritization decisions that are confounded by uncertainty in future human land use. This research proposes a data-driven sequential planning model to help COs choose land parcels that minimize the uncertain human impact on biodiversity. The proposed model is robust to uncertain development, and the sequential decision-making process is adaptive, allowing land purchase decisions to adapt to human land use as it unfolds. The cellular automata model is leveraged to simulate land use development based on climate data, land characteristics, and development threat index from NASA Socioeconomic Data and Applications Center. This simulation is used to model uncertainty in the problem. This research leverages state-of-the-art techniques in the robust optimization literature to propose a computationally tractable reformulation of the model, which can be solved routinely by off-the-shelf solvers like Gurobi or CPLEX. Numerical results based on real data from the Jaguar in Central and South America show that the proposed method reduces conservation loss by 19.46% on average compared to standard approaches such as MARXAN used in practice for biodiversity conservation. Our method may better help guide the decision process in land acquisition and thereby allow conservation organizations to maximize the impact of limited resources.
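A much simplified sketch of the land-purchase decision is given below: binary purchase variables, a budget constraint, and a min-max objective over a handful of simulated development scenarios, written with the PuLP modelling library. The data are invented, and the model deliberately omits the sequential, adaptive structure of the actual method.

```python
import pulp

# Toy data (assumed): 6 candidate parcels, 3 simulated land-use scenarios
cost   = [4, 7, 3, 9, 5, 6]                      # purchase cost per parcel
value  = [10, 14, 6, 18, 9, 12]                  # biodiversity value per parcel
threat = [[0.2, 0.7, 0.1, 0.8, 0.3, 0.5],        # threat[s][j]: chance parcel j is
          [0.6, 0.2, 0.4, 0.5, 0.1, 0.7],        # developed in scenario s if unprotected
          [0.3, 0.5, 0.6, 0.2, 0.8, 0.1]]
budget = 15
J = range(len(cost))

prob = pulp.LpProblem("robust_parcel_selection", pulp.LpMinimize)
x = pulp.LpVariable.dicts("buy", J, cat="Binary")
z = pulp.LpVariable("worst_case_loss", lowBound=0)

prob += z                                                       # minimise worst-case loss
prob += pulp.lpSum(cost[j] * x[j] for j in J) <= budget         # limited budget
for row in threat:                                              # loss bound per scenario
    prob += pulp.lpSum(row[j] * value[j] * (1 - x[j]) for j in J) <= z

prob.solve()
print("buy parcels:", [j for j in J if x[j].value() == 1], "worst-case loss:", z.value())
```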

Keywords: data-driven robust optimization, biodiversity conservation, uncertainty simulation, adaptive sequential planning

Procedia PDF Downloads 180
4049 Geometrically Linear Symmetric Free Vibration Analysis of Sandwich Beam

Authors: Ibnorachid Zakaria, El Bikri Khalid, Benamar Rhali, Farah Abdoun

Abstract:

The aim of the present work is to study the linear free symmetric vibration of a three-layer sandwich beam using the energy method. The zigzag model is used to describe the displacement field. The theoretical model assumes that the top and bottom layers behave like Euler-Bernoulli beams, while the core layer behaves like a Timoshenko beam. Based on Hamilton's principle, the governing equation of motion of the sandwich beam is obtained in order to calculate the linear frequency parameters for clamped-clamped and simply supported-simply supported beams. The effects of material properties and geometric parameters on the natural frequencies are also investigated.

Keywords: linear vibration, sandwich, shear deformation, Timoshenko zig-zag model

Procedia PDF Downloads 444
4048 Apply Commitment Method in Power System to Minimize the Fuel Cost

Authors: Mohamed Shaban, Adel Yahya

Abstract:

The goal of this study is to schedule power generation units so as to minimize fuel consumption cost, based on a model that solves the unit commitment problem. This is done by utilizing the forward dynamic programming method to determine the most economic scheduling of the generating units. The model was applied to a power station consisting of four generating units. The obtained results show that the application of the forward dynamic programming method offers a substantial reduction in fuel consumption cost: it was reduced from $116,326 to $102,181 over a 24-hour period, a saving of about 12.16%. The study emphasizes the importance of applying scheduling models to the operation of power generation units; the consequences are less fuel consumption, lower power losses, and less pollution.
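The forward dynamic programming recursion can be sketched as follows: the states are the feasible on/off combinations of the units in each hour, the stage cost is a simple merit-order dispatch cost plus start-up costs, and the cheapest cumulative path is retained hour by hour. The unit data and demand profile below are illustrative and not the power station's figures; all units are assumed to be off before the first hour.

```python
from itertools import product

# Illustrative unit data: (Pmin [MW], Pmax [MW], marginal cost [$/MWh], start-up cost [$])
UNITS = [(50, 200, 20.0, 500), (40, 150, 25.0, 300),
         (30, 120, 32.0, 200), (20, 80, 40.0, 100)]
DEMAND = [260, 300, 380, 420, 390, 310]          # hourly demand [MW] (toy horizon)

def dispatch_cost(state, load):
    """Merit-order dispatch of the committed units; None if the load is infeasible."""
    on = [UNITS[i] for i in range(len(UNITS)) if state[i]]
    if not on or sum(u[0] for u in on) > load or sum(u[1] for u in on) < load:
        return None
    cost = sum(u[0] * u[2] for u in on)                    # every committed unit at Pmin
    remaining = load - sum(u[0] for u in on)
    for pmin, pmax, mc, _ in sorted(on, key=lambda u: u[2]):   # cheapest units fill up
        take = min(remaining, pmax - pmin)
        cost += take * mc
        remaining -= take
    return cost

def startup_cost(prev, cur):
    return sum(UNITS[i][3] for i in range(len(UNITS)) if cur[i] and not prev[i])

STATES = list(product([0, 1], repeat=len(UNITS)))
OFF = (0,) * len(UNITS)
# paths[state] = (cumulative cost, schedule) of the cheapest path ending in `state`
paths = {s: (c + startup_cost(OFF, s), [s])
         for s in STATES if (c := dispatch_cost(s, DEMAND[0])) is not None}
for load in DEMAND[1:]:                                     # forward recursion over hours
    nxt = {}
    for s in STATES:
        run = dispatch_cost(s, load)
        if run is None:
            continue
        prev, (cum, path) = min(paths.items(),
                                key=lambda kv: kv[1][0] + startup_cost(kv[0], s))
        nxt[s] = (cum + startup_cost(prev, s) + run, path + [s])
    paths = nxt

best_cost, schedule = min(paths.values())
print(best_cost, schedule)
```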

Keywords: unit commitment, forward dynamic, fuel cost, programming, generation scheduling, operation cost, power system, generating units

Procedia PDF Downloads 570
4047 Type–2 Fuzzy Programming for Optimizing the Heat Rate of an Industrial Gas Turbine via Absorption Chiller Technology

Authors: T. Ganesan, M. S. Aris, I. Elamvazuthi, Momen Kamal Tageldeen

Abstract:

Terms set in power purchase agreements (PPA) challenge power utility companies in balancing between the returns (from maximizing power production) and securing long term supply contracts at capped production. The production limitation set in the PPA has driven efforts to maximize profits through efficient and economic power production. In this paper, a combined industrial-scale gas turbine (GT) - absorption chiller (AC) system is considered to cool the GT air intake for reducing the plant’s heat rate (HR). This GT-AC system is optimized while considering power output limitations imposed by the PPA. In addition, the proposed formulation accounts for uncertainties in the ambient temperature using Type-2 fuzzy programming. Using the enhanced chaotic differential evolution (CEDE), the Pareto frontier was constructed and the optimization results are analyzed in detail.

Keywords: absorption chillers (AC), turbine inlet air cooling (TIC), power purchase agreement (PPA), multiobjective optimization, type-2 fuzzy programming, chaotic differential evolution (CDDE)

Procedia PDF Downloads 277
4046 Anticandidal and Antibacterial Silver and Silver(Core)-Gold(Shell) Bimetallic Nanoparticles by Fusarium graminearum

Authors: Dipali Nagaonkar, Mahendra Rai

Abstract:

Nanotechnology has seen significant developments in engineered nanomaterials with a core-shell arrangement. Nanomaterials having nanolayers of silver and gold are of primary interest due to their wide applications in catalytic and biomedical fields. Further, mycosynthesis of nanoparticles has proved to be a sustainable synthetic approach in nanobiotechnology. In this context, we have synthesized silver and silver (core)-gold (shell) bimetallic nanoparticles using a fungal extract of Fusarium graminearum by sequential reduction. The core-shell deposition of nanoparticles was confirmed by the red shift in the surface plasmon resonance from 434 nm to 530 nm using a UV-Visible spectrophotometer. The mean particle sizes of the Ag and Ag-Au nanoparticles were determined by nanoparticle tracking analysis as 37 nm and 50 nm, respectively. TEM analysis shows fairly polydisperse, spherical nanoparticles. These mycosynthesized bimetallic nanoparticles were tested against some pathogenic bacteria and Candida sp. The antimicrobial analysis confirmed the enhanced anticandidal and antibacterial potential of the bimetallic nanoparticles over their monometallic counterparts.

Keywords: bimetallic nanoparticles, core-shell arrangement, mycosynthesis, sequential reduction

Procedia PDF Downloads 543
4045 Bayesian Network and Feature Selection for Rank Deficient Inverse Problem

Authors: Kyugneun Lee, Ikjin Lee

Abstract:

Parameter estimation via inverse problems often suffers from unfavorable conditions in the real world: uninformative data and many input parameters make the problem complicated or insoluble. Data refinement and reformulation of the problem can overcome such difficulties. In this research, a method to solve rank-deficient inverse problems is suggested, treating a multi-physics system whose rank deficiency is caused by response correlation. Unhelpful information is removed, and the problem is reformulated into sequential estimations using a Bayesian network (BN) and subset groups. First, subset grouping of the responses is performed, using feature selection with singular value decomposition (SVD). Next, BN inference is used for sequential conditional estimation according to the group hierarchy. A directed acyclic graph (DAG) structure is organized to maximize the estimation ability. The variance ratio of response to noise is used to pair the estimable parameters with each response.

Keywords: Bayesian network, feature selection, rank deficiency, statistical inverse analysis

Procedia PDF Downloads 284
4044 Optimal Placement of Phasor Measurement Units (PMU) Using Mixed Integer Programming (MIP) for Complete Observability in Power System Network

Authors: Harshith Gowda K. S, Tejaskumar N, Shubhanga R. B, Gowtham N, Deekshith Gowda H. S

Abstract:

Phasor measurement units (PMUs) play an important role in state estimation for the modern power system. It is necessary to have complete observability of the power system while minimizing cost, so the optimal placement of phasor measurement units in the network is essential. In a bus system, zero-injection buses need to be evaluated to minimize the number of PMUs. In this paper, the optimization problem is formulated using mixed integer programming to obtain the optimal locations of the PMUs with increased observability. The formulation includes constraints both with and without zero-injection buses. The formulated problem is solved using the CPLEX solver in the GAMS software package. The proposed method is tested on the IEEE 30, IEEE 39, IEEE 57, and IEEE 118 bus systems. The results show that the number of PMUs required is minimal while observability is increased.
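Without zero-injection buses, the observability-constrained placement reduces to a set-cover-type MIP: minimise the number of PMUs subject to every bus being watched by a PMU at itself or at an adjacent bus. The sketch below shows that core model on a made-up 7-bus toy network using the PuLP library; the paper itself solves the model with CPLEX under GAMS, and the zero-injection constraints are omitted here.

```python
import pulp

# Toy 7-bus network given as a branch list (assumed, not an IEEE test system)
branches = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (1, 5)]
n_bus = 7

# Connectivity matrix A: A[i][j] = 1 if i == j or buses i and j share a branch
A = [[1 if i == j else 0 for j in range(n_bus)] for i in range(n_bus)]
for i, j in branches:
    A[i][j] = A[j][i] = 1

prob = pulp.LpProblem("pmu_placement", pulp.LpMinimize)
x = pulp.LpVariable.dicts("pmu", range(n_bus), cat="Binary")

prob += pulp.lpSum(x[j] for j in range(n_bus))                 # minimise the PMU count
for i in range(n_bus):                                         # every bus observed
    prob += pulp.lpSum(A[i][j] * x[j] for j in range(n_bus)) >= 1

prob.solve()
print("PMUs at buses:", [j for j in range(n_bus) if x[j].value() == 1])
```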

Keywords: PMU, observability, mixed integer programming (MIP), zero injection buses (ZIB)

Procedia PDF Downloads 143
4043 Over the Air Programming Method for Learning Wireless Sensor Networks

Authors: K. Sangeeth, P. Rekha, P. Preeja, P. Divya, R. Arya, R. Maneesha

Abstract:

Wireless sensor networks (WSNs) consist of small devices with different sensors that sense physical parameters such as air pressure, temperature, vibration, and movement, process these data, and send them to a central data center where decisions are taken. The WSN domain has a wide range of applications, such as monitoring and detecting natural hazards like landslides, forest fires, avalanches, and floods, as well as healthcare applications. With such varied applications, the subject is taught at the undergraduate/postgraduate level in many universities within departments of computer science. However, the cost and infrastructure required to purchase WSN nodes so that students can gain hands-on expertise with these devices are considerable. This paper gives an overview of a remote-triggered lab consisting of more than 100 WSN nodes that helps students log in remotely from anywhere in the world using the World Wide Web, configure the nodes, and learn WSN concepts in an intuitive way. It proposes a new approach called over-the-air programming (OTAP), and describes its internals, which allow the 100 nodes to be programmed simultaneously and the results to be viewed without the nodes being physically connected to a computer system, thereby allowing sparse deployment.

Keywords: WSN, over the air programming, virtual lab, AT45DB

Procedia PDF Downloads 348
4042 Non-Differentiable Mond-Weir Type Symmetric Duality under Generalized Invexity

Authors: Jai Prakash Verma, Khushboo Verma

Abstract:

In the present paper, a pair of Mond-Weir type non-differentiable multiobjective second-order programming problems is formulated, involving two kernel functions, where each of the objective functions contains a support function. We prove weak, strong, and converse duality theorems for the second-order symmetric dual programs under η-pseudoinvexity conditions.

Keywords: non-differentiable multiobjective programming, second-order symmetric duality, efficiency, support function, eta-pseudoinvexity

Procedia PDF Downloads 223
4041 Developing Serious Games to Improve Learning Experience of Programming: A Case Study

Authors: Shan Jiang, Xinyu Tang

Abstract:

Game-based learning is an emerging pedagogy to make the learning experience more effective, enjoyable, and fun. However, most games used in classroom settings have been overly simplistic. This paper presents a case study on a Python-based online game designed to improve the effectiveness in both teaching and research in higher education. The proposed game system not only creates a fun and enjoyable experience for students to learn various topics in programming but also improves the effectiveness of teaching in several aspects, including material presentation, helping students to recognize the importance of the subjects, and linking theoretical concepts to practice. The proposed game system also serves as an information cyber-infrastructure that automatically collects and stores data from players. The data could be useful in research areas including human-computer interaction, decision making, opinion mining, and artificial intelligence. They further provide other possibilities beyond these areas due to the customizable nature of the game.

Keywords: game-based learning, programming, research-teaching integration, Hearthstone

Procedia PDF Downloads 139
4040 Improved Blood Glucose-Insulin Monitoring with Dual-Layer Predictive Control Design

Authors: Vahid Nademi

Abstract:

Although wearable medical devices equipped with a continuous glucose monitor (CGM) and an insulin pump are widely used, advanced control methods are still needed to obtain the full benefit of these devices. Unlike costly clinical trials, implementing effective insulin-glucose control strategies can provide significant benefits to patients suffering from chronic diseases such as diabetes. This study deals with the key role of a two-layer insulin-glucose regulator based on a model predictive control (MPC) scheme, so that the patient's predicted glucose profile is kept consistent with the insulin level injected automatically through the insulin pump. This is achieved by an iterative optimization algorithm, an integrated perturbation analysis and sequential quadratic programming (IPA-SQP) solver, which handles uncertainties due to unexpected variations in glucose-insulin values and the body's characteristics. The feasibility of the discussed control approach is also studied by means of numerical simulations of two case scenarios using measured data. The obtained results verify the superior and reliable performance of the proposed control scheme with no negative impact on patient safety.

Keywords: blood glucose monitoring, insulin pump, predictive control, optimization

Procedia PDF Downloads 111
4039 Market Solvency Capital Requirement Minimization: How Non-linear Solvers Provide Portfolios Complying with Solvency II Regulation

Authors: Abraham Castellanos, Christophe Durville, Sophie Echenim

Abstract:

In this article, a portfolio optimization problem is solved in a Solvency II context: it illustrates how advanced optimization techniques can help to tackle complex operational pain points around the monitoring, control, and stability of the Solvency Capital Requirement (SCR). The market SCR of a portfolio is calculated as a combination of SCR sub-modules. These sub-modules are the results of stress tests on interest rate, equity, property, credit, and FX factors, as well as concentration on counterparties. The market SCR is non-convex and non-differentiable, which does not make it a natural candidate as an optimization criterion. In the SCR formulation, correlations between sub-modules are fixed, whereas risk-driven portfolio allocation is usually driven by the dynamics of the actual correlations. Implementing a portfolio construction approach that is efficient from both a regulatory and an economic standpoint is not straightforward. Moreover, the challenge for insurance portfolio managers is not only to achieve a minimal SCR, to reduce non-invested capital, but also to ensure the stability of the SCR. Some optimizations have already been performed in the literature, simplifying the standard formula into a quadratic function, but to our knowledge, this is the first time that the standard formula of the market SCR is used directly in an optimization problem. Two solvers are combined: a bundle algorithm for convex non-differentiable problems, and a BFGS (Broyden-Fletcher-Goldfarb-Shanno)-SQP (Sequential Quadratic Programming) algorithm to cope with non-convex cases. A market SCR minimization is then performed with historical data. This approach results in a significant reduction of the capital requirement compared to a classical Markowitz approach based on historical volatility. A comparative analysis of different optimization models (equi-risk-contribution portfolio, minimum-volatility portfolio, and minimum value-at-risk portfolio) is performed, and the impact of these strategies on risk measures, including the market SCR and its sub-modules, is evaluated. A lack of diversification of the market SCR is observed, especially for equities; this was expected, since the market SCR strongly penalizes this type of financial instrument. It was shown that this direct effect of the regulation can be attenuated by implementing constraints in the optimization process or by minimizing the market SCR together with the historical volatility, proving the interest of having a portfolio construction approach that can incorporate such features. The presented results are further explained by the market SCR modelling.
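For orientation, the market SCR aggregates the stressed sub-module losses through a fixed correlation matrix, SCR_mkt = sqrt(sum_ij Corr_ij * SCR_i * SCR_j). The sketch below shows this aggregation together with a bare-bones weight optimisation using scipy's SLSQP; the correlation entries and per-asset stress losses are simplified placeholders (the sub-module SCRs are even taken as linear in the weights), not the regulatory calibration or the article's two-solver scheme.

```python
import numpy as np
from scipy.optimize import minimize

SUB_MODULES = ["interest", "equity", "property", "spread", "fx"]
CORR = np.array([[1.00, 0.30, 0.30, 0.30, 0.30],    # placeholder correlations,
                 [0.30, 1.00, 0.75, 0.75, 0.30],    # NOT the regulatory calibration
                 [0.30, 0.75, 1.00, 0.50, 0.30],
                 [0.30, 0.75, 0.50, 1.00, 0.30],
                 [0.30, 0.30, 0.30, 0.30, 1.00]])

# Placeholder stress loss per unit of weight in each of 4 assets, per sub-module
STRESS = np.array([[0.02, 0.00, 0.05, 0.01],        # interest
                   [0.00, 0.39, 0.00, 0.10],        # equity
                   [0.00, 0.00, 0.25, 0.00],        # property
                   [0.05, 0.00, 0.00, 0.08],        # spread
                   [0.00, 0.10, 0.00, 0.25]])       # fx

def market_scr(w):
    s = STRESS @ w                                   # sub-module SCRs (assumed linear)
    return float(np.sqrt(s @ CORR @ s))              # standard-formula aggregation

cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]       # fully invested portfolio
res = minimize(market_scr, x0=np.full(4, 0.25), method="SLSQP",
               bounds=[(0.0, 1.0)] * 4, constraints=cons)
print("weights:", res.x.round(3), "market SCR:", round(market_scr(res.x), 4))
```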

Keywords: financial risk, numerical optimization, portfolio management, solvency capital requirement

Procedia PDF Downloads 93
4038 Optimization Model for Support Decision for Maximizing Production of Mixed Fruit Tree Farms

Authors: Andrés I. Ávila, Patricia Aros, César San Martín, Elizabeth Kehr, Yovana Leal

Abstract:

We consider a linear programming model to help farmers decide whether it is convenient to choose among three kinds of export fruits for their future investment. We consider area, investment, water, minimum-productivity, and harvest restrictions, and a monthly based model to compute the average income over five years. Conditions of the field, such as area, water availability, and initial investment, are also required. Using Chilean costs and the dollar-peso exchange rate, we can simulate several scenarios to understand the possible risks associated with this market.

Keywords: mixed integer problem, fruit production, support decision model, fruit tree farms

Procedia PDF Downloads 429
4037 Biomechanical Analysis and Interpretation of Pitching Sequences for Enhanced Performance Programming

Authors: Corey F. Fitzgerald

Abstract:

This study provides a comprehensive examination of the biomechanical sequencing inherent in pitching motions, coupled with an advanced methodology for interpreting the gathered data to inform programming strategies. The analysis is conducted using state-of-the-art biomechanical laboratory equipment capable of detecting subtle changes and deviations, facilitating highly informed decision-making. Through this presentation, the intricate dynamics of pitching sequences are discussed in detail to show how complex movement patterns can be made accessible and actionable for performance-enhancement purposes in the weight room.

Keywords: sport science, applied biomechanics, strength and conditioning, applied research

Procedia PDF Downloads 25
4036 Adding Business Value in Enterprise Applications through Quality Matrices Using Agile

Authors: Afshan Saad, Muhammad Saad, Shah Muhammad Emaduddin

Abstract:

Nowadays the business environment is so fast-paced that continuous improvement has become a major factor in the survival of an enterprise. This holds for structural engineering and even more so in the fast-paced world of information technology and software engineering. Agile methodologies, such as Scrum, have a dedicated step in the process that targets the improvement of the development process and the software products. Pivotal to process improvement is gaining information that allows you to assess the state of the process and its products. From this status information, you can plan actions for improvement and also evaluate the success of those actions. This study builds a model that measures the software quality of the development process. Software quality depends on the functional and structural quality of the software products; in addition, the quality of the development process itself is also important for improving software quality. Functional quality covers adherence to user requirements, while structural quality addresses the structure of the software product's source code with respect to its maintainability. Process quality is related to the consistency and predictability of the development process. The software quality model is applied in a business setting by gathering the data for the software metrics in the model. To evaluate the software quality model, we analyze the data and present it to the people involved in the agile software development process. The results from the application and the user feedback suggest that the model enables a reasonable assessment of software quality and that it can be used to support the continuous improvement of the development process and the software products.

Keywords: Agile SDLC Tools, Agile Software development, business value, enterprise applications, IBM, IBM Rational Team Concert, RTC, software quality, software metrics

Procedia PDF Downloads 144
4035 Comparison of Parallel CUDA and OpenMP Implementations of Memetic Algorithms for Solving Optimization Problems

Authors: Jason Digalakis, John Cotronis

Abstract:

Memetic algorithms (MAs) are useful for solving optimization problems, but searching the solution space of high-dimensional problems is quite difficult, and using all the cores of a system is a challenge. In this study, a sequential implementation of the memetic algorithm is converted into a concurrent version, which is executed on the cores of both the CPU and the GPU. For this purpose, the OpenMP and CUDA libraries are applied to the parallel algorithm to obtain concurrent execution on the CPU and GPU, respectively. The aim of this study is to compare CPU and GPU implementations of the memetic algorithm. For this purpose, fourteen benchmark functions are selected as test problems. The obtained results indicate that our approach achieves speedups of up to five thousand times compared to a single CPU thread while maintaining reasonable result quality. This clearly shows that GPUs have the potential to accelerate MAs and allow them to solve much more complex tasks.

Keywords: memetic algorithm, CUDA, GPU-based memetic algorithm, open multi processing, multimodal functions, unimodal functions, non-linear optimization problems

Procedia PDF Downloads 56
4034 A Deletion-Cost Based Fast Compression Algorithm for Linear Vector Data

Authors: Qiuxiao Chen, Yan Hou, Ning Wu

Abstract:

As the classic Douglas-Peucker Algorithm (DPA) has deficiencies such as a high risk of deleting key nodes by mistake, high complexity, time consumption, and relatively slow execution speed, a new Deletion-Cost Based Compression Algorithm (DCA) for linear vector data is proposed. For each curve, the basic element of linear vector data, the deletion costs of all its middle nodes are calculated, and the minimum deletion cost is compared with a pre-defined threshold. If the former is greater than or equal to the latter, all remaining nodes are retained and the curve's compression is finished; otherwise, the node with the minimal deletion cost is deleted, the deletion costs of its two neighbors are updated, and the same loop is repeated on the compressed curve until termination. Several comparative experiments using different types of linear vector data compared DPA and DCA in terms of compression quality and computing efficiency. The experimental results show that DCA outperforms DPA in both compression accuracy and execution efficiency.
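The iterative scheme described above (compute deletion costs, delete the cheapest node, update its neighbours, stop at the threshold) can be sketched directly. In the sketch below the deletion cost of a node is taken as the area of the triangle it forms with its two neighbours; this is one natural choice but only an assumption here, since the abstract does not fix the cost measure, and for simplicity all costs are recomputed each pass rather than only the two neighbours'.

```python
def triangle_area(p, q, r):
    """Deletion cost of node q: area of triangle (p, q, r) -- an assumed cost measure."""
    return abs((q[0] - p[0]) * (r[1] - p[1]) - (r[0] - p[0]) * (q[1] - p[1])) / 2.0

def dca_compress(points, threshold):
    """Deletion-Cost based Compression of a polyline given as [(x, y), ...]."""
    pts = list(points)
    while len(pts) > 2:
        costs = [triangle_area(pts[i - 1], pts[i], pts[i + 1])
                 for i in range(1, len(pts) - 1)]
        i_min = min(range(len(costs)), key=costs.__getitem__)
        if costs[i_min] >= threshold:      # every remaining node is worth keeping
            break
        del pts[i_min + 1]                 # delete the cheapest middle node
    return pts

line = [(0, 0), (1, 0.05), (2, -0.02), (3, 1.0), (4, 1.02), (5, 2.0)]
print(dca_compress(line, threshold=0.1))
```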

Keywords: Douglas-Peucker algorithm, linear vector data, compression, deletion cost

Procedia PDF Downloads 212
4033 Optimizing and Evaluating Performance Quality Control of the Production Process of Disposable Essentials Using Approach Vague Goal Programming

Authors: Hadi Gholizadeh, Ali Tajdin

Abstract:

To have effective production planning, it is necessary to control the quality of processes. This paper aims at improving the performance of the disposable-essentials process using statistical quality control and goal programming in a vague environment, which expresses uncertainty, since there is always measurement error in the real world. Therefore, in this study, the conditions are examined in a vague, distance-based environment. The disposable-essentials process at Kach Company was studied. Statistical control tools were used to characterize the existing process for four response factors: the average weight, height, crater diameter, and volume of the disposable glasses. Goal programming was then utilized to find the combination of optimal factor settings in a vague environment, which accounts for uncertainty in the initial information when some of the model parameters are vague; a fuzzy regression model is also used to predict the responses of the four described factors. The optimization results show that the process capability index values for the average weight, height, crater diameter, and volume of the disposable glasses were improved. This increases product quality and reduces waste, which lowers the cost of the finished product and ultimately brings customer satisfaction, and this satisfaction will mean increased sales.

Keywords: goal programming, quality control, vague environment, disposable glasses’ optimization, fuzzy regression

Procedia PDF Downloads 203
4032 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of computational software for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving advanced mathematical models that predict the production of oil wells in arbitrarily shaped, multiple-lease reservoirs. The limited data available for validating that a program meets the accuracy of the mathematical model is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified and the BEM discretization is set up. In the second step, the 2D BEM discretization is implemented using the COMSOL Multiphysics and MATLAB programming environments. In the last step, the numerical performance indicators of both implementations are analyzed against a validation implementation in Fortran. The performance comparison is investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure for oil production in a multiple-lease reservoir. According to the performance comparison, structured programming in Fortran is an alternative for implementing an accurate numerical simulation of the BEM. In conclusion, the high-level-language numerical computation and the numerical performance evaluation confirm that Fortran is well suited for capturing the visualization of oil well production in arbitrarily shaped reservoirs.

Keywords: performance comparison, 2D visualization, COMSOL multiphysic, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 465