Search results for: progressive sports operational model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18832

17662 A Unified Model for Orotidine Monophosphate Synthesis: Target for Inhibition of Growth of Mycobacterium tuberculosis

Authors: N. Naga Subrahmanyeswara Rao, Parag Arvind Deshpande

Abstract:

Understanding the nucleotide synthesis reactions of an organism provides insight into its growth, which in Mycobacterium tuberculosis can guide the design of anti-TB drugs. One reaction of the de novo pathway, which takes place in all organisms, was considered: phosphoribosyl pyrophosphate reacts with orotate, catalyzed by orotate phosphoribosyl transferase and a divalent metal ion, to give orotidine monophosphate, a nucleotide. All the reaction steps of three experimentally proposed mechanisms for this reaction were considered to develop a kinetic rate expression. The model was validated using data for four organisms and successfully described the kinetics of the reported data. The developed model can therefore serve as a reliable, organism-independent model for describing the kinetics in new organisms without the need for mechanistic determination.

Keywords: mechanism, nucleotide, organism, tuberculosis

Procedia PDF Downloads 334
17661 Controlling the Expense of Political Contests Using a Modified N-Players Tullock’s Model

Authors: C. Cohen, O. Levi

Abstract:

This work introduces a generalization of the classical Tullock model of one-stage contests under complete information with an unlimited number of contestants. In the classical Tullock model, the contest winner is not necessarily the highest bidder; instead, the winner is determined by a draw in which the winning probabilities are the contestants' relative efforts. Tullock's model therefore fits political contests well, in which the winner is not necessarily the contestant who exerts the highest effort. This work presents a modified model that uses a simple non-discriminating rule, namely a parameter that influences the total costs planned for an election, so that the contest designer can control the contestants' efforts. The winner pays a fee, and the losers are reimbursed the same amount. Our proposed model includes a mechanism that controls the efforts exerted and balances competition, creating a tighter, less predictable and more interesting contest. Additionally, the proposed model satisfies the fairness criterion in the sense that it does not alter the contestants' probabilities of winning compared to the classical Tullock model. We provide an analytic solution for the contestants' optimal effort and expected reward.
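The abstract does not reproduce its analytic solution, but the classical symmetric N-player Tullock contest it generalizes has a well-known closed-form equilibrium effort, x* = V(N-1)/N². A minimal sketch (with illustrative values for N and the prize V) verifies numerically that no unilateral deviation from x* improves a contestant's expected payoff:

```python
import numpy as np

def win_prob(x_i, others_total):
    # Tullock contest success function: probability of winning equals effort share
    return x_i / (x_i + others_total)

def expected_payoff(x_i, others_total, prize):
    return win_prob(x_i, others_total) * prize - x_i

def symmetric_equilibrium_effort(n_players, prize):
    # Closed-form symmetric Nash equilibrium of the classical Tullock contest
    return prize * (n_players - 1) / n_players ** 2

n, V = 5, 100.0                              # illustrative contest size and prize
x_star = symmetric_equilibrium_effort(n, V)  # 16.0 for these values
others = (n - 1) * x_star
eq_payoff = expected_payoff(x_star, others, V)

# No deviation on a fine grid of alternative efforts should beat the equilibrium
grid = np.linspace(0.01, V, 2000)
best_dev = max(expected_payoff(x, others, V) for x in grid)
```

The check confirms that the grid-searched best response coincides with x*, which is the baseline against which the modified fee-and-reimbursement rule would be compared.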

Keywords: contests, Tullock's model, political elections, control expenses

Procedia PDF Downloads 145
17660 Fama French Four Factor Model: A Study of Nifty Fifty Companies

Authors: Deeksha Arora

Abstract:

The study aims to explore the applicability of two widely used asset pricing models, the Capital Asset Pricing Model (CAPM) and the Fama-French Four Factor Model, in the Indian equity market. The study is based on the companies that form part of the Nifty Fifty Index over a five-year period, 2011 to 2016. The asset pricing model is examined by forming portfolios on the basis of three variables: market capitalization (size effect), book-to-market equity ratio (value effect) and profitability. The study provides a basis to test the presence of the Fama-French Four Factor Model in the Indian stock market, and may serve as a foundation for future research on generalized asset pricing models comprising multiple risk factors.
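The four-factor test described above amounts to a time-series regression of portfolio excess returns on the factor returns. A minimal sketch on synthetic data (the factor series, betas and sample size here are illustrative, not the Nifty Fifty estimates) shows the estimation step:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 240  # monthly observations (illustrative)

# Hypothetical factor returns: market excess, SMB (size), HML (value), profitability
factors = rng.normal(0.0, 0.04, size=(T, 4))
true_betas = np.array([1.1, 0.4, -0.2, 0.3])
alpha_true = 0.002
excess_ret = alpha_true + factors @ true_betas + rng.normal(0.0, 0.01, T)

# Four-factor time-series regression:
#   R_i - R_f = a + b*MKT + s*SMB + h*HML + r*PROF + e
X = np.column_stack([np.ones(T), factors])
coef, *_ = np.linalg.lstsq(X, excess_ret, rcond=None)
alpha_hat, betas_hat = coef[0], coef[1:]
```

A significant alpha after controlling for all four factors would indicate that the model fails to price the portfolio; the CAPM comparison uses the same regression with the market factor only.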

Keywords: book to market equity, Fama French four factor model, market capitalization, profitability, size effect, value effect

Procedia PDF Downloads 263
17659 The Effectiveness of a Hybrid Diffie-Hellman-RSA-Advanced Encryption Standard Model

Authors: Abdellahi Cheikh

Abstract:

With the emergence of quantum computers with very powerful capabilities, the security of shared-key exchange between two interlocutors poses a major problem, given the rapid development of computing power and speed. The Diffie-Hellman (DH) algorithm is therefore more vulnerable than ever: on its own, no mechanism guarantees the security of the key exchange, so an intermediary who manages to intercept the exchanged values can easily compromise the key. In this regard, several studies have been conducted to improve the security of key exchange between two interlocutors, which has led to interesting results. Our Diffie-Hellman-RSA-AES (DRA) model encrypts the information exchanged between two users using the three encryption algorithms DH, RSA and AES; the modification made here uses steganographic images to hide the contents of the p, g and ClesAES values, which in the original DRA model are sent in an unencrypted state when each user's public key is computed. This work includes a comparative study between the DRA model and existing solutions, as well as the modification made to this model, with an emphasis on reliability in terms of security. A simulation is presented to demonstrate the effectiveness of the modification. The obtained results show that the modified model has a security advantage over the existing solution, and these changes thus reinforce the security of the DRA model.
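The DH step of the DRA pipeline, where the values p and g are exchanged and each side derives the same shared secret, can be sketched with the standard library alone. This is a toy illustration only: the modulus below is a small 32-bit prime chosen for readability, not a secure group, and the RSA-authentication, AES-encryption and steganography stages of the DRA model are omitted.

```python
import hashlib
import secrets

# Toy public parameters (a real deployment would use a vetted large group,
# e.g. an RFC 3526 modulus; this 32-bit prime is for demonstration only)
p = 4294967291   # NOT a secure modulus
g = 5

def dh_keypair(p, g):
    # Private exponent in [1, p-2]; public key is g^priv mod p
    priv = secrets.randbelow(p - 2) + 1
    return priv, pow(g, priv, p)

priv_a, pub_a = dh_keypair(p, g)
priv_b, pub_b = dh_keypair(p, g)

# Each side raises the other's public key to its own private exponent
shared_a = pow(pub_b, priv_a, p)
shared_b = pow(pub_a, priv_b, p)

# Hash the shared secret down to a 256-bit value that could seed an AES key
key_a = hashlib.sha256(str(shared_a).encode()).digest()
key_b = hashlib.sha256(str(shared_b).encode()).digest()
```

Because p and g travel in the clear here, an active intermediary could substitute its own public keys, which is exactly the weakness the steganographic hiding of p, g and ClesAES is meant to mitigate.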

Keywords: Diffie-Hellman, DRA, RSA, advanced encryption standard

Procedia PDF Downloads 93
17658 Project Management Agile Model Based on Project Management Body of Knowledge Guideline

Authors: Mehrzad Abdi Khalife, Iraj Mahdavi

Abstract:

This paper presents an agile model for the project management process, using the Project Management Body of Knowledge (PMBOK) guideline as its platform. A combination of computational science and artificial intelligence methodology is added to the guideline to transform the standard into an agile project management process; the model is thus the combination of a practical standard, computational science and artificial intelligence. In this model, we present a communication model and protocols to keep the process agile, and we illustrate the collaboration of man and machine in the project management area with an artificial intelligence approach.

Keywords: artificial intelligence, conceptual model, man-machine collaboration, project management, standard

Procedia PDF Downloads 341
17657 Parameter Estimation for the Oral Minimal Model and Parameter Distinctions Between Obese and Non-obese Type 2 Diabetes

Authors: Manoja Rajalakshmi Aravindakshan, Devleena Ghosh, Chittaranjan Mandal, K. V. Venkatesh, Jit Sarkar, Partha Chakrabarti, Sujay K. Maity

Abstract:

Oral Glucose Tolerance Test (OGTT) is the primary test used to diagnose type 2 diabetes mellitus (T2DM) in a clinical setting. OGTT data are analyzed using the Oral Minimal Model (OMM), together with the rate of appearance of ingested glucose (Ra), to study differences in model parameters between control and T2DM groups. The differentiation of parameters of the model gives insight into the behaviour and physiology of T2DM. The model is also used to find parameter differences between obese and non-obese T2DM subjects, and the sensitive parameters were correlated with known physiological findings. Sensitivity analysis is performed to understand how changes in parameter values affect model output, and appropriate statistical tests are done to support the findings. This appears to be the first preliminary application of the OMM with obesity as a distinguishing factor in understanding T2DM from the estimated parameters of an insulin-glucose model and in relating the statistical differences in parameters to diabetes pathophysiology.
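The OMM referred to above is the Bergman-type minimal model driven by the rate of appearance Ra(t). A minimal forward-simulation sketch is below; the parameter values, the triangular Ra profile and the stylized insulin curve are illustrative placeholders, not values fitted to any cohort in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not fitted): glucose effectiveness, remote-insulin
# rate constant, insulin sensitivity, distribution volume, basal glucose/insulin
SG, p2, SI = 0.02, 0.03, 7e-4
V, Gb, Ib = 1.7, 90.0, 8.0

def Ra(t):
    # Simple triangular rate of appearance of ingested glucose (mg/kg/min)
    return max(0.0, 4.0 * (1.0 - abs(t - 60.0) / 60.0))

def insulin(t):
    # Stylized plasma-insulin excursion after the glucose drink
    return Ib + 40.0 * np.exp(-((t - 45.0) / 30.0) ** 2)

def omm(t, y):
    G, X = y
    dG = -(SG + X) * G + SG * Gb + Ra(t) / V   # glucose kinetics
    dX = -p2 * X + p2 * SI * (insulin(t) - Ib) # remote insulin action
    return [dG, dX]

sol = solve_ivp(omm, (0.0, 180.0), [Gb, 0.0], max_step=1.0)
G_peak = sol.y[0].max()
```

Parameter estimation then amounts to adjusting SG, SI, p2 (and the Ra description) so that the simulated G(t) matches the measured OGTT glucose curve, which is where the group differences reported in the abstract arise.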

Keywords: oral minimal model, OGTT, obese and non-obese T2DM, mathematical modeling, parameter estimation

Procedia PDF Downloads 92
17656 Model Order Reduction Using Hybrid Genetic Algorithm and Simulated Annealing

Authors: Khaled Salah

Abstract:

Model order reduction has been one of the most challenging topics in recent years. In this paper, a hybrid of a genetic algorithm (GA) and a simulated annealing algorithm (SA) is used to approximate high-order transfer functions (TFs) by lower-order TFs. The hybrid algorithm is applied to model order reduction with two important goals in mind: improving accuracy and preserving the properties of the original model, both of which matter for the performance of simulation and computation and for maintaining the behavior of the complex models being reduced. Compared to the conventional mathematical methods that have been used to obtain reduced-order models of high-order complex models, the proposed method provides better results in terms of run-time. Thus, the proposed technique could be used in electronic design automation (EDA) tools.
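One way to realize such a hybrid, sketched below under assumptions of my own (the 4th-order plant, the 2nd-order target structure, and all GA/SA hyperparameters are illustrative, not the paper's), is to evolve the coefficients of a reduced transfer function against a frequency-response error, accepting occasional worse candidates with an annealing probability:

```python
import numpy as np

rng = np.random.default_rng(1)
w = np.logspace(-2, 2, 200)

def freq_resp(num, den, w):
    s = 1j * w
    return np.polyval(num, s) / np.polyval(den, s)

# Illustrative 4th-order plant to be reduced to 2nd order
num_hi, den_hi = [10.0, 50.0], [1.0, 7.0, 24.0, 24.0, 10.0]
target = freq_resp(num_hi, den_hi, w)

def cost(p):
    b0, a1, a0 = p
    if a1 <= 0 or a0 <= 0:   # reject unstable reduced models
        return np.inf
    return float(np.sum(np.abs(target - freq_resp([b0], [1.0, a1, a0], w)) ** 2))

# Hybrid search: GA-style elitist recombination with SA-style acceptance
pop = rng.uniform(0.1, 10.0, size=(30, 3))
fitness = np.array([cost(p) for p in pop])
best_idx = int(np.argmin(fitness))
best, best_cost = pop[best_idx].copy(), fitness[best_idx]
initial_cost = best_cost
temp = 1.0
for gen in range(150):
    elite = pop[np.argsort(fitness)[:10]]
    children = elite[rng.integers(0, 10, 30)] + rng.normal(0.0, 0.2, (30, 3))
    for i, child in enumerate(children):
        c = cost(child)
        # accept improvements always; worse candidates with annealing probability
        if c < fitness[i] or rng.random() < np.exp(-(c - fitness[i] + 1e-9) / max(temp, 1e-9)):
            pop[i], fitness[i] = child, c
        if c < best_cost:
            best, best_cost = child.copy(), c
    temp *= 0.97
```

The elitist recombination supplies the GA exploration, while the temperature-controlled acceptance rule is the SA ingredient that helps the search escape local minima of the response-matching cost.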

Keywords: genetic algorithm, simulated annealing, model reduction, transfer function

Procedia PDF Downloads 143
17655 The Effect of Vertical Integration on Operational Performance: Evaluating Physician Employment in Hospitals

Authors: Gary Young, David Zepeda, Gilbert Nyaga

Abstract:

This study investigated whether vertical integration of hospitals and physicians is associated with better care for patients with cardiac conditions. A dramatic change in the U.S. hospital industry is the integration of hospitals and physicians through hospital acquisition of physician practices. Yet, there is little evidence regarding whether this form of vertical integration leads to better operational performance of hospitals. The study was conducted as an observational investigation based on a pooled, cross-sectional database. The study sample comprised hospitals in the State of California, and the time frame for the study was 2010 to 2012. The key performance measure was hospitals' degree of compliance with performance criteria set out by the federal government for managing patients with cardiac conditions. These criteria relate to the types of clinical tests and medications that hospitals should follow for cardiac patients, but hospital compliance requires the cooperation of a hospital's physicians. Data for this measure were obtained from a federal website that presents performance scores for U.S. hospitals. The key independent variable was the percentage of cardiologists that a hospital employs (versus cardiologists who are affiliated with but not employed by the hospital). Data for this measure were obtained from the State of California, which requires hospitals to report financial and operational data each year, including numbers of employed physicians. Other characteristics of hospitals (e.g., information technology for cardiac care, volume of cardiac patients) were also evaluated as possible complements or substitutes for physician employment by hospitals. Additional sources of data included the American Hospital Association and the U.S. Census. Empirical models were estimated with generalized estimating equations (GEE). Findings suggest that physician employment is positively associated with better hospital performance for cardiac care. However, findings also suggest that information technology is a substitute for physician employment.

Keywords: physician employment, hospitals, vertical integration, cardiac care

Procedia PDF Downloads 395
17654 Reliability and Availability Analysis of Satellite Data Reception System using Reliability Modeling

Authors: Ch. Sridevi, S. P. Shailender Kumar, B. Gurudayal, A. Chalapathi Rao, K. Koteswara Rao, P. Srinivasulu

Abstract:

System reliability and availability evaluation plays a crucial role in ensuring the seamless operation of complex satellite data reception systems with consistent performance over long periods. This paper presents a novel approach using a case study on one of the antenna systems at a satellite data reception ground station in India. The methodology involves analyzing the system's components, their failure rates and the system's architecture, generating a logical reliability block diagram model, and estimating the system's reliability from component-level mean time between failures, assuming an exponential distribution, to derive a baseline estimate of the system's reliability. The model is then validated against system-level field failure data collected from the operational satellite data reception systems, which include the failures that occurred, failure times, failure criticality and repair times, using statistical techniques such as median rank, regression and Weibull analysis to extract meaningful insights into failure patterns and the practical reliability of the system and to assess the accuracy of the developed reliability model. The study focused mainly on identifying critical units within the system, which are prone to failures and have a significant impact on overall performance, and brought out a reliability model of the identified critical unit. This model takes into account the interdependencies among system components and their impact on overall system reliability, provides valuable insights into the improvement or degradation of the system over time, and will be a vital input for arriving at an optimized design for future development. It also provides a plug-and-play framework for understanding the effect on system performance of any upgrades or new unit designs. It helps in effective planning and in formulating contingency plans to address potential system failures, ensuring continuity of operations. Furthermore, to instill confidence in system users, the duration for which the system can operate continuously with the desired 3-sigma level of reliability was estimated, which turned out to be a vital input to the maintenance plan. System availability and station availability were also assessed by considering clash and non-clash scenarios to determine overall system performance and potential bottlenecks. Overall, this paper establishes a comprehensive methodology for reliability and availability analysis of complex satellite data reception systems. The results facilitate effective contingency planning, provide users with confidence in system performance, and enable decision-makers to make informed choices about system maintenance, upgrades and replacements. The approach also aids in identifying critical units, assessing system availability in various scenarios, minimizing downtime and optimizing resource allocation.
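The baseline estimate described above, component MTBFs combined under an exponential assumption through a series reliability block diagram, is a one-line calculation per unit. A minimal sketch follows; the unit names and MTBF values are hypothetical placeholders, not the ground station's figures:

```python
import math

def reliability(mtbf_hours, t_hours):
    # Exponential model: constant failure rate lambda = 1 / MTBF
    return math.exp(-t_hours / mtbf_hours)

def series_system_reliability(mtbfs, t):
    # In a series (non-redundant) chain, every unit must survive to time t
    r = 1.0
    for mtbf in mtbfs:
        r *= reliability(mtbf, t)
    return r

def system_mtbf(mtbfs):
    # Failure rates add in series, so MTBFs combine harmonically
    return 1.0 / sum(1.0 / m for m in mtbfs)

# Hypothetical antenna-chain units: feed, LNA, down-converter, demodulator (hours)
units = [50000.0, 30000.0, 40000.0, 25000.0]
r_mission = series_system_reliability(units, 1000.0)
mtbf_sys = system_mtbf(units)
```

The field-data validation step then replaces the exponential assumption with Weibull fits per unit, whose shape parameter reveals whether failures are early-life, random or wear-out.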

Keywords: exponential distribution, reliability modeling, reliability block diagram, satellite data reception system, system availability, weibull analysis

Procedia PDF Downloads 84
17653 Towards an Enhanced Compartmental Model for Profiling Malware Dynamics

Authors: Jessemyn Modiini, Timothy Lynar, Elena Sitnikova

Abstract:

We present a novel enhanced compartmental model for malware spread analysis in cyber security. This paper applies cyber security data features to epidemiological compartmental models to model the infectious potential of malware. Compartmental models are among the most efficient tools for calculating the infectious potential of a disease. In this paper, we discuss and profile epidemiologically relevant data features from a Domain Name System (DNS) dataset, and then apply these network-traffic features to epidemiological compartmental models. This paper demonstrates how epidemiological principles can be applied to the novel analysis of key cybersecurity behaviours and trends, and provides insight into threat modelling beyond that of kill-chain analysis. In applying deterministic compartmental models to a cyber security use case, the authors analyse the deficiencies and provide an enhanced stochastic model for cyber epidemiology. This enhanced compartmental model (SUEICRN model) is contrasted with the traditional SEIR model to demonstrate its efficacy.
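The SEIR baseline against which the SUEICRN model is contrasted can be stated compactly as four coupled ODEs. A minimal deterministic sketch is below; the transmission, incubation and recovery rates are illustrative values, not parameters fitted to the DNS dataset:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative rates (per day): transmission, incubation (1/latent period),
# recovery (1/infectious period); population of N hosts
beta, sigma, gamma = 0.5, 1 / 3.0, 1 / 7.0
N = 10000.0

def seir(t, y):
    S, E, I, R = y
    dS = -beta * S * I / N          # susceptible hosts become exposed
    dE = beta * S * I / N - sigma * E
    dI = sigma * E - gamma * I      # exposed hosts become infectious
    dR = gamma * I                  # infectious hosts are remediated
    return [dS, dE, dI, dR]

sol = solve_ivp(seir, (0.0, 160.0), [N - 10.0, 0.0, 10.0, 0.0],
                rtol=1e-8, atol=1e-8, max_step=0.5)
S, E, I, R = sol.y
```

A stochastic enhancement of the kind the abstract describes would replace these deterministic flows with random transition events, and add compartments (such as the U, C and N states implied by the SUEICRN acronym) for behaviours that disease models lack.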

Keywords: cybersecurity, epidemiology, cyber epidemiology, malware

Procedia PDF Downloads 107
17652 Metachromatic Leukodystrophy: A Case Report

Authors: Mary Rose Eunice S. Gundayao, Manolo M. Fernandez

Abstract:

Metachromatic leukodystrophy (MLD) is a rare lysosomal storage disorder with an autosomal recessive inheritance pattern. Lysosomal storage disorders are often severe, follow a progressively neurodegenerative path, and may result in multi-organ failure, potentially leading to death within 5 to 6 years in cases of early-onset forms. There are limited data regarding cases of MLD in Filipino children. This is the case of a 2-year-old Filipino girl who presented with progressive neurological deterioration and was diagnosed with metachromatic leukodystrophy by molecular genetic testing. This case report aims to present this patient’s clinical history, neurological findings, diagnosis and novel genetic mutations causing MLD. A concise review of updated literature on MLD will be discussed.

Keywords: metachromatic leukodystrophy, ARSA gene, peripheral neuropathy, case report, demyelinating disease

Procedia PDF Downloads 19
17651 Hidden Oscillations in the Mathematical Model of the Optical Binary Phase Shift Keying (BPSK) Costas Loop

Authors: N. V. Kuznetsov, O. A. Kuznetsova, G. A. Leonov, M. V. Yuldashev, R. V. Yuldashev

Abstract:

Nonlinear analysis of the phase locked loop (PLL)-based circuits is a challenging task. Thus, the simulation is widely used for their study. In this work, we consider a mathematical model of the optical Costas loop and demonstrate the limitations of simulation approach related to the existence of so-called hidden oscillations in the phase space of the model.

Keywords: optical Costas loop, mathematical model, simulation, hidden oscillation

Procedia PDF Downloads 440
17650 Efficacy of Ivermectin Against Sarcoptes Scabiei Var. Cameli in Libya

Authors: Ahmed Rashed

Abstract:

Sarcoptic mange is generally recognized as one of the most serious diseases of camels in Libya. It is an extremely pruritic and contagious skin condition caused by Sarcoptes scabiei var. cameli. Thirteen camels (Camelus dromedarius), showing progressive infection with S. scabiei mites in skin scrapings, were chosen randomly from different affected herds at the AL-Assa camel project. Ten camels were treated with ivermectin (22,23-dihydroavermectin B1, Ivomec, Merck) at a dose rate of 0.2 mg/kg body weight. Scratching and rubbing had completely disappeared in the treated camels one week after the second injection. Two weeks after the second injection, motile mites were found on only one camel, and three weeks after the second injection, no motile mites were detected. Motile mites were observed in the three untreated camels up to the end of the trial.

Keywords: ivermectin, Sarcoptes scabiei, camels, scrapings

Procedia PDF Downloads 508
17649 Reference Model for the Implementation of an E-Commerce Solution in Peruvian SMEs in the Retail Sector

Authors: Julio Kauss, Miguel Cadillo, David Mauricio

Abstract:

E-commerce is a business model that allows companies to optimize the processes of buying, selling, transferring goods and exchanging services through computer networks or the Internet. In Peru, electronic commerce is used infrequently, due in part to the fact that there is no model to guide companies in implementing an e-commerce solution, which means that most SMEs do not have adequate knowledge to adapt to electronic commerce. In this work, a reference model is proposed for the implementation of an e-commerce solution in Peruvian SMEs in the retail sector. It consists of five phases: Business Analysis, Business Modeling, Implementation, Post Implementation and Results. The model was validated in an SME of the Peruvian retail sector through the implementation of an electronic commerce platform, through which the company increased its sales through the delivery channel by 10% in the first month of deployment. This result showed that the model is easy to implement, economical and agile. In addition, it allowed the company to increase its business offer, adapt to e-commerce and improve customer loyalty.

Keywords: e-commerce, retail, SMEs, reference model

Procedia PDF Downloads 320
17648 Experimental Modal Analysis of Kursuncular Minaret

Authors: Yunus Dere

Abstract:

Minarets are tower-like structures from which the Muslim call to prayer is made. They hold a symbolic and sacred place among Muslims. Being tall and slender, they are prone to damage under earthquakes and strong winds. The Kursuncular stone minaret was built around thirty years ago in Konya, Turkey. Its core and helical stairs are made of reinforced concrete. Its stone spire was damaged during a light earthquake and was later replaced with a light material covered with lead sheets. In this study, the natural frequencies and mode shapes of the Kursuncular minaret are obtained experimentally and analytically. First, an ambient vibration test is carried out using a data acquisition system with accelerometers located at four locations along the height of the minaret. The collected vibration data is evaluated by operational modal analysis techniques. For the analytical part of the study, the dimensions of the minaret are accurately measured and a detailed 3D solid finite element model of the minaret is generated. The moduli of elasticity of the stone and concrete are approximated using the compressive strengths obtained by Windsor Pin tests. Finite element modal analysis of the minaret is carried out to obtain the modal parameters. Experimental and analytical results are then compared and found to be in good agreement.
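The analytical side of such a comparison reduces to an eigenvalue problem on the structure's mass and stiffness matrices. A deliberately crude sketch is below: the minaret is idealized as a fixed-base chain of lumped masses on shear springs, with placeholder values for m and k (not Kursuncular's 3D solid model, which would be far more detailed):

```python
import numpy as np

# Crude shear-chain idealization: n stacked segments, mass m each,
# inter-segment spring stiffness k (placeholder values)
n, m, k = 8, 5000.0, 2.0e7   # segments, kg, N/m

K = np.zeros((n, n))
for i in range(n):
    K[i, i] += k             # spring below mass i (mass 0 connects to the base)
for i in range(n - 1):
    K[i, i] += k
    K[i, i + 1] = K[i + 1, i] = -k

# With M = m * I the generalized eigenproblem K v = w^2 M v simplifies
omega2 = np.linalg.eigvalsh(K) / m
freqs_hz = np.sqrt(omega2) / (2.0 * np.pi)   # ascending natural frequencies
```

Matching these analytical frequencies to the peaks identified by operational modal analysis of the ambient vibration records is what allows the assumed moduli of elasticity to be confirmed or updated.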

Keywords: experimental modal analysis, stone minaret, finite element modal analysis, minarets

Procedia PDF Downloads 327
17647 Kinetic Façade Design Using 3D Scanning to Convert Physical Models into Digital Models

Authors: Do-Jin Jang, Sung-Ah Kim

Abstract:

In designing a kinetic façade, it is hard for the designer to make digital models because of the façade's complex geometry in motion. This paper presents a methodology for converting a point cloud of a physical model into a single digital model with a certain topology and motion. The method uses a Microsoft Kinect sensor, and color markers were defined and applied to three paper-folding-inspired designs. Although the resulting digital model cannot represent the whole folding range of the physical model, the method supports the designer in conducting a performance-oriented design process with the rough physical model over the reduced folding range.

Keywords: design media, kinetic facades, tangible user interface, 3D scanning

Procedia PDF Downloads 413
17646 A Large Language Model-Driven Method for Automated Building Energy Model Generation

Authors: Yake Zhang, Peng Xu

Abstract:

The development of building energy models (BEM) required for architectural design and analysis is a time-consuming and complex process, demanding a deep understanding and proficient use of simulation software. To streamline the generation of complex building energy models, this study proposes an automated method for generating building energy models using a large language model and the BEM library aimed at improving the efficiency of model generation. This method leverages a large language model to parse user-specified requirements for target building models, extracting key features such as building location, window-to-wall ratio, and thermal performance of the building envelope. The BEM library is utilized to retrieve energy models that match the target building’s characteristics, serving as reference information for the large language model to enhance the accuracy and relevance of the generated model, allowing for the creation of a building energy model that adapts to the user’s modeling requirements. This study enables the automatic creation of building energy models based on natural language inputs, reducing the professional expertise required for model development while significantly decreasing the time and complexity of manual configuration. In summary, this study provides an efficient and intelligent solution for building energy analysis and simulation, demonstrating the potential of a large language model in the field of building simulation and performance modeling.
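The retrieval step described above, matching the features the language model extracts against a BEM library, can be sketched without any LLM at all. In the sketch below the parsing stage is stubbed out as an already-extracted feature dictionary, and every name, library entry and numeric value is hypothetical:

```python
# Stub for the LLM parsing step: features already extracted from the user's
# natural-language request (all names and values are hypothetical)
request = {"window_to_wall_ratio": 0.42, "u_value_wall": 0.50, "latitude": 31.2}

# Hypothetical BEM library entries with the same feature schema
bem_library = [
    {"id": "office_shanghai", "window_to_wall_ratio": 0.40, "u_value_wall": 0.45, "latitude": 31.2},
    {"id": "office_harbin",   "window_to_wall_ratio": 0.30, "u_value_wall": 0.30, "latitude": 45.8},
    {"id": "hotel_guangzhou", "window_to_wall_ratio": 0.60, "u_value_wall": 0.80, "latitude": 23.1},
]

def distance(a, b, keys, scales):
    # Scaled Euclidean distance so that latitude does not dominate the ratios
    return sum(((a[k] - b[k]) / scales[k]) ** 2 for k in keys) ** 0.5

keys = ["window_to_wall_ratio", "u_value_wall", "latitude"]
scales = {"window_to_wall_ratio": 1.0, "u_value_wall": 1.0, "latitude": 90.0}

best_match = min(bem_library, key=lambda e: distance(request, e, keys, scales))
```

In the full method, the retrieved model then serves as reference context for the language model, which edits it toward the user's remaining requirements rather than generating an energy model from scratch.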

Keywords: artificial intelligence, building energy modelling, building simulation, large language model

Procedia PDF Downloads 26
17645 An Improved Model of Estimation Global Solar Irradiation from in situ Data: Case of Oran Algeria Region

Authors: Houcine Naim, Abdelatif Hassini, Noureddine Benabadji, Alex Van Den Bossche

Abstract:

In this paper, two models for estimating the monthly average daily global radiation on a horizontal surface were applied to the site of Oran (35.38° N, 0.37° W). The first is a regression equation of the Angstrom type; the second was developed by the present authors, with some modifications suggested, using as input parameters astronomical parameters (latitude, longitude and altitude) and meteorological parameters (relative humidity). The comparisons are made using the mean bias error (MBE), root mean square error (RMSE), mean percentage error (MPE) and mean absolute bias error (MABE). This comparison shows that the second model is closer to the experimental values than the Angstrom model.
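The Angstrom-type baseline is the classical Angstrom-Prescott regression H/H0 = a + b(S/S0), fitted by least squares and scored with the error metrics named above. A minimal sketch on synthetic monthly data (the sunshine ratios and coefficients below are illustrative, not the Oran measurements) is:

```python
import numpy as np

# Synthetic monthly sunshine-duration ratios S/S0 standing in for measurements
s_ratio = np.array([0.55, 0.60, 0.65, 0.70, 0.74, 0.80,
                    0.85, 0.83, 0.72, 0.66, 0.58, 0.52])
a_true, b_true = 0.25, 0.50   # illustrative Angstrom-Prescott coefficients
h_ratio = a_true + b_true * s_ratio + np.random.default_rng(2).normal(0.0, 0.01, 12)

# Least-squares fit of H/H0 = a + b * (S/S0)
A = np.column_stack([np.ones_like(s_ratio), s_ratio])
(a_hat, b_hat), *_ = np.linalg.lstsq(A, h_ratio, rcond=None)
pred = a_hat + b_hat * s_ratio

# Error metrics of the kind used in the paper's comparison
mbe = np.mean(pred - h_ratio)
rmse = np.sqrt(np.mean((pred - h_ratio) ** 2))
mpe = np.mean((pred - h_ratio) / h_ratio) * 100.0
```

The authors' second model extends this scheme by adding astronomical and humidity terms to the regressors, and the same MBE/RMSE/MPE scoring decides between the two.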

Keywords: meteorology, global radiation, Angstrom model, Oran

Procedia PDF Downloads 233
17644 Expert Review on Conceptual Design Model of Assistive Courseware for Low Vision (AC4LV) Learners

Authors: Nurulnadwan Aziz, Ariffin Abdul Mutalib, Siti Mahfuzah Sarif

Abstract:

This paper reports on an ongoing project regarding the development of a Conceptual Design Model of Assistive Courseware for Low Vision (AC4LV) learners. The developed model has to be validated before it can serve as guidance for developers building an AC4LV. This study requires two phases of validation: expert review and the prototyping method. This paper presents the part of the validation process concerning findings from the expert review of the Conceptual Design Model of AC4LV, which was carried out through a questionnaire. Results from 12 international and local experts from various respected fields of Human-Computer Interaction (HCI) were discussed and justified. In a nutshell, a reviewed Conceptual Design Model of AC4LV was formed. Future work will validate the reviewed model through the prototyping method prior to testing it with the targeted users.

Keywords: assistive courseware, conceptual design model, expert review, low vision learners

Procedia PDF Downloads 546
17643 Electrochemical Study of Copper–Tin Alloy Nucleation Mechanisms onto Different Substrates

Authors: Meriem Hamla, Mohamed Benaicha, Sabrine Derbal

Abstract:

In the present work, several materials such as M/glass (M = Pt, Mo) were investigated to test their suitability for studying the early nucleation stages and growth of copper-tin clusters. It was found that most of these materials stand as good substrates for studying the nucleation and growth of electrodeposited Cu-Sn alloys from an aqueous solution containing CuCl2 and SnCl2 as electroactive species and Na3C6H5O7 as complexing agent. Among these substrates, Pt shows instantaneous nucleation followed by 3D diffusion-limited growth, whereas copper-tin thin films electrodeposited onto the Mo substrate followed progressive nucleation. The deposition mechanism of the Cu-Sn films has been studied using stationary electrochemical techniques (cyclic voltammetry (CV) and chronoamperometry (CA)). The structural, morphological and compositional characterization was carried out using X-ray diffraction (XRD), scanning electron microscopy (SEM) and EDAX techniques, respectively.
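The instantaneous-versus-progressive classification from chronoamperometry is conventionally made by plotting the measured (I/Im)² against t/tm and comparing it with the dimensionless Scharifker-Hills transients. Assuming that standard analysis is what underlies the Pt/Mo distinction here, the two reference curves are:

```python
import numpy as np

def instantaneous(tau):
    # Scharifker-Hills dimensionless transient (I/Im)^2 for instantaneous
    # 3D diffusion-limited nucleation; tau = t / t_max
    return 1.9542 / tau * (1.0 - np.exp(-1.2564 * tau)) ** 2

def progressive(tau):
    # Scharifker-Hills dimensionless transient for progressive 3D nucleation
    return 1.2254 / tau * (1.0 - np.exp(-2.3367 * tau ** 2)) ** 2

tau = np.linspace(0.05, 3.0, 300)
i2_inst = instantaneous(tau)   # reference curve matched by the Pt data
i2_prog = progressive(tau)     # reference curve matched by the Mo data
```

Both curves peak at (1, 1) by construction; the experimental transient is normalized by its current maximum Im and time tm, overlaid, and assigned to whichever theoretical curve it tracks.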

Keywords: electrodeposition, CuSn, nucleation, mechanism

Procedia PDF Downloads 398
17642 Application of Response Surface Methodology in Optimizing Chitosan-Argan Nutshell Beads for Radioactive Wastewater Treatment

Authors: F. F. Zahra, E. G. Touria, Y. Samia, M. Ahmed, H. Hasna, B. M. Latifa

Abstract:

The presence of radioactive contaminants in wastewater poses a significant environmental and health risk, necessitating effective treatment solutions. This study investigates the optimization of chitosan-Argan nutshell beads for the removal of radioactive elements from wastewater, utilizing Response Surface Methodology (RSM) to enhance the treatment efficiency. Chitosan, known for its biocompatibility and adsorption properties, was combined with Argan nutshell powder to form composite beads. These beads were then evaluated for their capacity to remove radioactive contaminants from synthetic wastewater. The Box-Behnken design (BBD) under RSM was employed to analyze the influence of key operational parameters, including initial contaminant concentration, pH, bead dosage, and contact time, on the removal efficiency. Experimental results indicated that all tested parameters significantly affected the removal efficiency, with initial contaminant concentration and pH showing the most substantial impact. The optimized conditions, as determined by RSM, were found to be an initial contaminant concentration of 50 mg/L, a pH of 6, a bead dosage of 0.5 g/L, and a contact time of 120 minutes. Under these conditions, the removal efficiency reached up to 95%, demonstrating the potential of chitosan-Argan nutshell beads as a viable solution for radioactive wastewater treatment. Furthermore, the adsorption process was characterized by fitting the experimental data to various isotherm and kinetic models. The adsorption isotherms conformed well to the Langmuir model, indicating monolayer adsorption, while the kinetic data were best described by the pseudo-second-order model, suggesting chemisorption as the primary mechanism. This study highlights the efficacy of chitosan-Argan nutshell beads in removing radioactive contaminants from wastewater and underscores the importance of optimizing treatment parameters using RSM. 
The findings provide a foundation for developing cost-effective and environmentally friendly treatment technologies for radioactive wastewater.
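The pseudo-second-order fit reported above is usually carried out on the linearized form t/qt = 1/(k2·qe²) + t/qe. A minimal sketch with illustrative (not the study's) parameter values shows that the linearization recovers qe and k2 exactly from noiseless data:

```python
import numpy as np

# Pseudo-second-order kinetics: qt = k2*qe^2*t / (1 + k2*qe*t)
qe_true, k2_true = 12.0, 0.01   # mg/g and g/(mg*min), illustrative values
t = np.array([5, 10, 20, 30, 45, 60, 90, 120], dtype=float)
qt = k2_true * qe_true ** 2 * t / (1.0 + k2_true * qe_true * t)

# Linearized form t/qt = 1/(k2*qe^2) + t/qe, fitted by least squares
A = np.column_stack([np.ones_like(t), t])
(intercept, slope), *_ = np.linalg.lstsq(A, t / qt, rcond=None)

qe_fit = 1.0 / slope
k2_fit = slope ** 2 / intercept   # since intercept = 1/(k2*qe^2)
```

With real uptake data the same regression yields the chemisorption rate constant, and the quality of the linear fit (versus a pseudo-first-order plot) is the evidence cited for chemisorption as the dominant mechanism.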

Keywords: adsorption, argan nutshell, beads, chitosan, mechanism, optimization, radioactive wastewater, response surface methodology

Procedia PDF Downloads 32
17641 Application of Model Tree in the Prediction of TBM Rate of Penetration with Synthetic Minority Oversampling Technique

Authors: Ehsan Mehryaar

Abstract:

The rate of penetration (RoP) is one of the vital factors in the cost and time of tunnel boring projects; therefore, predicting it can lead to a substantial increase in the efficiency of the project. RoP is heavily dependent on the geological properties of the project site and on TBM properties. In this study, 151 data points from the Queens water tunnel are collected, including unconfined compressive strength, peak slope index, angle with weak planes, and distance between planes of weakness. Since the dataset is small, it was observed to be imbalanced; to solve that problem, the synthetic minority oversampling technique is utilized. A model based on the model tree is proposed, in which each leaf consists of a support vector machine model. The proposed model's performance is then compared to existing empirical equations in the literature.
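The oversampling step works by interpolating between a minority sample and one of its nearest minority neighbours. A self-contained sketch of that core idea is below (in practice a library implementation such as imbalanced-learn's SMOTE would be used; the feature rows here are hypothetical, merely echoing the four input variables named above):

```python
import numpy as np

def smote(X_minority, n_synthetic, k=3, rng=None):
    # Synthetic Minority Oversampling: each synthetic point lies on the segment
    # between a random minority sample and one of its k nearest minority neighbours
    if rng is None:
        rng = np.random.default_rng(0)
    synth = []
    for _ in range(n_synthetic):
        i = rng.integers(len(X_minority))
        d = np.linalg.norm(X_minority - X_minority[i], axis=1)
        neighbours = np.argsort(d)[1:k + 1]      # skip the point itself (d = 0)
        j = rng.choice(neighbours)
        lam = rng.random()                       # interpolation fraction in [0, 1)
        synth.append(X_minority[i] + lam * (X_minority[j] - X_minority[i]))
    return np.array(synth)

# Hypothetical minority-class rows: [UCS, peak slope index, angle, plane spacing]
X_min = np.array([[120.0, 4.0, 30.0, 0.50],
                  [130.0, 4.5, 35.0, 0.60],
                  [115.0, 3.8, 28.0, 0.40],
                  [125.0, 4.2, 33.0, 0.55]])
X_new = smote(X_min, n_synthetic=8)
```

The balanced set (original plus synthetic rows) is what the model tree with SVM leaves is then trained on, so that sparse RoP regimes are not ignored during the tree's splitting.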

Keywords: model tree, SMOTE, rate of penetration, TBM (tunnel boring machine), SVM

Procedia PDF Downloads 174
17640 Optimizing Production Yield Through Process Parameter Tuning Using Deep Learning Models: A Case Study in Precision Manufacturing

Authors: Tolulope Aremu

Abstract:

This paper applies deep learning to optimize production yield by tuning key process parameters in a manufacturing environment. The study shows how to maximize production yield and minimize operational costs using advanced neural network models, specifically Long Short-Term Memory (LSTM) networks and Convolutional Neural Networks (CNNs), implemented with the Python-based frameworks TensorFlow and Keras. The research targets precision molding processes in which temperature ranges between 150°C and 220°C, pressure between 5 and 15 bar, and material flow rate between 10 and 50 kg/h — critical parameters with a strong effect on yield. A dataset of 1 million production cycles collected over five continuous years was used, with detailed logs of exact parameter settings and yield outputs. The LSTM model captures time-dependent trends in the production data, while the CNN analyzes spatial correlations between parameters. Both models were trained in a supervised manner with a mean squared error (MSE) loss function optimized by the Adam optimizer. After 100 training epochs, the models achieved 95% accuracy in recommending optimal parameter configurations. Compared with the traditional response surface methodology (RSM) and design of experiments (DOE) methods, production yield increased by 12%, and the error margin was reduced by 8%, giving consistently high product quality. Implementing the optimized process parameters saved around $2.5 million annually in material waste, energy consumption, and equipment wear. The system was deployed in an industrial production environment on a hybrid cloud: Microsoft Azure for data storage, and Google Cloud AI for model training and deployment. Real-time process monitoring and automatic parameter tuning rely on this cloud infrastructure. In summary, deep learning models, especially LSTMs and CNNs, optimize production yield by fine-tuning process parameters. Future research will consider reinforcement learning to further enhance system autonomy and scalability across manufacturing sectors.
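The abstract describes feeding five years of cycle logs to an LSTM but publishes no code. Below is a minimal, hypothetical sketch of the data-windowing step such a pipeline would need, in plain NumPy, using synthetic values drawn from the stated parameter ranges; the model itself (Keras LSTM, MSE loss, Adam optimizer) is not reproduced here.

```python
import numpy as np

# Synthetic log of production cycles: one row per cycle with the three
# process parameters named in the abstract (ranges from the abstract).
rng = np.random.default_rng(0)
n_cycles = 1000
temperature = rng.uniform(150.0, 220.0, n_cycles)   # deg C
pressure = rng.uniform(5.0, 15.0, n_cycles)         # bar
flow_rate = rng.uniform(10.0, 50.0, n_cycles)       # kg/h
log = np.column_stack([temperature, pressure, flow_rate])

def make_windows(data, window):
    """Slice a (cycles, features) log into overlapping sequences shaped
    (samples, timesteps, features) for sequence models."""
    return np.stack([data[i:i + window] for i in range(len(data) - window)])

X = make_windows(log, window=20)
print(X.shape)  # (980, 20, 3)
```

The resulting (samples, timesteps, features) array is the input layout a Keras LSTM layer expects.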

Keywords: production yield optimization, deep learning, tuning of process parameters, LSTM, CNN, precision manufacturing, TensorFlow, Keras, cloud infrastructure, cost saving

Procedia PDF Downloads 31
17639 An Agent-Based Modeling and Simulation of Human Muscle

Authors: Sina Saadati, Mohammadreza Razzazi

Abstract:

In this article, we present an agent-based model of human muscle. A suitable muscle model is necessary for the analysis of human movement. It can be used by clinical researchers who study the influence of movement disorders such as Parkinson's disease, and it is also useful in the development of prostheses that receive electromyography signals and generate force in response. Since we focused on computational efficiency in this research, the model performs its calculations very fast; for battery-powered prostheses, this makes it a power-efficient method. We first describe the agent-based model and then use it to simulate the human gait cycle. The method can also be applied in reverse to analyze gait in movement disorders.
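The abstract does not specify what the agents represent. A purely hypothetical sketch, assuming each agent is a motor unit recruited by a simple threshold (size-principle) rule, illustrates how per-agent contributions could be summed into whole-muscle force:

```python
import random

class MotorUnitAgent:
    """One agent: a motor unit contributing a twitch force when recruited."""
    def __init__(self, max_force, threshold):
        self.max_force = max_force      # peak force of this unit (N)
        self.threshold = threshold      # excitation level needed to recruit it

    def act(self, excitation):
        # Size principle: the unit fires only once excitation reaches its threshold.
        return self.max_force if excitation >= self.threshold else 0.0

class MuscleModel:
    """Whole-muscle force as the sum of its agents' contributions."""
    def __init__(self, units):
        self.units = units

    def force(self, excitation):
        return sum(u.act(excitation) for u in self.units)

random.seed(1)
units = [MotorUnitAgent(max_force=random.uniform(1, 5),
                        threshold=i / 100) for i in range(100)]
muscle = MuscleModel(units)
print(round(muscle.force(0.5), 2))  # force at 50% excitation
```

Because each agent's update is a constant-time rule, total cost scales linearly with the number of agents, which is one way such a model stays computationally cheap.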

Keywords: agent-based modeling and simulation, human muscle, gait cycle, movement disorders

Procedia PDF Downloads 114
17638 Seismic Impact and Design on Buried Pipelines

Authors: T. Schmitt, J. Rosin, C. Butenweg

Abstract:

Seismic design of buried pipeline systems for energy and water supply is not only important for plant and operational safety, but in particular for maintaining the supply infrastructure after an earthquake. Past earthquakes have shown the vulnerability of pipeline systems: after the Kobe earthquake in Japan in 1995, for instance, the water supply in some regions was interrupted for almost two months. The present paper discusses particular issues of seismic wave impacts on buried pipelines, describes calculation methods, proposes design approaches and gives calculation examples. Buried pipelines are exposed to several seismic effects. This paper considers the effects of transient displacement differences and the resulting stresses within the pipeline due to earthquake wave propagation. Other effects are permanent displacements due to fault rupture at the surface, soil liquefaction, landslides and seismic soil compaction; the presented model can also be used to calculate fault-rupture-induced displacements. Based on a three-dimensional finite element model, parameter studies are performed to show the influence of parameters such as the incoming wave angle, wave velocity, soil depth and selected displacement time histories. In the computer model, the interaction between the pipeline and the surrounding soil is modeled with non-linear soil springs. A propagating wave is simulated acting on the pipeline point by point, independently in time and space. The resulting stresses are mainly caused by displacement differences between neighboring pipeline segments and by soil-structure interaction. The calculation examples focus on pipeline bends as the most critical parts. Special attention is given to long-distance heat pipeline systems, in which expansion bends are arranged at regular intervals to accommodate thermal movements of the pipeline. Such expansion bends are usually designed with small bending radii, which in the event of an earthquake lead to high bending stresses in the pipeline cross-section. Therefore, Kármán flexibility factors, as well as stress intensification factors for curved pipe sections, must be taken into account. The seismic verification of the pipeline for wave propagation in the soil can be achieved by checking normative strain criteria. Finally, an interpretation of the results and recommendations are given, taking into account the most critical parameters.
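The paper's own method is a 3-D finite element model with non-linear soil springs, which cannot be reproduced here. For orientation, the classical closed-form estimates for wave-propagation effects on straight buried pipes (Newmark's simplification, where the pipe is assumed to follow the ground motion) can be sketched as follows; the input values are illustrative assumptions, not taken from the paper:

```python
def axial_strain(pgv, c):
    """Upper-bound axial strain in a straight buried pipe from a
    travelling compression wave (Newmark's simplification):
    strain = peak ground velocity / apparent wave propagation velocity."""
    return pgv / c

def bending_curvature(pga, c):
    """Curvature induced by a travelling wave:
    curvature = peak ground acceleration / c^2."""
    return pga / c ** 2

# Example values (assumed): PGV 0.5 m/s, apparent wave
# velocity 2000 m/s, PGA 3 m/s^2.
eps = axial_strain(0.5, 2000.0)
kappa = bending_curvature(3.0, 2000.0)
print(f"axial strain: {eps:.2e}")      # 2.50e-04
print(f"curvature: {kappa:.2e} 1/m")   # 7.50e-07 1/m
```

Strains of this kind are what the normative strain criteria mentioned above are checked against; the paper's FE model refines them with soil springs and bend flexibility factors.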

Keywords: buried pipeline, earthquake, seismic impact, transient displacement

Procedia PDF Downloads 187
17637 Towards Computational Fluid Dynamics Based Methodology to Accelerate Bioprocess Scale Up and Scale Down

Authors: Vishal Kumar Singh

Abstract:

Bioprocess development is a time-constrained activity aimed at harnessing the full potential of culture performance in an environment that is not natural to the cells. Even with chemically defined media and feeds, a significant amount of time is devoted to identifying the right operating parameters. In addition, the scale-up of these processes is often accompanied by a loss of antibody titer and product quality, which further delays commercialization of the drug product. In such a scenario, the disparity in culture performance is investigated by further experimentation at a smaller scale that is representative of the at-scale production bioreactors; developing these scale-down models is also time-intensive. In this study, a computational fluid dynamics (CFD)-based multi-objective scaling approach is illustrated to speed up process transfer. To implement this approach, a transient multiphase water-air system was studied in Ansys CFX to visualize the air bubble distribution and volumetric mass transfer coefficient (kLa) profiles, followed by a design-of-experiments-based parametric optimization to define the operational space. The proposed approach is entirely in silico and requires minimal experimentation, thereby giving high throughput to the overall process development.
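The CFD results themselves are not published in the abstract. As a stand-in for the parametric step, a design-of-experiments-style sweep can be sketched with the empirical van't Riet correlation for kLa in coalescing air-water systems; the coefficients and operating ranges below are literature-style assumptions, not the paper's CFD values:

```python
import itertools

def kla_vant_riet(power_per_volume, superficial_gas_velocity,
                  a=0.026, alpha=0.4, beta=0.5):
    """Empirical van't Riet correlation for kLa (1/s) in coalescing
    air-water systems: kLa = a * (P/V)^alpha * v_s^beta.
    Coefficients here are commonly quoted literature values."""
    return a * power_per_volume ** alpha * superficial_gas_velocity ** beta

# Full-factorial sweep over assumed operating ranges (illustrative only).
pv_levels = [500, 1000, 2000, 4000]   # specific power input, W/m^3
vs_levels = [0.005, 0.01, 0.02]       # superficial gas velocity, m/s

grid = [(pv, vs, kla_vant_riet(pv, vs))
        for pv, vs in itertools.product(pv_levels, vs_levels)]
best = max(grid, key=lambda row: row[2])
print(f"highest kLa {best[2]:.3f} 1/s at P/V={best[0]} W/m^3, vs={best[1]} m/s")
```

In the paper's workflow, the kLa values at each factorial point would come from the CFX simulations rather than a correlation; the sweep-and-rank structure is the same.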

Keywords: bioprocess development, scale up, scale down, computational fluid dynamics, multi-objective, Ansys CFX, design of experiments

Procedia PDF Downloads 82
17636 Indigenous Understandings of Climate Vulnerability in Chile: A Qualitative Approach

Authors: Rosario Carmona

Abstract:

This article discusses the importance of indigenous peoples' participation in climate change mitigation and adaptation. Specifically, it analyses different understandings of climate vulnerability among the diverse actors involved in climate change policies in Chile: indigenous people, state officials, and academics. The data were collected through participant observation and interviews conducted between October 2017 and January 2019 in Chile. Following Karen O'Brien, there are two types of vulnerability: outcome vulnerability and contextual vulnerability. How vulnerability to climate change is understood determines the approach taken, which actors are involved, and which knowledge is considered in addressing it. Because climate change is a very complex phenomenon, it is necessary to transform institutions and their responses, and to do so it is fundamental to consider both perspectives and different types of knowledge, particularly those of the most vulnerable, such as indigenous people. Over centuries of close coexistence with their environment, indigenous societies have developed coping strategies, and some are already adapting to climate change; the indigenous peoples of Chile are no exception. Yet indigenous people tend to be excluded from decision-making processes, and indigenous knowledge is frequently seen as subjective and arbitrary compared with science. Nevertheless, in recent years indigenous knowledge has gained particular relevance in the academic world, and indigenous actors are gaining prominence in international negotiations. Some mechanisms promote their participation (e.g., the Cancun safeguards, World Bank operational policies, REDD+), though not without difficulties, and since 2016 the parties have been working on a Local Communities and Indigenous Peoples Platform. This paper also explores how this process has unfolded in Chile. Although there is progress in the participation of indigenous people, it responds to the operational policies of the funding agencies rather than to a real commitment of the state to this sector. The State of Chile omits any review of the structures that promote inequality and the exclusion of indigenous people. In this way, climate change policies could become a new mechanism of coloniality that validates a single type of knowledge and leads to new strategies of territorial control, increasing vulnerability.

Keywords: indigenous knowledge, climate change, vulnerability, Chile

Procedia PDF Downloads 126
17635 Multinomial Dirichlet Gaussian Process Model for Classification of Multidimensional Data

Authors: Wanhyun Cho, Soonja Kang, Sanggoon Kim, Soonyoung Park

Abstract:

We present a probabilistic multinomial Dirichlet classification model for multidimensional data with Gaussian process priors. We consider an efficient computational method for obtaining the approximate posteriors over the latent variables and parameters needed to define the multiclass Gaussian process classification model. We first investigate how to induce a posterior distribution over the various parameters and the latent function using variational Bayesian approximations and importance sampling, and then derive the predictive distribution of the latent function needed to classify new samples. The proposed model is applied to classifying a synthetic multivariate dataset in order to verify its performance. Experimental results show that our model is more accurate than the other approximation methods.
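The variational-plus-importance-sampling machinery is involved; as a minimal stand-in for the final prediction step, one can draw latent function values from a (hypothetical) Gaussian approximate posterior and push each draw through the softmax link, averaging to get predictive class probabilities:

```python
import numpy as np

def predictive_class_probs(post_mean, post_cov, n_samples=5000, seed=0):
    """Monte Carlo approximation of the predictive class distribution
    at one test point: sample latent values f ~ N(post_mean, post_cov),
    apply the softmax link, and average over samples. A simple stand-in
    for the paper's importance-sampling step."""
    rng = np.random.default_rng(seed)
    f = rng.multivariate_normal(post_mean, post_cov, size=n_samples)
    e = np.exp(f - f.max(axis=1, keepdims=True))   # numerically stable softmax
    probs = e / e.sum(axis=1, keepdims=True)
    return probs.mean(axis=0)

# Hypothetical approximate posterior over 3 latent class functions
mean = np.array([1.0, 0.0, -1.0])
cov = 0.25 * np.eye(3)
p = predictive_class_probs(mean, cov)
print(p.round(3))  # probabilities sum to 1; the first class dominates
```

Averaging the link function over posterior samples, rather than applying it to the posterior mean, is what makes the prediction properly Bayesian.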

Keywords: multinomial Dirichlet classification model, Gaussian process priors, variational Bayesian approximation, importance sampling, approximate posterior distribution, marginal likelihood evidence

Procedia PDF Downloads 444
17634 The Use of Haar Wavelet Mother Signal Tool for Performance Analysis Response of Distillation Column (Application to Moroccan Case Study)

Authors: Mahacine Amrani

Abstract:

This paper reviews some Moroccan industrial applications of wavelets, especially the dynamic identification of a process model using the Haar mother wavelet response. Two recent Moroccan case studies are described, using dynamic data generated by a distillation column and an industrial polyethylene process plant. The purpose of the wavelet scheme is to build on-line dynamic models. In both case studies, a comparison is carried out between the Haar mother wavelet response model and a linear difference equation model. Finally, on the basis of a comparison of the process performances and the best responses, it concludes which model may be useful for building an estimated on-line internal model and applying it in model-predictive controllers (MPC). All calculations were implemented using the AutoSignal software.
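A single level of the Haar discrete wavelet transform, the building block behind the mother-wavelet response used here, can be sketched in a few lines; the signal below is a toy stand-in, not the paper's distillation-column data:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: pairwise
    scaled sums give the approximation (low-frequency trend) and
    pairwise scaled differences give the detail (high-frequency part)."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

# Toy stand-in for a sampled process response (length must be even)
y = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])
a, d = haar_dwt(y)
print(a)  # low-frequency trend, half the original length
print(d)  # high-frequency detail
```

The transform is exactly invertible (the even and odd samples are recovered as (a + d)/√2 and (a - d)/√2), which is why the decomposition can be used for identification without losing information.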

Keywords: process performance, model, wavelets, Haar, Moroccan

Procedia PDF Downloads 317
17633 Health Hazards of Performance Enhancing Drugs

Authors: Austin Oduor Otieno

Abstract:

There is an ingrained belief that the use of performance-enhancing drugs enables athletes to perform better. While this has been found to be true, it also raises ethical and health issues. This paper analyzes the health hazards associated with performance-enhancing drugs through an analysis of academic journals and other publications on the relationship between doping in sports and health. It concludes that there are inherent health hazards associated with the use of performance-enhancing drugs, as they affect the physical and psychological health and wellbeing of the user (athlete).

Keywords: doping, health hazards, athletes, drugs

Procedia PDF Downloads 164