Search results for: process modeling advancements
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18601

18031 Electron Beam Melting Process Parameter Optimization Using Multi Objective Reinforcement Learning

Authors: Michael A. Sprayberry, Vincent C. Paquit

Abstract:

Process parameter optimization in metal powder bed electron beam melting (MPBEBM) is crucial to ensure the technology's repeatability, control, and continued industry adoption. Despite continued efforts to address the challenges via traditional design of experiments and process mapping techniques, a successful on-the-fly optimization framework that can be adapted to MPBEBM systems is still lacking. Additionally, data-intensive physics-based modeling and simulation methods are difficult to sustain for every metal AM alloy or system due to cost restrictions. To mitigate the challenge of resource-intensive experiments and models, this paper introduces a Multi-Objective Reinforcement Learning (MORL) methodology that frames MPBEBM parameter selection as an optimization problem. An off-policy MORL framework based on policy gradient is proposed to discover optimal sets of beam power (P) and beam velocity (v) combinations that maintain a steady-state melt pool depth and phase transformation. For this, an experimentally validated Eagar-Tsai melt pool model is used to simulate the MPBEBM environment, where the beam acts as the agent across the P-v space to maximize returns in the uncertain powder bed environment, producing a melt pool and phase transformation closer to the optimum. The training process yields a set of process parameters {power, speed, hatch spacing, layer depth, and preheat} in which the state (P, v) with the highest returns corresponds to a refined process parameter mapping. The resulting parts and the mapping of returns onto the P-v space show convergence with experimental observations. The framework therefore provides a model-free multi-objective approach to process discovery without the need for trial-and-error experiments.
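The policy-gradient loop described above can be sketched in miniature. The snippet below is a hedged illustration, not the authors' implementation: the Eagar-Tsai environment is replaced by a hypothetical one-line melt-pool-depth surrogate, the multi-objective return is scalarized into a single deviation-from-target reward, and a standard gradient-bandit update searches a small discrete P-v grid.

```python
import math
import random

# Hypothetical one-line stand-in for the Eagar-Tsai melt pool model:
# depth grows with beam power P (W) and shrinks with beam velocity v (mm/s).
def melt_pool_depth(P, v):
    return 0.05 * P / math.sqrt(v)

def reward(P, v, target_depth=0.5):
    # The multi-objective return is scalarized here as a single
    # negative deviation from the target steady-state depth (mm).
    return -abs(melt_pool_depth(P, v) - target_depth)

actions = [(P, v) for P in (60, 120, 180) for v in (500, 1000, 2000)]
prefs = [0.0] * len(actions)          # softmax policy parameters

def policy():
    mx = max(prefs)
    w = [math.exp(p - mx) for p in prefs]
    s = sum(w)
    return [wi / s for wi in w]

random.seed(0)
baseline, lr = 0.0, 0.1
for _ in range(3000):
    probs = policy()
    i = random.choices(range(len(actions)), weights=probs)[0]
    r = reward(*actions[i])
    baseline += 0.01 * (r - baseline)            # running reward baseline
    for j, pj in enumerate(probs):               # gradient-bandit update
        prefs[j] += lr * (r - baseline) * ((1.0 if j == i else 0.0) - pj)

best = actions[max(range(len(actions)), key=lambda k: prefs[k])]
print(best)
```

Under these toy assumptions the policy concentrates on the (P, v) pair whose simulated depth is closest to the target; the paper's framework performs the analogous search against a validated melt pool model and multiple objectives.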

Keywords: additive manufacturing, metal powder bed fusion, reinforcement learning, process parameter optimization

Procedia PDF Downloads 90
18030 Modeling the International Economic Relations Development: The Prospects for Regional and Global Economic Integration

Authors: M. G. Shilina

Abstract:

The phenomenon of interstate economic interaction is complex. 'Economic integration', as one of its types, can be explored through the prism of international law, theories of the world economy, politics, and international relations. The most objective study of the phenomenon requires a comprehensive, multifactorial approach. In new geopolitical realities, the problems of coexistence and possible interconnection of various mechanisms of interstate economic interaction are actively discussed. Currently, the states of the Eurasian continent are moving in the direction of economic integration. At the same time, the existing fragmentation of international economic law in Eurasia is seen as an important problem. The Eurasian space is characterized by various types of interstate relations: international agreements (multilateral and bilateral) and a large number of cooperation formats (from discussion platforms to organizations aimed at deep integration). For their harmonization, a clear vision of the options for phased regulation of international economic relations is necessary. In conditions of rapid development of international economic relations, modeling (including prognostic modeling) can be used as the main scientific method for presenting the phenomenon. On the basis of this method, it is possible to form a vision of the current situation and of the best options for further action. In order to determine the most objective version of integration development, a combination of several approaches was used. The normative legal approach, i.e., the descriptive method of legal modeling, was taken as the basis for the analysis. This set of legal methods was supplemented by the prognostic methods of international relations science.
The key elements of the model are the international economic organizations and associations of states existing in the Eurasian space (the Eurasian Economic Union (EAEU), the European Union (EU), the Shanghai Cooperation Organization (SCO), China's 'One Belt, One Road' initiative (OBOR), the Commonwealth of Independent States (CIS), BRICS, etc.). A general term, interstate interaction mechanisms (IIM), is proposed for the elements of the model. The aim of building a model of current and future Eurasian economic integration is to show optimal options for the joint economic development of the states and IIMs. The long-term goal of this development is a new economic and political space, the so-called 'Great Eurasian Community'. The process of achieving this long-term goal consists of successive steps. Modeling the integration architecture and dividing the interaction into stages led us to the following conclusion: the SCO is able to transform Eurasia into a single economic space. Gradual implementation of this complex, phased model, in which the SCO+ plays a key role, will allow building an effective economic integration for all its participants and creating an economically strong community. The model can have practical value for politicians, lawyers, economists, and other participants involved in the economic integration process. A clear, systematic structure can serve as a basis for further governmental action.

Keywords: economic integration, The Eurasian Economic Union, The European Union, The Shanghai Cooperation Organization, The Silk Road Economic Belt

Procedia PDF Downloads 150
18029 Towards Automatic Calibration of In-Line Machine Processes

Authors: David F. Nettleton, Elodie Bugnicourt, Christian Wasiak, Alejandro Rosales

Abstract:

In this presentation, preliminary results are given for the modeling and calibration of two different industrial winding MIMO (Multiple Input Multiple Output) processes using machine learning techniques. In contrast to previous approaches, which have typically used 'black-box' linear statistical methods together with a definition of the mechanical behavior of the process, we use non-linear machine learning algorithms together with a 'white-box' rule induction technique to create a supervised model of the fitting error between the expected and real force measures. The final objective is to build a precise model of the winding process in order to control the tension of the material being wound in the first case, and the friction of the material passing through the die in the second case. Case 1, tension control of a winding process: a plastic web is unwound from a first reel, goes over a traction reel, and is rewound onto a third reel. The objectives are (i) to train a model to predict the web tension and (ii) to calibrate, i.e., to find the input values which result in a given tension. Case 2, friction force control of a micro-pullwinding process: a core with resin passes through a first die, two winding units then wind an outer layer around the core, and the material makes a final pass through a second die. The objectives are (i) to train a model to predict the friction on die 2 and (ii) to calibrate, i.e., to find the input values which result in a given friction on die 2. Different machine learning approaches are tested to build the models: Kernel Ridge Regression, Support Vector Regression (with a Radial Basis Function kernel), and MPART (rule induction with a continuous value as output). As a preliminary step, the MPART rule induction algorithm was used to build an explicative model of the error (the difference between expected and real friction on die 2). Modeling the error behavior using explicative rules helps improve the overall process model.
Once the models are built, the inputs are calibrated by generating Gaussian random numbers for each input (taking into account its mean and standard deviation) and comparing the output to a target (desired) output until the closest fit is found. The results of empirical testing show that high precision is obtained for the trained models and for the calibration process. The learning step is the slowest part of the process (at most 5 minutes for this data), but it can be done offline just once. The calibration step is much faster, obtaining a precision error of less than 1×10⁻³ for both outputs in under one minute. To summarize, in the present work two processes have been modeled and calibrated. Fast processing times and high precision have been achieved, which can be further improved by using heuristics to guide the Gaussian calibration. Error behavior has been modeled to help improve the overall process understanding. This is relevant for the quick, optimal setup of many different industrial processes which use a pull-winding type process to manufacture fibre reinforced plastic parts. Acknowledgements to the Openmind project, which is funded by Horizon 2020 European Union funding for Research & Innovation, Grant Agreement number 680820.
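The Gaussian calibration step lends itself to a compact sketch. The following is a hedged illustration with an assumed two-input linear surrogate standing in for the trained tension model (the real models are the Kernel Ridge, SVR, and MPART regressors described above) and with invented input statistics; only the sample-and-compare loop reflects the described procedure.

```python
import random

# Assumed stand-in for the trained process model: predicts web tension
# from two inputs (names and coefficients are illustrative only).
def model(speed, torque):
    return 0.8 * speed + 2.5 * torque

# Input means and standard deviations as observed in training data (assumed).
SPEED_MU, SPEED_SD = 10.0, 2.0
TORQUE_MU, TORQUE_SD = 4.0, 1.0

def calibrate(target, trials=20000, seed=1):
    """Sample Gaussian random inputs around their observed statistics and
    keep the combination whose predicted output is closest to the target."""
    rng = random.Random(seed)
    best, best_err = None, float("inf")
    for _ in range(trials):
        s = rng.gauss(SPEED_MU, SPEED_SD)
        t = rng.gauss(TORQUE_MU, TORQUE_SD)
        err = abs(model(s, t) - target)
        if err < best_err:
            best, best_err = (s, t), err
    return best, best_err

inputs, err = calibrate(target=20.0)
print(inputs, err)
```

With enough samples the residual error shrinks toward zero; the abstract reports errors below 1×10⁻³ in under a minute on the real models.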

Keywords: data model, machine learning, industrial winding, calibration

Procedia PDF Downloads 241
18028 Sustainable Engineering: Synergy of BIM and Environmental Assessment Tools in Hong Kong Construction Industry

Authors: Kwok Tak Kit

Abstract:

The construction industry contributes significantly to environmental impact and carbon emissions, as it consumes a huge amount of natural resources and energy. Sustainable engineering involves the processes of planning, design, procurement, construction, and delivery, through which the whole building and construction process can be effectively and sustainably managed to optimize the use of natural resources. It includes implementing sustainable technology development and innovation, adopting advanced construction processes, and facilitating facilities management so that energy and waste can be controlled more accurately and effectively. However, research on the relationship between BIM and environmental assessment tools lacks a clear discussion. In this paper, we focus on the synergy of BIM technology and sustainable engineering in the AEC industry, outline the key factors which enhance the use of advanced innovation, technology, and methods, and define the role of stakeholders in achieving zero-carbon emissions toward the Paris Agreement goal of limiting global warming to well below 2°C above pre-industrial levels. A case study of the adoption of Building Information Modeling (BIM) and environmental assessment tools in Hong Kong is discussed.

Keywords: sustainability, sustainable engineering, BIM, LEED

Procedia PDF Downloads 150
18027 Investigation of Dry Ice Mixed Novel Hybrid Lubri-Coolant in Sustainable Machining of Ti-6Al-4V Alloy: A Comparison of Experimental and Modelling Results

Authors: Muhammad Jamil, Ning He, Aqib Mashood Khan, Munish Kumar Gupta

Abstract:

Ti-6Al-4V has numerous applications in the medical, automobile, and aerospace industries due to its corrosion resistance, structural stability, and chemical inertness to most fluids at room temperature. These peculiar characteristics are beneficial in application but present formidable challenges during machining. Machining of Ti-6Al-4V produces elevated cutting temperatures above 1000°C under dry conditions, which accelerates tool wear and reduces product quality. Therefore, there is always a need to employ a sustainable and effective coolant/lubricant when machining such alloys. In this study, Finite Element Modeling (FEM) and experimental analysis of cutting Ti-6Al-4V under a distinctly developed dry ice mixed hybrid lubri-coolant are presented. This study aims to model the milling process of Ti-6Al-4V under the proposed novel hybrid lubri-coolant using different cutting speeds and feeds per tooth. The DEFORM® software package was used to conduct the FEM, and the numerical model was experimentally validated. A comparison of experimental and simulation results showed a maximum error of no more than 6% for all experimental conditions. In a nutshell, the proposed model is effective in predicting the machining temperature precisely.
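The validation criterion quoted above (a maximum error of no more than 6%) amounts to a simple relative-error comparison between measured and simulated temperatures. The sketch below uses invented placeholder values, not the paper's data:

```python
# Hedged sketch of the validation step: comparing FEM-predicted and
# measured cutting temperatures via the maximum relative error.
# All temperature values are illustrative placeholders.
measured  = [620.0, 685.0, 742.0, 810.0]   # deg C, experiments
simulated = [598.0, 671.0, 760.0, 795.0]   # deg C, DEFORM-style FEM output

errors = [abs(s - m) / m for s, m in zip(simulated, measured)]
max_err_pct = 100 * max(errors)
print(round(max_err_pct, 1))
```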

Keywords: friction coefficient, heat transfer, finite element modeling (FEM), milling Ti-6Al-4V

Procedia PDF Downloads 58
18026 Mitigation of Risk Management Activities towards Accountability into Microfinance Environment: Malaysian Case Study

Authors: Nor Azlina A. Rahman, Jamaliah Said, Salwana Hassan

Abstract:

Rapid changes in the global business environment, such as intense competition, managerial and operational change, evolving governmental regulation, and innovation in technology, have significant impacts on organizations. At present, the global business environment demands more proactive microfinance institutions to provide opportunities for business success. Microfinance providers in Malaysia still conduct their funding activities by cash and cheque. These institutions are at high risk, as the paper-based system is deemed slow and prone to human error, as well as requiring a major annual reconciliation process. The global transformation of financial services and the growing involvement of technology, innovation, and new business activities have progressively made the risk management profile more subjective and diversified. The persistent, complex, and dynamic nature of risk management activities in these institutions arises from highly automated advancements in technology, which may manifest in a variety of ways throughout the financial services sector. This study seeks to examine the current operational risk management experienced by microfinance providers in Malaysia; to investigate current practices on facilitator control factor mechanisms; and to explore how the adoption of technology, innovation, and management accounting practices would affect the risk management of the operational system of microfinance providers in Malaysia. A case study method was employed. The case study also examines how the established role of management accounting can be used to mitigate risk management activities and support accountability, serving as information and guidance for microfinance providers. An empirical element obtained through qualitative methods is needed in this study, as rich and in-depth information is essential to understand these institutional phenomena.
This study is expected to propose a theoretical model for implementing technology, innovation, and management accounting practices into the operational system to improve internal control and subsequently mitigate risk management activities, helping microfinance providers to be more successful.

Keywords: microfinance, accountability, operational risks, management accounting practices

Procedia PDF Downloads 438
18025 A Case Study of Conceptual Framework for Process Performance

Authors: Ljubica Milanović Glavan, Vesna Bosilj Vukšić, Dalia Suša

Abstract:

In order to gain a competitive advantage, many companies are focusing on the reorganization of their business processes and implementing process-based management. In this context, assessing process performance is essential because it enables individuals and groups to assess where they stand in comparison to their competitors. In this paper, it is argued that process performance measurement is a necessity for a modern process-oriented company and should be supported by a holistic process performance measurement system. It seems very unlikely that a universal set of performance indicators can be applied successfully to all business processes. Thus, performance indicators must be process-specific and derived from both the strategic enterprise-wide goals and the process goals. Based on an extensive literature review and interviews conducted in a Croatian company, a conceptual framework for a process performance measurement system was developed. The main objective of such a system is to help process managers by providing comprehensive and timely information on the performance of business processes. This information can be used to communicate goals and current performance of a business process directly to the process team, to improve resource allocation and process output regarding quantity and quality, to give early warning signals, to diagnose the weaknesses of a business process, to decide whether corrective actions are needed, and to assess the impact of actions taken.

Keywords: Croatia, key performance indicators, performance measurement, process performance

Procedia PDF Downloads 673
18024 Modeling Aerosol Formation in an Electrically Heated Tobacco Product

Authors: Markus Nordlund, Arkadiusz K. Kuczaj

Abstract:

Philip Morris International (PMI) is developing a range of novel tobacco products with the potential to reduce individual risk and population harm in comparison to smoking cigarettes. One of these products is the Tobacco Heating System 2.2 (THS 2.2), referred to in this paper as the Electrically Heated Tobacco System (EHTS), already commercialized in a number of countries (e.g., Japan, Italy, Switzerland, Russia, Portugal and Romania). During use, the patented EHTS heats a specifically designed tobacco product (the Electrically Heated Tobacco Product (EHTP)) when it is inserted into a Holder (heating device). The EHTP contains tobacco material in the form of a porous plug that undergoes a controlled heating process to release chemical compounds into vapors, from which an aerosol is formed during cooling. The aim of this work was to investigate the aerosol formation characteristics for realistic operating conditions of the EHTS, as well as for relevant gas mixture compositions measured in the EHTP aerosol, consisting mostly of water, glycerol and nicotine, but also other compounds at much lower concentrations. The nucleation process taking place in the EHTP during use, when operated in the Holder, has therefore been modeled numerically using an extended Classical Nucleation Theory (CNT) for multicomponent gas mixtures. Results from the performed simulations demonstrate that aerosol droplets are formed only in the presence of an aerosol former, mainly glycerol. Minor compounds in the gas mixture could not reach a supersaturated state alone and therefore could not generate aerosol droplets from the multicomponent gas mixture at the simulated operating conditions. For the analytically characterized aerosol composition and the estimated operating conditions of the EHTS and EHTP, glycerol was shown to be the main aerosol former triggering the nucleation process in the EHTP.
This implies that, according to the CNT, an aerosol former such as glycerol needs to be present in the gas mixture for an aerosol to form under the tested operating conditions. To assess whether these conclusions are sensitive to the initial amount of the minor compounds, and to represent the total mass of the aerosol collected during the analytical aerosol characterization, simulations were carried out with the initial masses of the minor compounds increased by as much as a factor of 500. Despite this extreme condition, no aerosol droplets were generated when glycerol, nicotine and water were treated as inert species and therefore not actively contributing to the nucleation process. This implies that, according to the CNT, an aerosol cannot be generated from the multicomponent gas mixtures without the help of an aerosol former, at the compositions and operating conditions estimated for the EHTP, even if all minor compounds are released or generated in a single puff.
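The central CNT argument, that a compound can only nucleate droplets once its vapor is supersaturated (saturation ratio S > 1), can be illustrated with the single-component energy-barrier formula ΔG* = 16πσ³v_m²/(3(k_B T ln S)²). The property values below are rough glycerol-like assumptions, not the paper's multicomponent model:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def critical_barrier(sigma, v_m, T, S):
    """Classical nucleation theory: Gibbs energy barrier (J) for forming
    a critical droplet from a vapor with saturation ratio S. Returns None
    when S <= 1, i.e. no homogeneous nucleation is possible."""
    if S <= 1.0:
        return None
    return 16 * math.pi * sigma**3 * v_m**2 / (3 * (k_B * T * math.log(S))**2)

# Illustrative (assumed) values, not measured EHTP properties:
T = 320.0        # K, temperature of the cooling aerosol
sigma = 0.063    # N/m, surface tension (glycerol-like)
v_m = 1.2e-28    # m^3, molecular volume (glycerol-like)

print(critical_barrier(sigma, v_m, T, S=5.0))   # supersaturated: finite barrier
print(critical_barrier(sigma, v_m, T, S=0.4))   # subsaturated: no nucleation
```

The barrier falls rapidly as S grows, which is why the aerosol former (the only compound reaching S > 1 here) controls droplet formation.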

Keywords: aerosol, classical nucleation theory (CNT), electrically heated tobacco product (EHTP), electrically heated tobacco system (EHTS), modeling, multicomponent, nucleation

Procedia PDF Downloads 277
18023 Removal of Hexavalent Chromium from Aqueous Solutions by Biosorption Using Macadamia Nutshells: Effect of Different Treatment Methods

Authors: Vusumzi E. Pakade, Themba D. Ntuli, Augustine E. Ofomaja

Abstract:

Macadamia nutshell biosorbents treated by three different methods (raw Macadamia nutshell powder (RMN), acid-treated Macadamia nutshell (ATMN), and base-treated Macadamia nutshell (BTMN)) were investigated for the adsorption of Cr(VI) from aqueous solutions. Fourier transform infrared spectroscopy (FT-IR) spectra of free and Cr(VI)-loaded sorbents, as well as thermogravimetric analysis (TGA), revealed that the acid and base treatments modified the surface properties of the sorbents. The optimum conditions for the adsorption of Cr(VI) were pH 2, contact time 10 h, adsorbent dosage 0.2 g L⁻¹, and concentration 100 mg L⁻¹. The different treatment methods altered the surface characteristics of the sorbents and produced different maximum binding capacities of 42.5, 40.6 and 37.5 mg g⁻¹ for RMN, ATMN and BTMN, respectively. The data were fitted to the Langmuir, Freundlich, Redlich-Peterson and Sips isotherms. No single model could fully explain the data, perhaps due to the complexity of the processes taking place. The kinetic modeling results showed that Cr(VI) biosorption on the Macadamia sorbents was better described by pseudo-second-order kinetics, indicating chemical sorption. These results show that the three treatment methods yielded different surface properties, which in turn influenced the adsorption of Cr(VI) differently.
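As an illustration of one of the isotherm fits mentioned above, the sketch below fits the Langmuir model q_e = q_max·K_L·C_e/(1 + K_L·C_e) through its linearized form C_e/q_e = C_e/q_max + 1/(K_L·q_max). The concentration data are synthetic, generated from an assumed K_L; only q_max = 42.5 mg g⁻¹ echoes the RMN capacity reported above:

```python
# Hedged sketch: linearized Langmuir fit on synthetic, noise-free data
# (illustrative values, not the paper's measurements).
C_e = [5, 10, 20, 40, 80, 100]                 # mg/L, equilibrium concentration
q_max_true, K_L_true = 42.5, 0.05              # assumed "true" parameters
q_e = [q_max_true * K_L_true * c / (1 + K_L_true * c) for c in C_e]

# Ordinary least squares on y = C_e/q_e versus x = C_e:
# slope = 1/q_max, intercept = 1/(K_L * q_max).
x, y = C_e, [c / q for c, q in zip(C_e, q_e)]
n = len(x)
xm, ym = sum(x) / n, sum(y) / n
slope = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y)) / \
        sum((xi - xm) ** 2 for xi in x)
intercept = ym - slope * xm

q_max = 1 / slope
K_L = slope / intercept
print(round(q_max, 1), round(K_L, 3))   # recovers 42.5 and 0.05 on noise-free data
```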

Keywords: biosorption, chromium(VI), isotherms, Macadamia, reduction, treatment

Procedia PDF Downloads 266
18022 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question and answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming language documentation. While tools have aimed to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, and this extends the development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less interest has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to the complexity of the models and features. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, using data drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers' performance and quality and features' complexity may leverage these findings in selecting suitable techniques when developing prediction models.
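The trade-off the study investigates, model and feature complexity versus predictive performance, can be miniaturized as follows. Everything here is synthetic and illustrative (invented features and an invented acceptance process, not Stack Overflow data): a zero-feature majority baseline is compared against a slightly more complex one-feature decision stump.

```python
import random

random.seed(42)

# Synthetic stand-in for Stack Overflow answer records: a few cheap
# features and an 'accepted' label. Purely illustrative.
def make_answer():
    rep = random.expovariate(1 / 2000)          # answerer reputation
    length = random.randint(50, 2000)           # answer length (chars)
    has_code = random.random() < 0.6
    score = rep / 4000 + (0.3 if has_code else 0) + length / 8000
    accepted = random.random() < min(0.9, score / 2)
    return (rep, length, has_code), accepted

data = [make_answer() for _ in range(4000)]
train, test = data[:3000], data[3000:]

def accuracy(predict, rows):
    return sum(predict(f) == y for f, y in rows) / len(rows)

# Model 1 (simplest): always predict the training set's majority class.
majority = sum(y for _, y in train) >= len(train) / 2
baseline = lambda f: majority

# Model 2 (one feature): a decision stump on reputation, with the
# threshold chosen by brute force on the training set.
best_t, best_acc = 0, -1.0
for t in range(0, 20000, 500):
    acc = accuracy(lambda f, t=t: f[0] > t, train)
    if acc > best_acc:
        best_t, best_acc = t, acc
stump = lambda f: f[0] > best_t

print(round(accuracy(baseline, test), 3), round(accuracy(stump, test), 3))
```

Extending the comparison to progressively richer feature sets and models (e.g., random forests and neural networks, as in the study) follows the same pattern.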

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 132
18021 Building Information Modeling Implementation for Managing an Extra Large Governmental Building Renovation Project

Authors: Pornpote Nusen, Manop Kaewmoracharoen

Abstract:

In recent years, there has been an observable shift in fully developed countries from constructing new buildings to modifying existing buildings. The issue is that although an effective instrument like BIM (Building Information Modeling) is well developed for constructing new buildings, it is not widely used to renovate old buildings. BIM is accepted as an effective means to overcome common managerial problems such as project delay, cost overrun, and poor quality across the project life cycle. It was recently introduced in Thailand and is rarely used in renovation projects. Today, in Thailand, BIM is mostly used for creating aesthetic 3D models and for quantity takeoff, though it can be an effective project management tool for planning and scheduling. The governmental sector in Thailand is beginning to recognize the value of using BIM to manage construction projects, but knowledge about BIM implementation in governmental construction projects is underdeveloped, and further studies need to be conducted to maximize its advantages for the governmental sector. An extra large governmental educational building of 17,000 square meters, currently under construction in a two-year renovation project, was used in this research. BIM models of the building's exterior and interior areas were created for all five floors. A 4D BIM model (3D BIM plus time) was then created for planning and scheduling. Three focus groups were conducted with the executive committee, contractors, and officers of the building to discuss the possibility of use and the usefulness of the BIM approach over the traditional process. Several positive aspects were discussed, especially the early detection of foreseeable problems, such as inadequate access ways, altered ceiling levels, an impractical construction plan created through the traditional approach, and the lack of constructability information.
However, for some parties the cost of BIM implementation was a concern, although, this study believes, its benefits outweigh the cost.

Keywords: building information modeling, extra large building, governmental building renovation, project management, renovation, 4D BIM

Procedia PDF Downloads 153
18020 A Hierarchical Bayesian Calibration of Data-Driven Models for Composite Laminate Consolidation

Authors: Nikolaos Papadimas, Joanna Bennett, Amir Sakhaei, Timothy Dodwell

Abstract:

Composite modeling of consolidation processes plays an important role in process and part design by indicating the possible formation of unwanted defects prior to expensive experimental iterative trial and development programs. Composite materials in their uncured state display complex constitutive behavior, which has received much academic interest, and different models have been proposed. Modeling errors and statistical errors arising from this fitting will propagate through any simulation in which the material model is used. A general hyperelastic polynomial representation is proposed, which can be readily implemented in various nonlinear finite element packages; in our case, FEniCS was chosen. The coefficients are assumed uncertain, and the distribution of parameters is therefore learned using Markov Chain Monte Carlo (MCMC) methods. In engineering, the approach often followed is to select a single set of model parameters which, on average, best fits a set of experiments; there are good statistical reasons why this is not a rigorous approach. To overcome these challenges, a hierarchical Bayesian framework is proposed, in which the population distribution of model parameters is inferred from an ensemble of experimental tests. The resulting sampled distribution of hyperparameters is approximated using Maximum Entropy methods so that it can be readily sampled when embedded within a stochastic finite element simulation. The methodology is validated and demonstrated on a set of consolidation experiments of AS4/8852 with various stacking sequences. The resulting distributions are then applied to stochastic finite element simulations of the consolidation of curved parts, leading to a distribution of possible model outputs. With this, the paper, as far as the authors are aware, presents the first stochastic finite element implementation in composite process modelling.
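The MCMC step at the heart of the calibration can be sketched with a random-walk Metropolis sampler. The example below is a deliberately simplified stand-in: a single scalar coefficient with a Gaussian likelihood and flat prior, and synthetic "experiments", rather than the hierarchical multi-parameter posterior inferred in the paper:

```python
import math
import random

random.seed(3)

# Synthetic "ensemble of experiments": each measures one material
# coefficient with noise. The true population mean 2.0 is an assumed value.
data = [2.0 + random.gauss(0, 0.3) for _ in range(40)]

def log_post(mu, sigma=0.3):
    # Flat prior on mu; independent Gaussian likelihood terms.
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma**2)

# Random-walk Metropolis: accept a proposed move with probability
# min(1, posterior ratio); otherwise stay put.
mu, samples = 0.0, []
for step in range(20000):
    prop = mu + random.gauss(0, 0.1)
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop
    if step > 5000:                     # discard burn-in
        samples.append(mu)

est = sum(samples) / len(samples)
print(round(est, 2))
```

In the hierarchical version, each experiment gets its own parameter vector drawn from a population distribution whose hyperparameters are sampled in the same accept/reject fashion.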

Keywords: data-driven models, material consolidation, stochastic finite elements, surrogate models

Procedia PDF Downloads 145
18019 Strongly Coupled Finite Element Formulation of Electromechanical Systems with Integrated Mesh Morphing Using Radial Basis Functions

Authors: David Kriebel, Jan Edgar Mehner

Abstract:

The paper introduces a method to efficiently simulate the nonlinearly changing electrostatic fields occurring in micro-electromechanical systems (MEMS). Large deflections of the capacitor electrodes usually introduce nonlinear electromechanical forces on the mechanical system. Traditional finite element methods require a time-consuming remeshing process to capture exact results for this physical domain interaction. In order to accelerate the simulation and eliminate remeshing, a formulation of a strongly coupled electromechanical transducer element is introduced, which combines finite elements with an advanced mesh morphing technique using radial basis functions (RBF). The RBF approach allows large geometrical changes of the electric field domain while retaining the high element quality of the deformed mesh. Coupling effects between the mechanical and electrical domains are directly included within the element formulation. Fringing field effects are described accurately by using traditional arbitrary shape functions.
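The mesh morphing idea can be demonstrated in a few lines: displacements prescribed at boundary nodes are interpolated to any interior node by a radial basis function expansion, so interior nodes follow the boundary smoothly without remeshing. The geometry, Gaussian kernel, and shape parameter below are illustrative assumptions:

```python
import math

def phi(r):
    return math.exp(-(r / 0.5) ** 2)      # Gaussian RBF, shape parameter 0.5

# Four boundary nodes of a unit square; the top edge moves up by 0.1.
boundary = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
disp = [0.0, 0.0, 0.1, 0.1]               # prescribed vertical displacements

# Solve the RBF interpolation system A w = disp with Gaussian elimination.
n = len(boundary)
A = [[phi(math.dist(boundary[i], boundary[j])) for j in range(n)] for i in range(n)]
b = disp[:]
for col in range(n):
    piv = max(range(col, n), key=lambda r: abs(A[r][col]))
    A[col], A[piv], b[col], b[piv] = A[piv], A[col], b[piv], b[col]
    for r in range(col + 1, n):
        f = A[r][col] / A[col][col]
        for c in range(col, n):
            A[r][c] -= f * A[col][c]
        b[r] -= f * b[col]
w = [0.0] * n
for r in range(n - 1, -1, -1):
    w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]

def morph(p):
    """Interpolated displacement of any mesh node p."""
    return sum(wi * phi(math.dist(p, bi)) for wi, bi in zip(w, boundary))

print(round(morph((0.5, 0.5)), 3))   # interior node follows the boundary smoothly
```

The interpolant reproduces the prescribed boundary displacements exactly, which is the property that preserves mesh validity at the moving electrode surface.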

Keywords: electromechanical, electric field, transducer, simulation, modeling, finite-element, mesh morphing, radial basis function

Procedia PDF Downloads 242
18018 Modeling Continuous Flow in a Curved Channel Using Smoothed Particle Hydrodynamics

Authors: Indri Mahadiraka Rumamby, R. R. Dwinanti Rika Marthanty, Jessica Sjah

Abstract:

Smoothed particle hydrodynamics (SPH) was originally created to simulate nonaxisymmetric phenomena in astrophysics. However, the method still has several shortcomings, namely the high computational cost required to model values at high resolution and problems with boundary conditions. The difficulty of modeling boundary conditions occurs because the SPH method suffers from particle deficiency where the integral of the kernel function is truncated at a boundary. This research aims to determine whether SPH modeling with a focus on boundary layer interactions and continuous flow can produce quantifiably accurate values at low computational cost. It combines, in one main program, a meandering river model, a continuous flow algorithm, and a solid-fluid algorithm, with the aim of obtaining quantitatively accurate results for solid-fluid interactions under continuous flow in a meandering channel using the SPH method. The study uses the Fortran programming language to implement the SPH numerical method; the model takes the form of a U-shaped meandering open channel in 3D, where the channel walls are soil particles, and uses a continuous flow with a limited number of particles.
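The core of any SPH code, including the boundary-deficiency issue mentioned above, is the kernel summation. The 1D sketch below (illustrative, not the paper's 3D Fortran model) computes density by summing the standard cubic spline kernel over neighbors, and evaluates it at an end particle to show the density deficit caused by kernel truncation at a boundary:

```python
import math

def cubic_spline_W(r, h):
    """Standard 1D cubic spline smoothing kernel (normalization 2/(3h),
    compact support of radius 2h)."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)
    if q < 1.0:
        return sigma * (1 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2 - q) ** 3
    return 0.0

# Equal-mass particles on a uniform line; SPH density summation:
# rho_i = sum_j m_j W(x_i - x_j, h)
dx, h, m = 0.1, 0.12, 0.1          # spacing, smoothing length, particle mass
xs = [i * dx for i in range(41)]

def density(i):
    return sum(m * cubic_spline_W(xs[i] - xj, h) for xj in xs)

print(round(density(20), 3))   # interior particle: close to m/dx = 1.0
print(round(density(0), 3))    # boundary particle: deficit from truncation
```

The reduced value at the end particle is exactly the particle-deficiency effect that boundary treatments in SPH are designed to correct.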

Keywords: smoothed particle hydrodynamics, computational fluid dynamics, numerical simulation, fluid mechanics

Procedia PDF Downloads 130
18017 Modeling Anisotropic Damage Algorithms of Metallic Structures

Authors: Bahar Ayhan

Abstract:

The present paper is concerned with the numerical modeling of the inelastic behavior of anisotropically damaged ductile materials, based on a generalized macroscopic theory within the framework of continuum damage mechanics. The kinematic decomposition of the strain rates into elastic, plastic and damage parts is the basis for the structure of the continuum theory. The evolution of the damage strain rate tensor is detailed with consideration of anisotropic effects. Helmholtz free energy functions are constructed separately for the elastic and inelastic behaviors in order to address the plastic and damage processes. Additionally, the constitutive structure, which is based on the standard dissipative material approach, is elaborated with the stress tensor, a yield criterion for plasticity and a fracture criterion for damage, besides the potential functions of each inelastic phenomenon. The finite element method is used to approximate the linearized variational problem. Stress and strain outcomes are solved using a numerical integration algorithm based on an operator split methodology with separate plastic and damage multiplier variables. Numerical simulations are presented in order to demonstrate the efficiency of the formulation by comparison with examples in the literature.
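The kinematic split and the separated free energy functions described above are commonly written as follows (a generic form consistent with standard continuum damage mechanics; the internal variables α and β are placeholders for the paper's hardening and damage variables):

```latex
\dot{\boldsymbol{\varepsilon}}
  = \dot{\boldsymbol{\varepsilon}}^{e}
  + \dot{\boldsymbol{\varepsilon}}^{p}
  + \dot{\boldsymbol{\varepsilon}}^{d},
\qquad
\psi = \psi^{e}\!\left(\boldsymbol{\varepsilon}^{e}\right)
     + \psi^{p}\!\left(\boldsymbol{\alpha}\right)
     + \psi^{d}\!\left(\boldsymbol{\beta}\right),
\qquad
\boldsymbol{\sigma} = \rho\,
  \frac{\partial \psi}{\partial \boldsymbol{\varepsilon}^{e}}
```

Here the elastic, plastic and damage strain rates add up to the total strain rate, and the stress follows from the elastic part of the free energy, as in the standard dissipative material approach the abstract invokes.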

Keywords: anisotropic damage, finite element method, plasticity, coupling

Procedia PDF Downloads 206
18016 Handling Complexity of a Complex System Design: Paradigm, Formalism and Transformations

Authors: Hycham Aboutaleb, Bruno Monsuez

Abstract:

The complexity of current systems has reached a degree that requires conception and design issues to be addressed while taking into account environmental, operational, social, legal, and financial aspects. One of the main challenges is therefore the way complex systems are specified and designed. The exponentially growing effort, cost, and time invested in the modeling phase of complex systems emphasize the need for a paradigm, a framework, and an environment to handle system model complexity. This requires understanding the expectations of the model's human users and their limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model must fulfill to meet human user expectations, and suggests a graph-based formalism for modeling complex systems. Finally, a set of transformations is defined to handle model complexity.

Keywords: higraph-based, formalism, system engineering paradigm, modeling requirements, graph-based transformations

Procedia PDF Downloads 403
18015 Hypergraph for System of Systems Modeling

Authors: Haffaf Hafid

Abstract:

Hypergraphs, after being used to model the structural organization of systems of systems (SoS) at the macroscopic level, have recently been generalized to other stages of complex system modeling. In this paper, we first describe different applications of hypergraph theory and then, step by step, introduce multilevel modeling of SoS by integrating constraint programming languages (CSP) with an engineering-system reconfiguration strategy. As an application, we present an A.C.T terminal controlled by a set of intelligent automated vehicles.
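A minimal sketch of the hyperedge representation and its bipartite (incidence) view, using hypothetical terminal components rather than the paper's actual model:

```python
# A hypergraph stored as hyperedge -> set of member nodes; the component
# names below are invented for illustration only.
hypergraph = {
    "power_bus":   {"crane1", "crane2", "gate"},
    "comms_ring":  {"crane2", "gate", "vehicle_a"},
    "vehicle_net": {"vehicle_a", "vehicle_b"},
}

def to_bipartite(hg):
    """Bipartite (Koenig) representation: one edge per (hyperedge, node) pair."""
    return [(e, v) for e, nodes in hg.items() for v in sorted(nodes)]

def incident_edges(hg, node):
    """Hyperedges covering a node -- the kind of query a reconfiguration
    strategy needs to know which subsystems a failed component affects."""
    return {e for e, nodes in hg.items() if node in nodes}

bipartite = to_bipartite(hypergraph)
gate_edges = incident_edges(hypergraph, "gate")
```

The bipartite view is what makes structural analysis (reachability, redundancy, monitoring placement) tractable with ordinary graph algorithms.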

Keywords: hypergraph model, structural analysis, bipartite graph, monitoring, system of systems, reconfiguration analysis, hypernetwork

Procedia PDF Downloads 488
18014 Composing Method of Decision-Making Function for Construction Management Using Active 4D/5D/6D Objects

Authors: Hyeon-Seung Kim, Sang-Mi Park, Sun-Ju Han, Leen-Seok Kang

Abstract:

As BIM (Building Information Modeling) application continually expands, the visual simulation techniques used for facility design and construction process information are becoming increasingly advanced and diverse. For building structures, BIM application is design-oriented, using 3D objects for conflict management, whereas for civil engineering structures, the usability of nD object-oriented construction stage simulation is important in construction management. Simulations of 5D and 6D objects, in which cost and resources are linked along with the process simulation of 4D objects, are commonly used, but they do not provide a decision-making function for process management problems that occur on site because they mostly focus on the visual representation of the current status of process information. In this study, an nD CAD system is constructed that facilitates an optimized schedule simulation minimizing process conflict, a construction duration reduction simulation according to execution progress status, an optimized process plan simulation according to yearly project cost changes, and an optimized resource simulation for field resource mobilization capability. Through this system, the usability of conventional simple simulation objects is expanded to active simulation objects with which decision-making is possible. Furthermore, to close the gap between field process situations and planned 4D process objects, a technique is developed for comparative simulation through the coordinated synchronization of an actual video object acquired by an on-site web camera and a VR-concept 4D object. This synchronization and simulation technique can also be applied to smartphone video objects captured in the field to increase the usability of the 4D object.
Because yearly project costs change frequently in civil engineering construction, the annual process plan should be recomposed appropriately as the project cost decreases or increases relative to the plan. In the 5D CAD system provided in this study, an active 5D object utilization concept is introduced to simulate an optimized process planning state by finding a process optimized for the changed project cost without changing the construction duration, using a technique such as a genetic algorithm. Furthermore, in resource management, an active 6D object utilization function is introduced that can analyze and simulate an optimized process plan within the possible scope of moving resources, by considering those resources that can be moved under a given field condition, instead of a simple schedule-based resource change simulation. The introduction of active BIM functions is expected to increase the field utilization of conventional nD objects.

Keywords: 4D, 5D, 6D, active BIM

Procedia PDF Downloads 276
18013 A Novel Algorithm for Parsing IFC Models

Authors: Raninder Kaur Dhillon, Mayur Jethwa, Hardeep Singh Rai

Abstract:

Information technology has made pivotal progress across disparate disciplines, one of which is the AEC (Architecture, Engineering and Construction) industry. CAD is a form of computer-aided building modeling that architects, engineers, and contractors use to create and view two- and three-dimensional models. The AEC industry also uses building information modeling (BIM), a newer computerized modeling system that can create four-dimensional models; this software can greatly increase productivity in the AEC industry. BIM models generate open IFC (Industry Foundation Classes) files, which aim at interoperability for exchanging information throughout the project lifecycle among various disciplines. The methods developed in previous studies require either an IFC schema or MVD and software applications, such as an IFC model server or a BIM authoring tool, to extract a partial or complete IFC instance model. This paper proposes an efficient algorithm for extracting partial and total models from an Industry Foundation Classes (IFC) instance model without an IFC schema or a complete IFC model view definition (MVD).
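To illustrate the kind of schema-free instance extraction the abstract describes, here is a minimal sketch (not the authors' algorithm) that parses IFC-SPF `#id=TYPE(...);` statements with a regular expression and extracts a partial model by following entity references; it deliberately ignores complications such as `);` appearing inside string attributes.

```python
import re

# One DATA-section statement: "#<id> = <TYPE>(<args>);"
LINE = re.compile(r"#(\d+)\s*=\s*(\w+)\((.*?)\);", re.S)

def parse_ifc_instances(text):
    """Map entity id -> (type, [referenced entity ids]) from IFC-SPF text."""
    model = {}
    for m in LINE.finditer(text):
        eid, etype, args = int(m.group(1)), m.group(2).upper(), m.group(3)
        refs = [int(r) for r in re.findall(r"#(\d+)", args)]
        model[eid] = (etype, refs)
    return model

def extract_submodel(model, root):
    """Partial model: the root entity plus everything it transitively references."""
    keep, stack = set(), [root]
    while stack:
        eid = stack.pop()
        if eid in keep or eid not in model:
            continue
        keep.add(eid)
        stack.extend(model[eid][1])
    return {k: model[k] for k in keep}

sample = """
#1=IFCPROJECT('x',$,$,(#2));
#2=IFCWALL('y',#3,$);
#3=IFCOWNERHISTORY($,$);
#4=IFCDOOR('z',$,$);
"""
model = parse_ifc_instances(sample)
sub = extract_submodel(model, 1)   # #4 is unreachable and is dropped
```

The point of the sketch is that the reference graph alone, without the EXPRESS schema, already suffices to carve out a consistent partial instance model.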

Keywords: BIM, CAD, IFC, MVD

Procedia PDF Downloads 300
18012 Hominin Niche in the Times of Climate Change

Authors: Emilia Hunt, Sally C. Reynolds, Fiona Coward, Fabio Parracho Silva, Philip Hopley

Abstract:

Ecological niche modeling is widely used in conservation studies, but its application to extinct hominin species is relatively new. Understanding which ecological niches were occupied by respective hominin species provides a new perspective on the influences shaping evolutionary processes. Niche separation or overlap can tell us more about the specific requirements of each species within a given timeframe. Many ancestral species lived through enormous climate changes: glacial and interglacial periods and changes in rainfall that led to desertification or flooding of entire regions; surviving these changes demanded impressive levels of adaptation. This paper reviews niche modeling methodologies and their application to hominin studies. Traditional conservation methods may not be directly applicable to extinct species and are not directly comparable for hominins. The hominin niche also includes technology, the use of fire, and extended communication, aspects not traditionally used in building conservation models. Future perspectives on how to improve niche modeling for extinct hominin species are discussed.

Keywords: hominin niche, climate change, evolution, adaptation, ecological niche modelling

Procedia PDF Downloads 189
18011 Assessing the Impact of Urbanization on Flood Risk: A Case Study

Authors: Talha Ahmed, Ishtiaq Hassan

Abstract:

Urban and metropolitan areas are characterized by very high population densities resulting from concentrated economic activity. Critical factors such as urban expansion and climate change are driving changes in cities exposed to the incidence and impacts of pluvial floods. Urban development repeatedly covers large areas with impermeable man-made surfaces and structures through which water cannot infiltrate or percolate. Urban sprawl can therefore increase run-off volumes, flood stages, and flood extents during heavy rainy seasons. Assessing flood risk requires a thorough examination of all aspects contributing to the severity of an event in order to accurately estimate its impacts and the associated risk factors. For risk evaluation and the assessment of urbanization impacts, an integrated hydrological modeling approach is applied to a study area in Islamabad (Pakistan), focusing on a natural water body. The vulnerability of the physical elements at risk in the research region is analyzed using GIS and SOBEK. Supervised classification of land use is performed on images from 1980 to 2020. A DEM-based hydrodynamic model is used to simulate flood inundation for return periods of 50, 75, and 100 years. The findings provide useful information on high-risk places and at-risk properties.
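Return-period design flows of the kind fed into such a hydrodynamic model are often estimated from annual maximum series; a common textbook approach (not necessarily the one used in this study, and with invented flow data) is a method-of-moments Gumbel fit:

```python
import math

def gumbel_fit(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maximum flows."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi     # scale parameter
    mu = mean - 0.5772 * beta                 # location (Euler-Mascheroni constant)
    return mu, beta

def design_flow(mu, beta, T):
    """Flow quantile for return period T years: non-exceedance F = 1 - 1/T."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual peak flows in m^3/s.
peaks = [120, 95, 150, 180, 110, 130, 160, 105, 140, 170]
mu, beta = gumbel_fit(peaks)
q50 = design_flow(mu, beta, 50)
q100 = design_flow(mu, beta, 100)
```

The 50-, 75-, and 100-year quantiles obtained this way are then used as boundary conditions for the inundation runs.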

Keywords: urbanization, flood, flood risk, GIS

Procedia PDF Downloads 175
18010 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification; because of it, the diffusion of eco-design and LCA methods in the manufacturing sectors may stall. This article addresses the research question: how can the LCA method be adapted to generalize it massively and improve its performance? The paper develops an approach for automating LCA in order to carry out assessments on a massive scale. We proceeded in three steps. First, we analyzed the literature to identify existing automation methods; given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from some of these methods and combining them with new ideas and improvements. Second, we present our development of automated model construction (data reconciliation and implementation). Finally, an LCA case study of a conduit demonstrates the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating data mapping and hence product modeling. The method can also complete the LCA process on its own within minutes; the calculations and the LCA report are generated automatically. The tool developed shows that automation by code is a viable solution for meeting LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes LCA's complexity. Indeed, the case study demonstrated the time savings associated with this methodology and, therefore, the opportunity to increase the number of LCA reports generated and meet regulatory requirements. The approach also shows potential for a wide range of applications.

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 89
18009 Fixed Points of Contractive-Like Operators by a Faster Iterative Process

Authors: Safeer Hussain Khan

Abstract:

In this paper, we prove a strong convergence result using a recently introduced iterative process with contractive-like operators. This improves and generalizes corresponding results in the literature in two ways: the iterative process is faster, and the operators are more general. Finally, we indicate that the results can also be proved for the iterative process with error terms.
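The abstract does not specify the iterative process; as a baseline for comparison, the classical Picard iteration x_{n+1} = T(x_n) already converges strongly for contractive operators, as this small sketch shows (the faster processes the paper refers to are multi-step refinements of the same idea):

```python
import math

def picard(T, x0, tol=1e-10, max_iter=1000):
    """Picard iteration x_{n+1} = T(x_n); returns (approx fixed point, steps)."""
    x = x0
    for n in range(max_iter):
        x_new = T(x)
        if abs(x_new - x) < tol:
            return x_new, n + 1
        x = x_new
    return x, max_iter

# T(x) = cos(x) is contractive near its unique fixed point (the Dottie number).
fp, iters = picard(math.cos, 1.0)
```

Faster schemes improve the rate constant, not the limit: all such processes for a contractive-like operator converge to the same unique fixed point.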

Keywords: contractive-like operator, iterative process, fixed point, strong convergence

Procedia PDF Downloads 433
18008 Convergence and Stability in Federated Learning with Adaptive Differential Privacy Preservation

Authors: Rizwan Rizwan

Abstract:

This paper provides an overview of Federated Learning (FL) and its application in enhancing data security, privacy, and efficiency. FL uses three distinct architectures to ensure privacy is never compromised: individual edge devices are trained and their models aggregated on a server without sharing raw data. This approach not only yields secure models without data sharing but also offers a highly efficient privacy-preserving solution with improved security and data access. We also discuss various frameworks used in FL and its integration with machine learning, deep learning, and data mining. To address the challenges of multi-party collaborative modeling scenarios, we briefly review an FL scheme that combines an adaptive gradient descent strategy with a differential privacy mechanism. The adaptive learning rate algorithm adjusts the gradient descent process to avoid issues such as model overfitting and fluctuations, thereby enhancing modeling efficiency and performance in multi-party computation scenarios. Additionally, to cater to ultra-large-scale distributed secure computing, the research introduces a differential privacy mechanism that defends against various background-knowledge attacks.
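A toy sketch of one such privacy-preserving round, combining FedAvg-style aggregation with clipping and Gaussian noise on the client updates (the model, data, and all parameter values are invented for illustration; the paper's adaptive scheme is more elaborate):

```python
import random

def local_update(w, data, lr=0.1):
    """One SGD pass for a scalar linear model y = w*x (toy client training)."""
    for x, y in data:
        grad = 2 * (w * x - y) * x
        w -= lr * grad
    return w

def clip_and_noise(delta, clip=1.0, sigma=0.02):
    """Clip the client's model delta and add Gaussian noise (DP-style)."""
    clipped = max(-clip, min(clip, delta))
    return clipped + random.gauss(0.0, sigma)

def federated_round(w_global, clients, sigma=0.02):
    """One FedAvg round: average the noisy, clipped client deltas."""
    deltas = [clip_and_noise(local_update(w_global, data) - w_global, sigma=sigma)
              for data in clients]
    return w_global + sum(deltas) / len(deltas)

random.seed(0)
clients = [[(1.0, 2.0), (2.0, 4.0)], [(1.5, 3.0), (0.5, 1.0)]]  # both satisfy y = 2x
w = 0.0
for _ in range(30):
    w = federated_round(w, clients)
```

The server only ever sees clipped, noised deltas, yet the global model still approaches the true coefficient; the noise scale sigma is the privacy/accuracy knob that adaptive mechanisms tune.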

Keywords: federated learning, differential privacy, gradient descent strategy, convergence, stability, threats

Procedia PDF Downloads 30
18007 Decision Making Communication in the Process of Technologies Commercialization: Archival Analysis of the Process Content

Authors: Vaida Zemlickiene

Abstract:

Scientists and practitioners around the world are working to identify the factors that influence the results of technology commercialization and to propose an ideal model for the commercialization process. In other words, all stakeholders in technology commercialization seek a formula or set of rules for commercializing technologies successfully and avoiding unproductive investments. In this article, technology commercialization is understood as the process of transforming inventions into marketable products, services, and processes: the path from the idea of using an invention to a finished product, spanning technology readiness levels (TRL) 1 through 9. Many publications in the management literature are aimed at managing the commercialization process. However, there is an apparent lack of research on communication in decision-making during technology commercialization. Analysis of past work and of the last decade's global research leads to the unambiguous conclusion that the methodological framework is not yet mature enough to be of practical use in business. The process of technology commercialization, and the decisions made within it, should therefore be explored in depth. An archival analysis is performed to gain insight into decision-making communication in technology commercialization: to establish the content of the commercialization process (its decision-making stages and participants), to analyze and critically assess the internal factors of technology commercialization, and to analyze the concept of successful/unsuccessful technology commercialization.

Keywords: the process of technology commercialization, communication in decision-making process, the content of technology commercialization process, successful/unsuccessful technology commercialization

Procedia PDF Downloads 153
18006 Outdoor Visible Light Communication Channel Modeling under Fog and Smoke Conditions

Authors: Véronique Georlette, Sebastien Bette, Sylvain Brohez, Nicolas Point, Veronique Moeyaert

Abstract:

Visible light communication (VLC) is a communication technology in the optical wireless communication (OWC) family; it uses the visible and infrared spectrum to send data. So far, this technology has been studied mainly for indoor use-cases, but it is now sufficiently mature to consider its outdoor potential. The main outdoor challenges are meteorological conditions and the presence of smoke from fires or pollutants in urban areas. This paper proposes a methodology to assess the robustness of an outdoor VLC system under such conditions, and puts it into practice in two realistic scenarios: a VLC bus stop and a VLC streetlight. The methodology consists of computing the power margin available in the system given all the characteristics of the VLC system and its surroundings. This is done with an outdoor VLC communication channel simulator developed in Python, which quantifies the effects of fog and smoke on the optical power reaching the receiver using models taken from the environmental and fire engineering literature. Both phenomena impact the communication by increasing the total attenuation of the medium. The main conclusion drawn in this paper is that the attenuation levels due to fog and smoke are of the same order of magnitude, fog attenuation being the highest when visibility falls below 1 km. This is a promising prospect for the deployment of outdoor VLC use-cases in the near future.
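Fog attenuation as a function of visibility is commonly modeled with the Kruse/Kim relation; a sketch of that standard model (the paper's simulator may differ in its details and smoke handling) is:

```python
def kim_q(V_km):
    """Particle-size-distribution exponent q of the Kim model, by visibility."""
    if V_km > 50:
        return 1.6
    if V_km > 6:
        return 1.3
    if V_km > 1:
        return 0.16 * V_km + 0.34
    if V_km > 0.5:
        return V_km - 0.5
    return 0.0

def fog_attenuation_db_per_km(V_km, wavelength_nm=550.0):
    """Specific attenuation (dB/km) from visibility V via the Kruse/Kim model."""
    return (3.91 / V_km) * (wavelength_nm / 550.0) ** (-kim_q(V_km))

def received_power_dbm(pt_dbm, distance_km, V_km, wavelength_nm=550.0):
    """Transmit power minus fog loss over the link (all other losses omitted)."""
    return pt_dbm - fog_attenuation_db_per_km(V_km, wavelength_nm) * distance_km
```

At 550 nm and 1 km visibility the model gives 3.91 dB/km; comparing this margin loss against the system's available power budget is exactly the robustness check the methodology performs.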

Keywords: channel modeling, fog modeling, meteorological conditions, optical wireless communication, smoke modeling, visible light communication

Procedia PDF Downloads 150
18005 A Review of Current Trends in Grid Balancing Technologies

Authors: Kulkarni Rohini D.

Abstract:

While emerging as plausible sources of energy generation, new technologies, including photovoltaic (PV) solar panels, home battery energy storage systems, and electric vehicles (EVs), are complicating the operation of power distribution networks for distribution network operators (DNOs). Renewable energy production fluctuates, resulting in over- and under-generation and compounding the problem of storing excess power and using it when needed. Though renewable sources are inexhaustible and recurring, storing the generated energy is almost as important as producing it. Hence, to ensure smooth and efficient power storage at different levels, grid balancing technologies are the next theme to address in the sustainability and growth sector. While hydrogen batteries were used in earlier days to achieve this balance in power grids, recent advancements are more efficient and capable per unit of storage space, and also distinctive in their underlying operating principles. Technologies such as flow batteries, gravity solutions, and graphene batteries have already entered the market and are leading the race for efficient storage solutions that will improve and stabilize grid networks.

Keywords: flow batteries, grid balancing, hydrogen batteries, power storage, solar

Procedia PDF Downloads 70
18004 An Extended Domain-Specific Modeling Language for Marine Observatory Relying on Enterprise Architecture

Authors: Charbel Aoun, Loic Lagadec

Abstract:

A Sensor Network (SN) can be considered as an operation in two phases: (1) observation/measuring, meaning the accumulation of gathered data at each sensor node; (2) transferring the collected data to a processing center (e.g., fusion servers) within the SN. An underwater sensor network is therefore a sensor network deployed underwater that monitors underwater activity. The deployed sensors, such as hydrophones, are responsible for registering underwater activity and transferring it to more advanced components. The process of data exchange between these components defines the Marine Observatory (MO) concept, which provides information on ocean state, phenomena, and processes. The first step towards implementing this concept is defining the environmental constraints and the required tools and components (marine cables, smart sensors, data fusion servers, etc.). The logical and physical components used in these observatories perform critical functions such as the localization of underwater moving objects, and these functions can be orchestrated with other services (e.g., military or civilian response). In this paper, we present an extension to our MO meta-model that is used to generate a design tool (ArchiMO). We propose new constraints to be taken into consideration at design time and illustrate our proposal with an example from the MO domain. Additionally, we generate the corresponding simulation code using our self-developed domain-specific model compiler. On the one hand, this illustrates our approach of relying on an Enterprise Architecture (EA) framework that respects multiple views, stakeholder perspectives, and domain specificity. On the other hand, it helps reduce both the complexity of and the time spent on the design activity, while preventing design modeling errors when porting the activity to the MO domain.
In conclusion, this work aims to demonstrate that the design activity for complex systems can be improved through MDE technologies and a domain-specific modeling language with associated tooling. The major improvement is an early validation step, via models and simulation, to consolidate the system design.

Keywords: smart sensors, data fusion, distributed fusion architecture, sensor networks, domain specific modeling language, enterprise architecture, underwater moving object, localization, marine observatory, NS-3, IMS

Procedia PDF Downloads 177
18003 Realistic Modeling of the Preclinical Small Animal Using Commercial Software

Authors: Su Chul Han, Seungwoo Park

Abstract:

With the increasing incidence of cancer, radiotherapy technologies and modalities have advanced, and the importance of preclinical models in cancer research is growing. Small animal dosimetry is an essential part of evaluating the relationship between the dose absorbed by a preclinical small animal and the biological effect observed in a preclinical study. In this study, we carried out realistic modeling of a preclinical small animal phantom that makes it possible to verify the delivered dose, using commercial software. The small animal phantom was modeled from the MOBY 4D digital mouse whole-body phantom. To manipulate the MOBY phantom in commercial software (Mimics, Materialise, Leuven, Belgium), we converted it to DICOM CT image files with Matlab; the two-dimensional CT images were then converted to a three-dimensional image that can be segmented and cropped in sagittal, coronal, and axial views. The CT images were processed as follows. Based on the profile line values, thresholding was carried out to make a mask connecting all regions within an equal threshold range. Using this thresholding, we segmented the images into three parts (bone, body tissue, lung); to separate neighboring pixels between lung and body tissue, we used the region growing function of the Mimics software. We then obtained a 3D object by 3D calculation on the segmented images. The generated 3D object was smoothed by a remeshing operation (smoothing factor 0.4, 5 iterations). Triangle reduction was performed in edge mode with a tolerance of 0.1 mm, an edge angle of 15 degrees, and 5 iterations. The processed 3D object was converted to an STL file for output on a 3D printer. We modified the 3D small animal file using 3-matic Research (Materialise, Leuven, Belgium) to make space for radiation dosimetry chips, yielding a 3D object of a realistic small animal phantom.
The phantom was 2.631 cm wide, 2.361 cm thick, and 10.817 cm long. The Mimics software provided efficient 3D object generation and convenient conversion to STL files. The developed preclinical small animal phantom should increase the reliability of absorbed dose verification in small animals for preclinical studies.
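The thresholding and region-growing steps described above can be sketched on a toy 2D slice (this illustrates the generic operations, not the Mimics implementation; intensity values are invented):

```python
import numpy as np
from collections import deque

def threshold_mask(image, lo, hi):
    """Binary mask of pixels whose intensity falls in [lo, hi]."""
    return (image >= lo) & (image <= hi)

def region_grow(mask, seed):
    """4-connected region growing from a seed pixel inside the mask:
    keeps only the connected component containing the seed."""
    grown = np.zeros_like(mask, dtype=bool)
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if not (0 <= r < mask.shape[0] and 0 <= c < mask.shape[1]):
            continue
        if grown[r, c] or not mask[r, c]:
            continue
        grown[r, c] = True
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return grown

# Toy slice: two disconnected blobs in the same intensity range; growing
# from a seed in blob A separates it from blob B, as region growing
# separates neighboring lung and body-tissue pixels of similar value.
slice_ = np.zeros((6, 6))
slice_[1:3, 1:3] = 40   # blob A
slice_[4:6, 4:6] = 40   # blob B
mask = threshold_mask(slice_, 30, 60)
blob_a = region_grow(mask, (1, 1))
```

Thresholding alone selects both blobs; the seeded growth is what distinguishes them, which is why it is needed where two tissue classes share an intensity range.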

Keywords: mimics, preclinical small animal, segmentation, 3D printer

Procedia PDF Downloads 366
18002 Effect of Plasticizer Additives on the Mechanical Properties of Cement Composite: A Molecular Dynamics Analysis

Authors: R. Mohan, V. Jadhav, A. Ahmed, J. Rivas, A. Kelkar

Abstract:

Cementitious materials are an excellent example of a composite material with complex hierarchical and random features ranging from the nanometer (nm) to the millimeter (mm) scale. Multi-scale modeling of such complex material systems requires starting from fundamental building blocks to capture the scale-relevant features in the associated computational models. In this paper, molecular dynamics (MD) modeling is employed to predict the effect of a plasticizer additive on the mechanical properties of the key hydrated cement constituent calcium-silicate-hydrate (CSH) at the molecular, nanometer-scale level. Because the exact molecular configuration of CSH is complex and still unknown, a representative configuration based on the mineral jennite, widely accepted in the field, is employed. The work demonstrates the effectiveness of molecular dynamics modeling for predicting the influence of material chemistry changes from molecular/nanoscale models.
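As a reminder of the kind of building block such an MD simulation rests on, here is the Lennard-Jones pair potential and its radial force, checked at the analytic energy minimum (a generic illustration in reduced units; the CSH/jennite force fields used in work like this are far more elaborate):

```python
def lj_energy_force(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential U(r) = 4*eps*((s/r)^12 - (s/r)^6)
    and the radial force F(r) = -dU/dr (positive = repulsive)."""
    sr6 = (sigma / r) ** 6
    u = 4.0 * epsilon * (sr6**2 - sr6)
    f = 24.0 * epsilon * (2.0 * sr6**2 - sr6) / r
    return u, f

# The potential minimum sits at r = 2^(1/6)*sigma, where U = -epsilon
# and the force vanishes -- the equilibrium pair separation.
r_min = 2.0 ** (1.0 / 6.0)
u_min, f_min = lj_energy_force(r_min)
```

Pairwise energies and forces of this form (summed over all atom pairs, with bonded and electrostatic terms added) are what an MD engine integrates in time to predict bulk mechanical properties from molecular structure.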

Keywords: cement composite, mechanical properties, molecular dynamics, plasticizer additives

Procedia PDF Downloads 454