Search results for: new process model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28446

27846 Thermodynamics of Stable Micro Black Holes Production by Modeling from the LHC

Authors: Aref Yazdani, Ali Tofighi

Abstract:

We study a simulation model for the production of stable micro black holes based on an investigation of the thermodynamics of the LHC experiment. We show how this production can be achieved through a thermodynamic process of stabilization; indeed, the process requires only a very small amount of powerful fuel. By applying the second law of black hole thermodynamics at the quantum-gravity scale and a perturbation expansion of the given entropy function, a time-dependent potential function is obtained and illustrated with exact numerical values in higher dimensions. Establishing the conditions for the stability of micro black holes is a further purpose of this study. This is demonstrated through an injection method that supplies a precise amount of energy in the final phase of production, equivalent to injecting the same energy into the collision center at the LHC in order to stabilize the produced particles. Energy injection into the collision center at the LHC is a new scheme that is worth attempting for the first time.

Keywords: micro black holes, LHC experiment, black holes thermodynamics, extra dimensions model

Procedia PDF Downloads 142
27845 Robust Batch Process Scheduling in Pharmaceutical Industries: A Case Study

Authors: Tommaso Adamo, Gianpaolo Ghiani, Antonio Domenico Grieco, Emanuela Guerriero

Abstract:

Batch production plants give rise to a wide range of scheduling problems. In pharmaceutical industries a batch process is usually described by a recipe, consisting of an ordering of tasks to produce the desired product. In this research work we focused on pharmaceutical production processes requiring the culture of a microorganism population (e.g. bacteria or yeasts). Several sources of uncertainty may influence the yield of the culture processes, including (i) low performance and quality of the cultured microorganism population or (ii) microbial contamination. For these reasons, robustness is a valuable property in this application context. In particular, a robust schedule will not collapse immediately when a culture of microorganisms has to be discarded due to microbial contamination. Instead, a robust schedule should change locally and in small proportions, and the overall performance measure (e.g. makespan, lateness) should change little, if at all. In this research work we formulated a constraint programming optimization (COP) model for the robust planning of antibiotics production. We developed a discrete-time model with a multi-criteria objective, ordering the different criteria and performing a lexicographic optimization. A feasible solution of the proposed COP model is a schedule of a given set of tasks onto the available resources. The schedule has to satisfy task precedence constraints, resource capacity constraints and time constraints. In particular, the time constraints model task due dates and resource availability time windows. To improve schedule robustness, we modeled the concept of (a, b) super-solutions, where (a, b) are input parameters of the COP model. An (a, b) super-solution is one in which, if a variables (i.e. the completion times of a culture tasks) lose their values (i.e. the cultures are contaminated), the solution can be repaired by assigning new values to these variables (i.e. the completion times of backup culture tasks) while changing at most b other variables (i.e. delaying the completion of at most b other tasks). The efficiency and applicability of the proposed model are demonstrated by solving instances taken from Sanofi Aventis, a French pharmaceutical company. Computational results showed that the determined super-solutions are near-optimal.
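To make the (a, b) repair notion concrete, the following minimal sketch (in Python, on a toy instance; task names, lags and values are illustrative, not drawn from the Sanofi Aventis data) checks a (1, b) repair: a contaminated culture task is reassigned to its backup completion time, the delay is propagated along precedence arcs, and the number of other completion times that had to change is compared to b.

```python
def repair(schedule, precedences, lost_task, backup_time, b):
    """Reassign the contaminated culture task to its backup completion
    time, propagate delays along precedence arcs (succ >= pred + lag),
    and report whether at most b other variables had to change."""
    repaired = dict(schedule)
    repaired[lost_task] = backup_time
    changed, stack = set(), [lost_task]
    while stack:
        task = stack.pop()
        for pred, succ, lag in precedences:
            if pred == task and repaired[succ] < repaired[pred] + lag:
                repaired[succ] = repaired[pred] + lag
                changed.add(succ)
                stack.append(succ)
    return repaired, len(changed) <= b

# Toy recipe: culture -> fermentation -> purification, 2 h lags
schedule = {"culture": 4, "fermentation": 6, "purification": 8}
precedences = [("culture", "fermentation", 2),
               ("fermentation", "purification", 2)]
print(repair(schedule, precedences, "culture", 7, b=2))  # repairable
print(repair(schedule, precedences, "culture", 7, b=1))  # not repairable
```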

Keywords: constraint programming, super-solutions, robust scheduling, batch process, pharmaceutical industries

Procedia PDF Downloads 617
27844 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort in building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture, built from AWS services, that brings industry best practices to workflow management and simplifies model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
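As a minimal sketch of the stage separation described above (stage and function names are illustrative placeholders, not an AWS API; in the proposed architecture each stage would be a managed, infrastructure-as-code-provisioned job), the pipeline can be read as follows, with model training the only function owned by scientists:

```python
def source_data():
    """Data engineering: pull raw records from the upstream store."""
    return [{"raw": x, "label": x % 2} for x in range(100)]

def engineer_features(records):
    """Data engineering: turn raw records into model-ready features."""
    return [({"feature": r["raw"] * 2}, r["label"]) for r in records]

def train_model(examples):
    """Science-owned stage: fit and return a model (trivial stand-in)."""
    positives = sum(label for _, label in examples)
    return {"majority": int(positives >= len(examples) / 2)}

def deploy_and_vend(model, examples):
    """Data engineering: score the batch and vend outputs downstream."""
    predictions = [model["majority"] for _ in examples]
    print(f"vended {len(predictions)} predictions to the data store")

def run_pipeline():
    # Orchestration: sourcing -> features -> training -> vending
    records = source_data()
    examples = engineer_features(records)
    model = train_model(examples)
    deploy_and_vend(model, examples)

run_pipeline()
```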

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 62
27843 Model Predictive Control of Three Phase Inverter for PV Systems

Authors: Irtaza M. Syed, Kaamran Raahemifar

Abstract:

This paper presents model predictive control (MPC) of a utility-interactive three-phase inverter (TPI) for a commercial-scale photovoltaic (PV) system. The proposed model uses a phase-locked loop (PLL) to synchronize the TPI with the power electric grid (PEG) and performs MPC in the dq reference frame. The TPI model consists of a boost converter (BC), maximum power point tracking (MPPT) control, and a three-leg voltage source inverter (VSI). An operational model of the VSI is used to synthesize sinusoidal currents and track the reference. The model is validated using a 35.7 kW PV system in Matlab/Simulink. Implementation and results demonstrate the simplicity, accuracy, and reliability of the model.
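To illustrate the control principle (a hedged sketch, not the authors' Matlab/Simulink implementation: it uses finite-control-set MPC in the αβ frame with assumed circuit parameters, whereas the paper works in the dq frame with PLL synchronization), one MPC step for a two-level three-leg VSI evaluates all eight switching states and applies the one minimizing the current-tracking error:

```python
import numpy as np

# Assumed parameters for the sketch (not from the paper)
Ts, R, L, Vdc = 1e-4, 0.1, 10e-3, 700.0  # sample time, R, L, DC-link voltage

# The eight switching states (Sa, Sb, Sc) of a two-level, three-leg VSI
STATES = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]

def phase_voltages(sa, sb, sc):
    # Load phase voltages with an isolated neutral
    va = Vdc / 3.0 * (2 * sa - sb - sc)
    vb = Vdc / 3.0 * (2 * sb - sa - sc)
    vc = Vdc / 3.0 * (2 * sc - sa - sb)
    return va, vb, vc

def clarke(va, vb, vc):
    # Amplitude-invariant abc -> alpha-beta transform
    return np.array([2.0 / 3.0 * (va - 0.5 * vb - 0.5 * vc),
                     (vb - vc) / np.sqrt(3.0)])

def mpc_step(i_meas, i_ref, e_grid):
    """One-step-ahead prediction with a forward-Euler R-L grid model;
    return the switching state that best tracks the current reference."""
    best, best_cost = None, np.inf
    for state in STATES:
        v = clarke(*phase_voltages(*state))
        i_pred = i_meas + Ts / L * (v - R * i_meas - e_grid)
        cost = np.linalg.norm(i_ref - i_pred)
        if cost < best_cost:
            best, best_cost = state, cost
    return best

# Example: demand 10 A along alpha against a 325 V grid voltage peak
print(mpc_step(np.zeros(2), np.array([10.0, 0.0]), np.array([325.0, 0.0])))
```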

Keywords: model predictive control, three-phase voltage source inverter, PV system, Matlab/Simulink

Procedia PDF Downloads 592
27842 Estimation of Chronic Kidney Disease Using Artificial Neural Network

Authors: Ilker Ali Ozkan

Abstract:

In this study, an artificial neural network model has been developed to estimate chronic kidney failure, a common disease. The patients' ages, blood and biochemical values, and various chronic diseases (24 inputs in total) are used for the estimation process. The input data were preprocessed because they contain both missing and nominal values. The 147 patient records obtained after preprocessing were divided into 70% training and 30% testing data. The artificial neural network model with 25 neurons in the hidden layer was found to be the model with the lowest error value. Using this model, chronic kidney failure was estimated accurately at a rate of 99.3%. The developed artificial neural network has thus proven successful for estimating chronic kidney failure from clinical data.
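A minimal scikit-learn sketch mirroring the reported setup (24 inputs, one hidden layer of 25 neurons, a 70/30 split) is given below; the synthetic data stand in for the 147 clinical records, so the printed accuracy will not reproduce the paper's 99.3%.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the 147 preprocessed patient records
X, y = make_classification(n_samples=147, n_features=24, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# One hidden layer with 25 neurons, as in the selected model
clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(25,), max_iter=2000,
                                  random_state=0))
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
```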

Keywords: estimation, artificial neural network, chronic kidney failure disease, disease diagnosis

Procedia PDF Downloads 444
27841 Digital Platforms: Creating Value through Network Effects under Pandemic Conditions

Authors: S. Łęgowik-Świącik

Abstract:

This article contributes to research into the determinants of value creation via digital platforms under variable operating conditions. The dynamics of the market environment caused by the COVID-19 pandemic have made enterprises built on digital platforms financially successful. While many classic companies struggle with the uncertainty of conducting business and with difficulties in the process of value creation, digital platforms create value by modifying their existing business models to meet the changing needs of customers. The objective of this publication is therefore to understand and explain the relationship between value creation and the conversion of business models built on digital platforms under pandemic conditions. The considerations relating to the conceptual framework and the research objective led to the hypothesis that the processes of value creation are evolving, and that measuring these processes allows the value created to be protected and enables its growth in changing circumstances. Critical literature analysis and the case study method were applied to accomplish the objective and verify the hypothesis. The empirical research was carried out on data from enterprises listed on the Nasdaq Stock Exchange: Amazon, Alibaba, and Facebook. The research period covered the years 2018-2021. The surveyed enterprises were chosen by targeted selection. The problem discussed is important and current, since the lack of in-depth theoretical research has resulted in few attempts to identify the determinants of value creation via digital platforms. The above arguments motivated a theoretical analysis and empirical research to fill the perceived gap by deepening the understanding of the process of value creation through network effects via digital platforms under pandemic conditions.

Keywords: business model, digital platforms, enterprise management, pandemic conditions, value creation process

Procedia PDF Downloads 128
27840 The Concept of an Agile Enterprise Research Model

Authors: Maja Sajdak

Abstract:

The aim of this paper is to present the concept of an agile enterprise model and to initiate discussion of the model's research assumptions. The implementation of the research project "The agility of enterprises in the process of adapting to the environment and its changes" began in August 2014 and is planned to last three years. The article takes the form of a work-in-progress paper intended to verify, and open a debate over, the proposed research model. There are very few publications in the literature relating to research into agility; the most controversial issue in this regard is the method of measuring agility. In previous studies the operationalization of agility was often fragmentary, focusing only on selected areas of agility, for example manufacturing, or analysing only selected sectors. As a result, the measures created to date can only be treated as contributions toward the development of precise measurement tools. This research project aims to fill a cognitive gap in the literature with regard to the conceptualization and operationalization of an agile company. The original contribution of the author is thus the construction of a theoretical model that integrates manufacturing agility (consisting mainly of adaptation to the environment) and strategic agility (based on proactive measures). The author is primarily interested in the attributes of an agile enterprise which indicate that the company is able to adapt rapidly to changing circumstances and to behave proactively.

Keywords: agile company, acuity, entrepreneurship, flexibility, research model, strategic leadership

Procedia PDF Downloads 343
27839 Evolution of Relations among Multiple Institutional Logics: A Case Study from a Higher Education Institution

Authors: Ye Jiang

Abstract:

To examine how the relationships among multiple institutional logics vary over time, and the factors that may shape this process, we conducted a 15-year in-depth longitudinal case study of a higher education institution, examining its exploration of college student management. Employing constructivist grounded theory, we developed a four-stage process model, comprising separation, formalization, selective bridging, and embeddedness, that shows how two contradictory logics become complementary and finally merge into a new hybridized logic. We argue that selective bridging is an important step in changing inter-logic relations. We also found that ambidextrous leadership and situational sensemaking are two key factors driving this process. Our contribution to the literature is threefold. First, we enhance the literature on the changing relationships among multiple institutional logics, and our findings advance the understanding of relationships between multiple logics through a dynamic view. While most studies have tended to assume that the relationship among logics is static and persistently contentious, we contend that the relationships among multiple institutional logics can change over time: competing logics can become complementary, and a new hybridized logic can emerge from them. The four-stage logic hybridization process model offers insights into the logic hybridization process, which is underexplored in the literature. Second, our research reveals that selective bridging is important in making conflicting logics compatible and thus constitutes a key step in creating new hybridized logic dynamics. Our findings suggest that the relations between multiple logics are manageable and can thus be shaped deliberately for organizational innovation. Finally, the factors influencing the variation in inter-logic relations enrich the understanding of the antecedents of these dynamics.

Keywords: institutional theory, institutional logics, ambidextrous leadership, situational sensemaking

Procedia PDF Downloads 163
27838 The Importance of Patenting and Technology Exports as Indicators of Economic Development

Authors: Hugo Rodríguez

Abstract:

The patenting of inventions is the result of an organized effort to achieve technological improvement, with a consequent positive impact on the population's standard of living. Technology exports, whether of high-tech goods or of information and communication technology (ICT) services, represent the level of acceptance that world markets have of technology acquired or developed by a country, in either public or private settings. A quantitative measure of the above variables is expected to have a positive and relevant impact on a country's level of economic development, measured here through the level of gross domestic product (GDP). In that sense, it explains not only the performance of an economy but also the differences between nations. We present an econometric model that seeks to explain the differences between the GDP levels of 178 countries through their different performance in the outputs of the technological production process. We take the variables of patenting, ICT exports and high-technology exports as results of the innovation process. The model achieves an explanatory power, for four annual cross-sections (2000, 2005, 2010 and 2015), equivalent to an adjusted R² of 0.91, 0.87, 0.91 and 0.96, respectively.
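The kind of cross-sectional regression described can be sketched as follows (synthetic data; the variable names and log-log functional form are assumptions for illustration, not the paper's exact specification):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 178  # countries, as in the study

# Synthetic innovation outputs and a GDP series generated from them
patents = rng.lognormal(3.0, 1.0, n)
ict_exports = rng.lognormal(2.0, 1.0, n)
hightech_exports = rng.lognormal(4.0, 1.0, n)
log_gdp = (0.4 * np.log(patents) + 0.2 * np.log(ict_exports)
           + 0.3 * np.log(hightech_exports) + rng.normal(0.0, 0.3, n))

# One annual cross-section: regress log GDP on the three outputs
X = sm.add_constant(np.column_stack(
    [np.log(patents), np.log(ict_exports), np.log(hightech_exports)]))
fit = sm.OLS(log_gdp, X).fit()
print(f"adjusted R^2: {fit.rsquared_adj:.3f}")
```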

Keywords: development, exports, patents, technology

Procedia PDF Downloads 109
27837 Simulation Study on Polymer Flooding with Thermal Degradation in Elevated-Temperature Reservoirs

Authors: Lin Zhao, Hanqiao Jiang, Junjian Li

Abstract:

Polymers injected into elevated-temperature reservoirs inevitably suffer thermal degradation, resulting in severe viscosity loss and poor flooding performance. However, for polymer flooding in such reservoirs, present simulators fail to provide accurate results because they lack a description of thermal degradation. In light of this, the objectives of this paper are to provide a simulation model for polymer flooding with thermal degradation and to study the effect of thermal degradation on polymer flooding in elevated-temperature reservoirs. First, a thermal degradation experiment was conducted to obtain the degradation law of polymer concentration and viscosity; different types of polymers were degraded in a thermostatic tank at elevated temperatures. Afterward, based on the obtained law, a streamline-assisted model was proposed to simulate the degradation process under in-situ flow conditions. Model validation was performed with field data from a well group of an offshore oilfield. Finally, the effect of thermal degradation on polymer flooding was studied using the proposed model. Experimental results showed that the polymer concentration remained unchanged, while the viscosity decayed exponentially with time after degradation. The polymer viscosity is functionally dependent on the polymer degradation time (PDT), which represents the elapsed time since the polymer particle was injected. Tracing the real flow path of each polymer particle is therefore required, which is why the presented simulation model is streamline-assisted: an equation relating PDT to the time of flight (TOF) along each streamline was built from the law of polymer particle transport. Based on field polymer samples and dynamic data, the new model proved accurate. The study of the degradation effect on polymer flooding indicated that: (1) the viscosity loss increases exponentially with TOF in the main body of the polymer slug and remains constant in the slug front; (2) the response time of polymer flooding is delayed, but the effective time is prolonged; (3) the breakthrough of subsequent water is eased; (4) the capacity of the polymer to adjust the injection profile is diminished; (5) the incremental recovery is reduced significantly. In general, the effect of thermal degradation on polymer flooding performance is rather negative. This paper provides a more comprehensive insight into polymer thermal degradation in both the physical process and field application. The proposed simulation model offers an effective means of simulating the polymer flooding process with thermal degradation. The negative effect of thermal degradation suggests that polymer thermal stability should be given full consideration when designing polymer flooding projects in elevated-temperature reservoirs.
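The experimentally observed law (exponential viscosity loss with polymer degradation time) together with the streamline identity PDT = TOF reduces to a two-line model; the coefficients below are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

mu0 = 25.0  # injected polymer viscosity [mPa·s] (assumed)
k_d = 0.01  # thermal degradation rate constant [1/day] (assumed)

def viscosity(pdt_days):
    """Exponential viscosity loss with polymer degradation time (PDT),
    as observed in the static degradation experiment."""
    return mu0 * np.exp(-k_d * pdt_days)

# Along a streamline, the PDT of a polymer particle at a given position
# equals its time of flight (TOF) from the injector to that position.
tof = np.linspace(0.0, 300.0, 7)  # days
print(viscosity(tof))  # viscosity profile along the slug
```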

Keywords: polymer flooding, elevated-temperature reservoir, thermal degradation, numerical simulation

Procedia PDF Downloads 138
27836 Proactive Competence Management for Employees: A Bottom-up Process Model for Developing Target Competence Profiles Based on the Employee's Tasks

Authors: Maximilian Cedzich, Ingo Dietz Von Bayer, Roland Jochem

Abstract:

In order for industrial companies to continue to succeed in dynamic, globalized markets, they must be able to train their employees in an agile manner and at short notice, in line with exogenous conditions as they arise. For this purpose, it is indispensable to operate a proactive competence management system for employees, one that recognizes qualification needs in time for them to be addressed promptly through qualification measures. However, hardly any approaches in the literature include systematic, proactive competence management. To help close this gap, this publication presents a process model that systematically develops future-oriented target competence profiles, bottom-up, from the tasks of the employees. Concretely, in the first step, the tasks of the individual employees are examined under assumed future conditions; in other words, qualitative scenarios are considered for the individual tasks to determine how they are likely to change. In a second step, these scenario-based future tasks are translated into individual future-oriented target competencies of the employee using a matrix of generic task properties. The final step validates the target competence profiles formed in this way within the framework of a management workshop. This process model provides industrial companies with a tool they can use to determine the competencies their own employees will require in the future and to compare them with the competencies actually present. If gaps are identified between target and actual, these qualification requirements can be closed in the short term by means of qualification measures.

Keywords: dynamic globalized markets, employee competence management, industrial companies, knowledge management

Procedia PDF Downloads 187
27835 Model Observability – A Monitoring Solution for Machine Learning Models

Authors: Amreth Chandrasehar

Abstract:

Machine learning (ML) models are developed and run in production to solve various use cases that help organizations become more efficient and drive the business. But this comes at a massive development cost and in lost business opportunities. According to a Gartner report, 85% of data science projects fail, and one of the contributing factors is insufficient attention to model observability. Model observability helps developers and operators pinpoint model performance issues, such as data drift, and helps identify the root cause of issues. This paper focuses on providing insights into incorporating model observability into model development and operationalizing it in production.
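As one concrete building block of such a platform, the sketch below flags data drift on a single feature by comparing the training distribution with live traffic via a two-sample Kolmogorov-Smirnov test; the threshold and alerting action are illustrative choices, not prescriptions from the paper.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
train_feature = rng.normal(0.0, 1.0, 5000)  # distribution seen at training
live_feature = rng.normal(0.4, 1.0, 5000)   # production traffic, shifted

stat, p_value = ks_2samp(train_feature, live_feature)
if p_value < 0.01:  # illustrative significance threshold
    print(f"drift detected (KS statistic = {stat:.3f}); alert model owner")
```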

Keywords: model observability, monitoring, drift detection, ML observability platform

Procedia PDF Downloads 110
27834 A Data-Driven Agent Based Model for the Italian Economy

Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio

Abstract:

We develop a data-driven agent-based model (ABM) for the Italian economy and calibrate its initial conditions and parameters. As a preliminary step, we replicate the Monte Carlo simulation for the Austrian economy. We then evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency, in terms of the disequilibrium patterns arising in the search-and-matching processes of the final goods, capital, intermediate goods, and credit markets. For this purpose, we use a randomized initial condition approach. We perform a robustness analysis, perturbing the system under different parameter setups. We explore the empirical properties of the model using a rolling-window forecast exercise from 2010 to 2022 to observe the model's forecasting ability, including in the wake of the COVID-19 pandemic. We analyse the properties of the model with different numbers of agents, that is, at different scales of the model relative to the real economy. The model generally displays transient dynamics that fit macroeconomic data well with regard to forecasting ability. We stress the model with a large set of shocks, namely interest-rate policy, fiscal policy, and exogenous factors such as external foreign demand for exports; in this way we can identify the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors, and consequently the underlying input-output sectoral interdependence, to stress the economy and observe the long-run projections. This allows the model to generate endogenous crises through the implied structural change, technological unemployment, and a potential lack of aggregate demand, creating the conditions for cyclical endogenous crises to be reproduced in this artificial economy.
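The rolling-window forecast exercise can be expressed as generic scaffolding, sketched below: re-estimate on each window, forecast one step ahead, and collect out-of-sample errors. The persistence forecaster is a stand-in for a full ABM simulation run; the window length and data are illustrative.

```python
import numpy as np

def rolling_forecast(series, window, forecast):
    """One-step-ahead forecasts on a rolling window; returns the
    out-of-sample errors used to judge forecasting ability."""
    errors = []
    for t in range(window, len(series)):
        history = series[t - window:t]
        errors.append(series[t] - forecast(history))
    return np.array(errors)

# Naive persistence benchmark as the stand-in forecaster
data = np.random.default_rng(1).normal(size=60)  # e.g. 60 quarters
errs = rolling_forecast(data, window=24, forecast=lambda h: h[-1])
print(f"out-of-sample RMSE: {np.sqrt(np.mean(errs ** 2)):.3f}")
```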

Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data

Procedia PDF Downloads 69
27833 Advanced Analysis on Dissemination of Pollutant Caused by Flaring System Effect Using Computational Fluid Dynamics (CFD) Fluent Model with WRF Model Input in Transition Season

Authors: Benedictus Asriparusa

Abstract:

In the oil industry, crude production is accompanied by associated natural gas. Flaring this gas wastes a large amount of energy, mostly in developing countries, and contributes to the global warming process. This research gives an overview of methods employed in the Minas area by researchers at PT. Chevron Pacific Indonesia to measure and drastically reduce gas flaring and its emissions, drawing on analytical studies, numerical studies, modeling, computer simulations, etc. A flaring system is the controlled burning of natural gas in the course of routine oil and gas production operations; the burning occurs at the end of a flare stack or boom. The combustion process releases emissions of greenhouse gases such as NO2, CO2 and SO2, which affect the air and environment around the industrial area. A simulation is therefore needed to map the pattern of the dissemination of pollutants. This research paper examines trends in gas flaring models and current developments in order to predict the dominant variable affecting the dissemination of pollutants. A Fluent model is used to simulate the distribution of pollutant gas leaving the stack, while WRF model output is used to overcome the limitations of the meteorological data and atmospheric conditions in the study area. The study focuses on the transition season of 2012 in the Minas area. The goal of the simulation is to find the exact time with the greatest influence on the dissemination of pollutants. The most influential factor falls under two main headings: the quickest wind and the slowest wind. According to the simulation results, the quickest wind moves horizontally and the slowest wind moves vertically.

Keywords: flaring system, fluent model, dissemination of pollutant, transition season

Procedia PDF Downloads 379
27832 Modelling Magnetohydrodynamics to Investigate Variation of Shielding Gases on Arc Characteristics in the GTAW Process

Authors: Stuart W. Campbell, Alexander M. Galloway, Norman A. McPherson, Duncan Camilleri, Daniel Micallef

Abstract:

Gas tungsten arc welding requires a gas shield to protect the arc area from contamination by atmospheric gases. Because each gas has its own unique thermophysical properties, the shielding gas selected can have a major influence on arc stability, welding speed, weld appearance and geometry, mechanical properties and fume generation. Alternating shielding gases is a relatively new method of discretely supplying two different shielding gases to the welding region in order to take advantage of the beneficial properties of each gas, as well as of the inherent pulsing effects generated. As part of an ongoing effort to fully evaluate the effects of this novel supply method, a computational fluid dynamics model has been generated that includes the gas-dependent thermodynamic and transport properties, in order to evaluate the effects that an alternating gas supply has on the arc plasma. Experimental trials have also been conducted to validate the model's arc profile predictions.

Keywords: alternating shielding gases, ANSYS CFX, gas tungsten arc welding (GTAW), magnetohydrodynamics (MHD)

Procedia PDF Downloads 435
27831 The Fluid Limit of the Critical Processor Sharing Tandem Queue

Authors: Amal Ezzidani, Abdelghani Ben Tahar, Mohamed Hanini

Abstract:

A sequence of finite tandem queues is considered in this study. Each queue has a single server, which operates under the egalitarian processor-sharing discipline. External customers arrive at each queue according to a renewal input process, with a general service-time distribution. Upon completing service, customers leave the current queue and enter the next. Under mild assumptions, including critical data, we prove the existence and uniqueness of the fluid solution. For the asymptotic behavior, we provide necessary and sufficient conditions for the invariant state and for convergence to this invariant state. Finally, we establish the convergence of a correctly normalized state process to a fluid limit characterized by a system of algebraic and integral equations.

Keywords: fluid limit, fluid model, measure valued process, processor sharing, tandem queue

Procedia PDF Downloads 319
27830 Spatial Behavioral Model-Based Dynamic Data-Driven Diagram Information Model

Authors: Chiung-Hui Chen

Abstract:

Diagrams and drawings are important means of communicating and reproducing architectural design. With the development of information and communication technology, professional thinking in architecture and interior design is also changing rapidly, and diagrams have always played a very important role in the design development process. Based on diagram theories, this study observes and records the interactions between people and objects, objects and space, and space and time in a modern nuclear family. It constructs a method by which diagrams can systematically and visually describe the space plan of a modern nuclear family, moving toward intelligent design and assisting designers in retrieving information and reviewing the event patterns of past and present.

Keywords: digital diagram, information model, context aware, data analysis

Procedia PDF Downloads 331
27829 All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model

Authors: S. A. Sadegh Zadeh, C. Kambhampati

Abstract:

Mathematical and computational modelling are necessary tools for reviewing, analysing, and predicting processes and events across a wide spectrum of scientific fields. In a field as rapidly developing as neuroscience, the combination of these two kinds of modelling can therefore play a significant role in guiding the direction the field takes. This paper combines mathematical and computational modelling to demonstrate a weakness in a highly valued model in neuroscience: it analyses the all-or-none principle in the Hodgkin-Huxley mathematical model. By implementing the Hodgkin-Huxley model computationally and applying the concept of the all-or-none principle, an investigation of the mathematical model was performed. The results clearly showed that the Hodgkin-Huxley mathematical model does not obey this fundamental law of neurophysiology when generating action potentials. This study shows that further mathematical work on the Hodgkin-Huxley model is needed in order to create a model without this weakness.
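The test described can be reproduced with a standard forward-Euler integration of the Hodgkin-Huxley equations (classic squid-axon parameters; the pulse timing and amplitude sweep are illustrative choices): sweep the stimulus and inspect the peak depolarisation, which under a strict all-or-none reading should jump abruptly rather than grade smoothly with stimulus strength.

```python
import numpy as np

# Classic Hodgkin-Huxley parameters (uF/cm^2, mS/cm^2, mV)
C_m, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.387

def a_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def b_n(V): return 0.125 * np.exp(-(V + 65) / 80)
def a_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def b_m(V): return 4.0 * np.exp(-(V + 65) / 18)
def a_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def b_h(V): return 1.0 / (1 + np.exp(-(V + 35) / 10))

def peak_response(I_amp, t_max=50.0, dt=0.01):
    """Forward-Euler HH simulation of a 1 ms current pulse; returns the
    peak membrane potential reached."""
    V, n, m, h = -65.0, 0.317, 0.053, 0.596  # resting state
    peak = V
    for step in range(int(t_max / dt)):
        t = step * dt
        I = I_amp if 5.0 <= t <= 6.0 else 0.0  # 1 ms pulse at t = 5 ms
        I_ion = (g_Na * m**3 * h * (V - E_Na)
                 + g_K * n**4 * (V - E_K) + g_L * (V - E_L))
        V += dt * (I - I_ion) / C_m
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        peak = max(peak, V)
    return peak

# Sweep the stimulus amplitude (uA/cm^2) across the firing threshold
for I in (2.0, 5.0, 7.0, 10.0, 20.0):
    print(f"I = {I:5.1f} -> peak V = {peak_response(I):6.1f} mV")
```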

Keywords: all-or-none, computational modelling, mathematical model, transmembrane voltage, action potential

Procedia PDF Downloads 614
27828 Design of Identification Based Adaptive Control for Fermentation Process in Bioreactor

Authors: J. Ritonja

Abstract:

Biochemical technology has been developing extremely fast since the middle of the last century, driven mainly by the demand for large-scale production of high-quality biologically manufactured products such as pharmaceuticals, foods, and beverages. The impact of the biochemical industry on the world economy is enormous, and its importance also drives intensive development in the scientific disciplines relevant to biochemical technology. In addition to developments in biology and chemistry, which make it possible to understand complex biochemical processes, development in the field of control theory and applications is also very important. In this paper, control of a biochemical reactor for milk fermentation was studied. During the fermentation process, the biophysical quantities must be precisely controlled to obtain a high-quality product; to control these quantities, the bioreactor's stirring drive and/or heating system can be used. Available commercial biochemical reactors are equipped with open-loop or conventional linear closed-loop control systems. Due to substantial parameter variations and the partial nonlinearity of the biochemical process, the results obtained with these control systems are not satisfactory. To improve the fermentation process, a self-tuning adaptive control system is proposed. Self-tuning adaptive control is suggested because the parameter variations of the studied biochemical process are in most cases very slow. To determine the linearized mathematical model of the fermentation process, the recursive least squares identification method was used. Based on the obtained mathematical model, the linear quadratic regulator was tuned. The parameter identification and the controller synthesis are executed on-line and adapt the controller's parameters to the dynamics of the fermentation process during operation. The proposed combination represents an original solution for the control of the milk fermentation process, and the purpose of the paper is to contribute to the progress of control systems for biochemical reactors. The proposed adaptive control system was tested thoroughly; the results show that it assures much better tracking of the reference signal than a conventional linear control system with fixed controller parameters.
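The identification half of the scheme can be sketched compactly: a recursive least squares (RLS) update with a forgetting factor tracks the parameters of a first-order ARX stand-in for the linearised fermentation model. The true parameters and noise level are illustrative, and the LQR synthesis that would follow from the identified model is omitted.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive least squares step with forgetting factor lam for
    the linearised model y = phi^T theta + e."""
    K = P @ phi / (lam + phi @ P @ phi)    # gain vector
    theta = theta + K * (y - phi @ theta)  # prediction-error correction
    P = (P - np.outer(K, phi @ P)) / lam   # covariance update
    return theta, P

# Identify y(k) = a*y(k-1) + b*u(k-1) on-line (illustrative plant)
rng = np.random.default_rng(0)
a_true, b_true = 0.9, 0.5
theta, P = np.zeros(2), 1e3 * np.eye(2)
y_prev, u_prev = 0.0, 0.0
for k in range(500):
    u = rng.normal()
    y = a_true * y_prev + b_true * u_prev + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, np.array([y_prev, u_prev]), y)
    y_prev, u_prev = y, u
print(theta)  # converges towards [0.9, 0.5]
```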

Keywords: adaptive control, biochemical reactor, linear quadratic regulator, recursive least square identification

Procedia PDF Downloads 124
27827 Constructing a Semi-Supervised Model for Network Intrusion Detection

Authors: Tigabu Dagne Akal

Abstract:

While advances in computer and communications technology have made networks ubiquitous, they have also rendered networked systems vulnerable to malicious attacks devised from a distance. These attacks or intrusions start with attackers infiltrating a network through a vulnerable host and then launching further attacks on the local network or intranet. Nowadays, system administrators and network professionals can attempt to prevent such attacks by developing intrusion detection tools and systems using data mining technology. In this study, the experiments were conducted following the Knowledge Discovery in Databases (KDD) process model, which starts with the selection of datasets. The dataset used in this study was obtained from the Massachusetts Institute of Technology Lincoln Laboratory. The data were then pre-processed; the major pre-processing activities included filling in missing values, removing outliers, resolving inconsistencies, integrating data containing both labelled and unlabelled records, dimensionality reduction, size reduction, and data transformation tasks such as discretization. A total of 21,533 intrusion records were used for training the models, and a separate 3,397 records were used as a testing set for validating the performance of the selected model. For building a predictive model for intrusion detection, the J48 decision tree and Naïve Bayes algorithms were tested as classification approaches, both with and without feature selection. The model created using 10-fold cross-validation and the J48 decision tree algorithm with default parameter values showed the best classification accuracy, with a prediction accuracy of 96.11% on the training set and 93.2% on the test set when classifying new instances into the normal, DOS, U2R, R2L and probe classes. The findings of this study show that data mining methods generate interesting rules that are crucial for intrusion detection and prevention in the networking industry. Future research directions are suggested toward an applicable system in the area of study.
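For orientation, the sketch below reproduces the shape of the experiment in scikit-learn: a CART decision tree (standing in for Weka's J48, which is not available in scikit-learn) and Naïve Bayes evaluated by 10-fold cross-validation on a synthetic five-class stand-in for the connection records; accuracies will not match the reported 96.11% / 93.2%.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the labelled connection records; the study's
# five classes are normal, DOS, U2R, R2L and probe.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=5, n_clusters_per_class=1,
                           random_state=0)

for clf in (DecisionTreeClassifier(random_state=0), GaussianNB()):
    scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation
    print(f"{type(clf).__name__}: mean accuracy = {scores.mean():.3f}")
```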

Keywords: intrusion detection, data mining, computer science

Procedia PDF Downloads 294
27826 Optimising the Reservoir Operation Using Water Resources Yield and Planning Model at Inanda Dam, uMngeni Basin

Authors: O. Nkwonta, B. Dzwairo, F. Otieno, J. Adeyemo

Abstract:

The effective management of water resources is of great importance for ensuring the supply of water to meet changing water requirements over a selected planning horizon in a sustainable and cost-effective way. Essentially, the purpose of the water resources planning process is to balance the available water resources in a system against the water requirements and losses to which the system is subjected. In such situations, a water resources yield and planning model can be used to resolve these difficulties. It has advantages over other models in managing model runs, developing a representative system network, modelling incremental sub-catchments, and offering a variety of standard system features, special modelling features, and run-result output options.

Keywords: complex, water resources, planning, cost effective, management

Procedia PDF Downloads 449
27825 A Framework for Event-Based Monitoring of Business Processes in the Supply Chain Management of Industry 4.0

Authors: Johannes Atug, Andreas Radke, Mitchell Tseng, Gunther Reinhart

Abstract:

In modern supply chains, large numbers of SKUs (stock-keeping units) must be managed in a timely fashion, and delays in noticing disruptions of items often limit the ability to cushion the impact on customer order fulfillment. In the supply chains of IoT-connected enterprises, however, the ERP (enterprise resource planning), MES (manufacturing execution system) and SCADA (supervisory control and data acquisition) systems generate large amounts of data, which generally give much earlier notice of deviations in the business process steps. That is, analyzing these data streams with process mining techniques allows the supply chain business processes to be monitored and thus items that deviate from the standard order fulfillment process to be identified. In this paper, a framework for enabling event-based SCM (supply chain management) processes is presented, including an overview of the core enabling technologies, based on the RAMI 4.0 (Reference Architecture Model for Industrie 4.0). The application of this framework in industry is presented, and implications for SCM in Industry 4.0 and further research are outlined.

Keywords: cyber-physical production systems, event-based monitoring, supply chain management, RAMI (Reference-Architecture-Model for Industrie 4.0)

Procedia PDF Downloads 236
27824 Continual Learning Using Data Generation for Hyperspectral Remote Sensing Scene Classification

Authors: Samiah Alammari, Nassim Ammour

Abstract:

When a massive number of tasks is provided successively to a deep learning process, good model performance requires preserving the data of previous tasks so the model can be retrained for each upcoming classification; otherwise, the model performs poorly due to the catastrophic forgetting phenomenon. To overcome this shortcoming, we developed a successful continual-learning deep model for the classification of remote sensing hyperspectral image regions. The proposed neural network architecture encapsulates two trainable subnetworks. The first module adapts its weights by minimizing the discrimination error between the land-cover classes while learning the new task, and the second module learns to replicate the data of the previous tasks by discovering the latent data structure of the new task's dataset. We conducted experiments on the Indian Pines HSI dataset. The results confirm the capability of the proposed method.

Keywords: continual learning, data reconstruction, remote sensing, hyperspectral image segmentation

Procedia PDF Downloads 264
27823 1D/3D Modeling of a Liquid-Liquid Two-Phase Flow in a Milli-Structured Heat Exchanger/Reactor

Authors: Antoinette Maarawi, Zoe Anxionnaz-Minvielle, Pierre Coste, Nathalie Di Miceli Raimondi, Michel Cabassud

Abstract:

Milli-structured heat exchanger/reactors have recently come into wide use, especially in the chemical industry, due to their enhanced heat and mass transfer performance compared to conventional apparatus. In our work, the 'DeanHex' heat exchanger/reactor with a 2D-meandering channel is investigated both experimentally and numerically. The square cross-sectioned channel has a hydraulic diameter of 2 mm. The aim of our study is to model the local physico-chemical phenomena (heat and mass transfer, axial dispersion, etc.) for a liquid-liquid two-phase flow in our lab-scale meandering channel, which represents the central part of the heat exchanger/reactor design. The numerical approach is based on a 1D model of the flow channel encapsulated in a 3D model of the surrounding solid, using COMSOL Multiphysics V5.5. The 1D treatment of the milli-channel reduces the calculation time significantly compared to 3D approaches, which are generally focused on local effects. Our 1D/3D approach thus intends to bridge the gap between simulation at the small scale and simulation at the reactor scale at a reasonable CPU cost. The heat transfer process between the 1D milli-channel and its 3D surroundings is modeled, and the feasibility of this 1D/3D coupling was verified by comparing simulation results to experimental ones from two previous works: temperature profiles along the channel axis obtained by simulation fit the experimental profiles in both cases. The next step is to integrate the liquid-liquid mass transfer model and to validate it with our experimental results. The hydrodynamics of the liquid-liquid two-phase system is modeled using the mixture-model approach, and the mass transfer behavior is represented by an overall volumetric mass transfer coefficient (kLa) correlation obtained from our experimental results in the millimetric meandering channel. The present work is a first step towards the scale-up of the 'DeanHex' in anticipation of future industrialization of such equipment. A generalized scaled-up model of the reactor comprising all the transfer processes will therefore be built in order to predict the performance of the reactor in terms of conversion rate and energy efficiency at an industrial scale.

Keywords: liquid-liquid mass transfer, milli-structured reactor, 1D/3D model, process intensification

Procedia PDF Downloads 130
27822 Numerical Approach for Characterization of Flow Field in Pump Intake Using Two Phase Model: Detached Eddy Simulation

Authors: Rahul Paliwal, Gulshan Maheshwari, Anant S. Jhaveri, Channamallikarjun S. Mathpati

Abstract:

Large pumping facilities are a necessary requirement of cooling water systems for power plants, process and manufacturing facilities, flood control, and water or wastewater treatment plants. With capacities ranging from a few hundred to 50,000 m3/hr, care must be taken to ensure uniform flow to the pump in order to limit vibration, flow-induced cavitation, and performance problems due to the formation of air-entraining vortices and swirl flow. Successful prediction of these phenomena requires a numerical method and a turbulence model that characterize the dynamics of these flows. In past years, single-phase shear stress transport (SST) Reynolds-averaged Navier-Stokes models (such as k-ε, k-ω and RSM) were used to predict the behavior of the flow, but the literature indicates that a two-phase model is more accurate than a single-phase one. In this paper, a 3D geometry simulated using detached eddy simulation (DES) is used to predict the behavior of the fluid, and the results are compared with experimental results. The effects of different grid structures and boundary conditions are also studied. It is observed that the two-phase flow model predicts the mean flow and turbulence statistics more accurately than the steady SST model. This validated model will be used for further analysis of vortex structures in a lab-scale model, to generate their frequency plots and intensities at different locations in the set-up. This study will help minimize the ill effects of vortices on pump performance.

Keywords: grid structure, pump intake, simulation, vibration, vortex

Procedia PDF Downloads 174
27821 A Comprehensive Finite Element Model for Incremental Launching of Bridges: Optimizing Construction and Design

Authors: Mohammad Bagher Anvari, Arman Shojaei

Abstract:

Incremental launching, a widely adopted bridge erection technique, offers numerous advantages for bridge designers. However, accurately simulating and modeling the dynamic behavior of the bridge during each step of the launching process is tedious and time-consuming. The perpetual variation of internal forces within the deck during the construction stages adds complexity, exacerbated further by other load cases such as support settlements and temperature effects. As a result, there is an urgent need for a reliable, simple, economical, and fast algorithmic solution for modeling bridge construction stages. This paper presents a novel finite element (FE) model focused on the static behavior of bridges during the launching process, together with a simple method for normalizing all quantities in the problem. The new FE model overcomes the limitations of previous models, enabling the simulation of all stages of launching, which conventional models fail to achieve due to their underlying assumptions. Leveraging the results obtained from the new FE model, this study proposes solutions to improve the accuracy of conventional models, particularly for the initial stages of bridge construction, which have been neglected in previous research. The research highlights the critical role played by the first span of the bridge during the initial stages, a factor often overlooked in existing studies; a new, simplified model, termed the 'semi-infinite beam' model, is developed to address this oversight. By utilizing this model alongside a simple optimization approach, optimal values for the launching nose specifications are derived. The practical applications of this study extend to optimizing the nose-deck system of incrementally launched bridges, providing valuable insights for practical use and benefiting both the construction industry and bridge designers.

Keywords: incremental launching, bridge construction, finite element model, optimization

Procedia PDF Downloads 99
27820 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies

Authors: Monica Lia

Abstract:

This article presents a customer data analysis model that uses business intelligence tools for data modelling, transformation, visualization and dynamic report building. The analysis of an economic organization's customers is based on information from the organization's transactional systems. The paper shows how to develop the data model starting from the data that companies already hold in their own operational systems; these data can be transformed into useful information about customers using a business intelligence tool. In a mature market, extracting the information hidden in the data and producing forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.

Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use case diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes

Procedia PDF Downloads 428
27819 Development of Natural Zeolites Adsorbent: Preliminary Study on Water-Isopropyl Alcohol Adsorption in a Close-Loop Continuous Adsorber

Authors: Sang Kompiang Wirawan, Pandu Prabowo Jati, I Wayan Warmada

Abstract:

Klaten (Indonesia) natural zeolite can be used as a powder or pellet adsorbent. Here, pellet adsorbent was made from activated natural zeolite powder by a conventional pressing method, with starch and formaldehyde added as binders to strengthen the pellet structure. To increase its absorptivity and capacity, the natural zeolite was first activated chemically and thermally. This research examined the adsorption of water from an isopropyl alcohol (IPA)-water system using zeolite pellets made from natural zeolite powder (-80 mesh) activated with 0.1 M and 0.3 M H2SO4. The adsorbent was pelleted in a pressing apparatus at a fixed pressure to dimensions of 1.96 cm diameter and 0.68 cm thickness. The IPA-water system contained 80% isopropyl alcohol. The adsorption process was carried out in a closed-loop continuous apparatus, in which the zeolite pellets were packed inside a column while the IPA-water solution was circulated at a fixed flow rate, and the change in concentration was monitored over time. The adsorption process comprises mass transfer from the bulk liquid into the film layer and from the film layer into the solid particle. The rate constant was analysed using a first-order model simulated with MATLAB. Besides the first-order model, an intra-particle diffusion model based on pore diffusion was proposed. The study shows that the adsorbent activated with 0.1 M H2SO4 has good absorptivity, with a mass transfer constant of 0.1286 min-1.
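The first-order analysis amounts to fitting the fractional attainment of equilibrium, F(t) = 1 - exp(-kt), to the measured concentration change. The sketch below does this with SciPy instead of MATLAB, on illustrative data points chosen to be roughly consistent with the reported constant of 0.1286 min-1.

```python
import numpy as np
from scipy.optimize import curve_fit

def fractional_attainment(t, k):
    # First-order approach to equilibrium: F(t) = 1 - exp(-k t)
    return 1.0 - np.exp(-k * t)

# Illustrative measurements: time [min] vs fractional water uptake
t = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 60.0])
F = np.array([0.0, 0.47, 0.72, 0.93, 0.99, 1.0])

(k_fit,), _ = curve_fit(fractional_attainment, t, F, p0=[0.1])
print(f"mass transfer constant k = {k_fit:.4f} min^-1")
```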

Keywords: intra-particle diffusion, fractional attainment, first order isotherm, zeolite

Procedia PDF Downloads 308
27818 Recovery of Fried Soybean Oil Using Bentonite as an Adsorbent: Optimization, Isotherm and Kinetics Studies

Authors: Prakash Kumar Nayak, Avinash Kumar, Uma Dash, Kalpana Rayaguru

Abstract:

Soybean oil is one of the most widely consumed cooking oils worldwide. Deep-fat frying of foods at high temperatures adds unique flavour, golden-brown colour and crispy texture to foods, but it also brings about changes in the oil, including hydrolysis, oxidation, hydrogenation and thermal alteration. The peroxide value (PV) is one of the most important factors affecting the quality of deep-fat-fried oil. Using bentonite as an adsorbent, the PV can be reduced, thereby improving the quality of the soybean oil. In this study, operating parameters such as oil heating time (10, 15, 20, 25 and 30 h), contact time (5, 10, 15, 20 and 25 h) and adsorbent concentration (0.25, 0.5, 0.75, 1.0 and 1.25 g/100 ml of oil) were optimized by response surface methodology (RSM), with the percentage reduction of PV as the response. Adsorption data were analysed by fitting the Langmuir and Freundlich isotherm models; the results show that the Langmuir model gives the best fit. The adsorption process was also found to follow a pseudo-second-order kinetic model.
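A minimal sketch of the isotherm comparison: fit the Langmuir and Freundlich forms to equilibrium data by nonlinear least squares and compare R². The data points and initial guesses are illustrative, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    return qmax * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    return KF * Ce ** (1.0 / n)

# Illustrative equilibrium data: Ce vs amount adsorbed qe
Ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
qe = np.array([1.8, 2.9, 4.1, 5.0, 5.5])

for name, model, p0 in (("Langmuir", langmuir, [6.0, 1.0]),
                        ("Freundlich", freundlich, [2.0, 2.0])):
    popt, _ = curve_fit(model, Ce, qe, p0=p0)
    ss_res = np.sum((qe - model(Ce, *popt)) ** 2)
    r2 = 1.0 - ss_res / np.sum((qe - qe.mean()) ** 2)
    print(f"{name}: parameters = {popt.round(3)}, R^2 = {r2:.4f}")
```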

Keywords: bentonite, Langmuir isotherm, peroxide value, RSM, soybean oil

Procedia PDF Downloads 372
27817 Multivariate Statistical Process Monitoring of Base Metal Flotation Plant Using Dissimilarity Scale-Based Singular Spectrum Analysis

Authors: Syamala Krishnannair

Abstract:

A multivariate statistical process monitoring methodology using dissimilarity-scale-based singular spectrum analysis (SSA) is proposed for the detection and diagnosis of process faults in a base metal flotation plant. Process faults are detected through a multi-level decomposition of the process signals by SSA, using the dissimilarity structure of the process data, followed by monitoring of the multiscale signals with a unified monitoring index that combines T² with SPE. Contribution plots are used to identify the root causes of the process faults. Overall, the proposed technique outperformed conventional multivariate techniques in detecting and diagnosing the process faults in the flotation plant.
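The unified monitoring idea (a statistic inside the model subspace, T², plus a residual statistic, SPE) can be illustrated with a plain PCA model standing in for the paper's SSA multiscale decomposition; the training data, component count and fault magnitude below are illustrative.

```python
import numpy as np

def fit_monitor(X_train, n_comp):
    """Fit a PCA monitoring model; return a scorer yielding (T^2, SPE)."""
    mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
    Z = (X_train - mu) / sigma
    _, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                          # retained loadings
    lam = S[:n_comp] ** 2 / (len(Z) - 1)       # score variances

    def score(x):
        z = (x - mu) / sigma
        t = P.T @ z
        T2 = float(np.sum(t ** 2 / lam))       # variation inside the model
        SPE = float(np.sum((z - P @ t) ** 2))  # residual outside the model
        return T2, SPE
    return score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                  # normal operating data
score = fit_monitor(X, n_comp=3)
print("normal sample:", score(X[0]))
print("faulted sample:", score(X[0] + 5.0))    # large T2/SPE flags a fault
```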

Keywords: fault detection, fault diagnosis, process monitoring, dissimilarity scale

Procedia PDF Downloads 207