Search results for: process optimizing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15690

15450 Research on Straightening Process Model Based on Iteration and Self-Learning

Authors: Hong Lu, Xiong Xiao

Abstract:

Shaft parts are widely used in the machinery industry; however, bending deformation often occurs when such parts are heat treated, and they must be straightened to meet straightness requirements. In the pressure straightening process, a good straightening stroke algorithm determines the precision and efficiency of the operation. In this paper, the relationship between straightening load and deflection during the straightening process is analyzed, and a mathematical model of the straightening process is established. Using this model, an iterative method is applied to solve for the straightening stroke. Compared to the traditional straightening stroke algorithm, the stroke calculated by this method is considerably more precise, because it can adapt to changes in material performance parameters. Considering that straightening is widely used in the mass production of shaft parts, a knowledge base is used to store the data of the straightening process, and a straightening stroke algorithm based on empirical data is set up. A straightening process control model is then established that combines the iteration-based stroke method with the empirical-data-based stroke algorithm. Finally, an experiment is designed to verify the straightening process control model.
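The stroke iteration described above can be sketched as follows; the bilinear load-stroke curve, the secant solver, and every parameter value are illustrative assumptions standing in for the paper's actual load-deflection model:

```python
def press_load(s, k=100.0, s_y=1.0, k_h=10.0):
    """Toy bilinear elastic-plastic load-stroke curve (illustrative only)."""
    if s <= s_y:
        return k * s                        # elastic loading branch
    return k * s_y + k_h * (s - s_y)        # plastic hardening branch

def residual_correction(s, k=100.0):
    """Permanent deflection left after elastic springback (unloading slope k)."""
    return s - press_load(s) / k

def solve_stroke(target, s0=1.0, s1=3.0, tol=1e-9, max_iter=50):
    """Secant iteration for the stroke whose residual correction equals target."""
    f0 = residual_correction(s0) - target
    f1 = residual_correction(s1) - target
    for _ in range(max_iter):
        if abs(f1) < tol:
            break
        # secant update; RHS uses the previous two iterates
        s0, f0, s1 = s1, f1, s1 - f1 * (s1 - s0) / (f1 - f0)
        f1 = residual_correction(s1) - target
    return s1
```

In the paper's scheme, the model parameters would additionally be updated from the knowledge base of past strokes rather than held fixed as they are here.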

Keywords: straightness, straightening stroke, deflection, shaft parts

Procedia PDF Downloads 322
15449 Enhancement in Digester Efficiency and Numerical Analysis for Optimal Design Parameters of Biogas Plant Using Design of Experiment Approach

Authors: Rajneesh, Priyanka Singh

Abstract:

Biomass resources have been one of the main energy sources for mankind since the dawn of civilization. There is vast scope to convert these energy sources into biogas, a clean, low-carbon technology for the efficient management and conversion of fermentable organic wastes into a cheap and versatile fuel and bio/organic manure. Thus, in order to enhance the performance of an anaerobic digester, an optimization analysis of the resultant parameters (organic dry matter (oDM) content, methane percentage, and biogas yield) has been carried out for a plug flow anaerobic digester operating under mesophilic conditions (20-40°C) with the wet fermentation process. Based on this analysis, correlations for oDM, methane percentage, and biogas yield are derived using multiple regression analysis. A statistical model is developed to correlate the operating variables using the design of experiment approach, selecting the central composite design (CCD) of response surface methodology. The results indicate that as the operating temperature increases, digester efficiency improves, provided that the pH and hydraulic retention time (HRT) remain constant. Working in an optimized range of carbon-nitrogen ratio for the plug flow digester, the output parameters show a positive change with the variation of dry matter content (DM).
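Deriving such correlations by multiple regression reduces to ordinary least squares on the designed runs. The sketch below is a minimal stand-in for the paper's CCD analysis: it fits a linear model to a synthetic response on a two-factor face-centred design in coded units, and all numbers are invented for illustration:

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (Gaussian elimination)."""
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for col in range(p):                       # forward elimination with pivoting
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * p                           # back substitution
    for i in range(p - 1, -1, -1):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, p))) / A[i][i]
    return beta

# face-centred CCD-style runs in coded units (temperature, HRT), synthetic response
runs = [(-1, -1), (1, -1), (-1, 1), (1, 1), (0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]
y = [2 + 3 * t + 0.5 * h for t, h in runs]    # invented biogas-yield response
X = [[1.0, t, h] for t, h in runs]
beta = fit_linear(X, y)                        # recovers [2, 3, 0.5]
```

A real CCD analysis would add quadratic and interaction columns to X; the solver is unchanged.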

Keywords: biogas, digester efficiency, design of experiment, plug flow digester

Procedia PDF Downloads 373
15448 Removal of Toxic Ni++ Ions from Wastewater by Nano-Bentonite

Authors: A. M. Ahmed, Mona A. Darwish

Abstract:

Removal of Ni++ ions from aqueous solution by sorption onto nano-bentonite was investigated. Experiments were carried out as a function of the amount of nano-bentonite, pH, metal concentration, contact time, agitation speed, and temperature. Langmuir and Freundlich adsorption isotherm models were applied to analyze the adsorption data, and both were found to be applicable to the adsorption process. The adsorption kinetics fit a pseudo-second-order model. Thermodynamic parameters, e.g., ΔG°, ΔS°, and ΔH°, of the adsorption process were also calculated, and the sorption was found to be endothermic. Finally, nano-bentonite was found to be effective for the removal of Ni(II) under the experimental conditions studied.
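The pseudo-second-order fit mentioned above is commonly done on the linearised form t/q_t = 1/(k2*qe^2) + t/qe; the sketch below recovers qe and k2 from synthetic uptake data (the values are illustrative, not the paper's measurements):

```python
def linear_fit(x, y):
    """Least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def pso_parameters(t, q):
    """Pseudo-second-order fit on the linearised form t/q = 1/(k2*qe**2) + t/qe."""
    slope, intercept = linear_fit(t, [ti / qi for ti, qi in zip(t, q)])
    qe = 1.0 / slope                    # slope is 1/qe
    k2 = slope * slope / intercept      # intercept is 1/(k2*qe**2)
    return qe, k2

# synthetic uptake curve generated from qe = 5 mg/g, k2 = 0.2 g/(mg*min)
qe_true, k2_true = 5.0, 0.2
t = [1.0, 2.0, 5.0, 10.0, 20.0, 40.0]
q = [(k2_true * qe_true**2 * ti) / (1 + k2_true * qe_true * ti) for ti in t]
qe_fit, k2_fit = pso_parameters(t, q)
```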

Keywords: waste water, nickel, bentonite, adsorption

Procedia PDF Downloads 253
15447 Assessment of Factors Influencing Business Process Harmonization: A Case Study in an Industrial Company

Authors: J. J. M. Trienekens, H. L. Romero, L. Cuenca

Abstract:

While process harmonization is increasingly mentioned and unanimously associated with several benefits, there is a need for more understanding of how it contributes to business process redesign and improvement. This paper presents the application, in an industrial case study, of a conceptual harmonization model of the relationship between drivers and effects of process harmonization. The drivers are called contextual factors, which influence harmonization. Assessing these contextual factors in a particular business domain clarifies the extent of harmonization that can be achieved, or should be aimed for. The case study shows how the conceptual harmonization model can be made operational and can act as a valuable assessment tool. From both qualitative and quantitative assessment results, insights are discussed on the extent of harmonization that can be achieved, and action plans are defined for business (process) harmonization.

Keywords: case study, contextual factors, process harmonization, industrial company

Procedia PDF Downloads 387
15446 A Distributed Smart Battery Management System (sBMS) for Stationary Energy Storage Applications

Authors: António J. Gano, Carmen Rangel

Abstract:

Currently, electric energy storage systems for stationary applications have attracted increasing interest, namely with the integration of local renewable energy power sources into energy communities. Li-ion batteries are considered the leading electric storage devices to achieve this integration, and Battery Management Systems (BMS) are decisive for their control and optimum performance. In this work, the development of a smart BMS (sBMS) prototype with a modular distributed topology is described. The system, still under development, has a distributed architecture with modular characteristics to operate with different battery pack topologies and charge capacities, integrating adaptive algorithms for functional-state real-time monitoring and management of multicellular Li-ion batteries, and is intended for application in the context of a local energy community fed by renewable energy sources. This sBMS includes several developed hardware units: (1) cell monitoring units (CMUs) for interfacing with each individual cell or module within the battery pack; (2) a battery monitoring and switching unit (BMU) for global battery pack monitoring, thermal control, and functional operating-state switching; (3) a main management and local control unit (MCU) for local sBMS management and control, also serving as a communications gateway to external systems and devices. This architecture is fully expandable to battery packs with a large number of cells or modules interconnected in series, as the several units have local data acquisition and processing capabilities, communicate over a standard CAN bus, and will be able to operate almost autonomously. The CMUs are intended for use with Li-ion cells but can be used with other cell chemistries with output voltages within the 2.5 to 5 V range. The characteristics and specifications of the different units are described, including the implemented hardware solutions.
The developed hardware supports both passive and active methods for charge equalization, considered fundamental functionalities for optimizing the performance and useful lifetime of a Li-ion battery pack. The functional characteristics of the different units of this sBMS, including data acquisition of different process variables using a flexible set of sensors, can support the development of custom algorithms for estimating the parameters defining the functional states of the battery pack (State-of-Charge, State-of-Health, etc.), as well as different charge-equalizing strategies and algorithms. This sBMS is intended to interface with other systems and devices using standard communication protocols, such as those used by the Internet of Things. In the future, this sBMS architecture can evolve to a fully decentralized topology, with all units using Wi-Fi protocols and forming a mesh network, making the MCU unit unnecessary. The status of the work in progress is reported, leading to conclusions on the system executed so far, considering the implemented hardware solution not only as a fully functional, advanced, and configurable battery management system but also as a platform for developing custom algorithms and optimization strategies to achieve better performance of stationary electric energy storage devices.
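As one example of the custom state-estimation algorithms such a platform could host, a minimal Coulomb-counting State-of-Charge update is sketched below; the pack capacity, current sign convention, and update rate are assumptions, not the sBMS specification:

```python
def update_soc(soc, current_a, dt_s, capacity_ah):
    """Coulomb counting: integrate pack current into State-of-Charge.
    Positive current means discharge; SOC is clamped to [0, 1]."""
    delta = current_a * dt_s / 3600.0 / capacity_ah
    return min(1.0, max(0.0, soc - delta))

# hypothetical 100 Ah stationary pack discharged at a constant 20 A for 1 h,
# updated once per second as a CMU/BMU sampling loop might do
soc = 0.9
for _ in range(3600):
    soc = update_soc(soc, current_a=20.0, dt_s=1.0, capacity_ah=100.0)
# soc has dropped by 0.2 of the pack capacity
```

In practice, pure Coulomb counting drifts with sensor bias and would be corrected against an open-circuit-voltage model or a filter-based estimator.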

Keywords: Li-ion battery, smart BMS, stationary electric storage, distributed BMS

Procedia PDF Downloads 96
15445 Selecting the Best Software Product Using Analytic Hierarchy Process and Fuzzy-Analytic Hierarchy Process Modules

Authors: Anas Hourani, Batool Ahmad

Abstract:

Software applications play an important role inside any institute. They are employed to manage all processes and to store entity-related data in the computer. Therefore, choosing the right software product that meets institute requirements is not an easy decision, in view of the multiple criteria, different points of view, and many standards to consider. As a case study, Mutah University, located in Jordan, is in essential need of customized software, and several companies presented software products that are very similar in quality. In this regard, an analytic hierarchy process (AHP) model and a fuzzy analytic hierarchy process (Fuzzy-AHP) model are proposed in this research to identify the most suitable and best-fit software product that meets the institute's requirements. The results indicate that both models are able to help decision-makers reach a decision, especially in complex decision problems.
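The core AHP computation, deriving a priority vector from a pairwise-comparison matrix and checking its consistency, can be sketched as follows; the geometric-mean approximation and the example matrix are illustrative, not the paper's data:

```python
import math

def ahp_priorities(M):
    """Priority vector of a pairwise-comparison matrix (geometric-mean method)."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_index(M, w):
    """CI = (lambda_max - n) / (n - 1); zero for a perfectly consistent matrix."""
    n = len(M)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    return (lam - n) / (n - 1)

# example: three candidate products compared pairwise on one criterion
M = [[1, 2, 4],
     [0.5, 1, 2],
     [0.25, 0.5, 1]]
w = ahp_priorities(M)     # weights 4/7, 2/7, 1/7 for this consistent matrix
```

The Fuzzy-AHP variant replaces the crisp entries of M with triangular fuzzy numbers before deriving weights; the aggregation step above stays structurally the same.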

Keywords: analytic hierarchy process, decision modeling, fuzzy analytic hierarchy process, software product

Procedia PDF Downloads 386
15444 Potential of Mineral Composition Reconstruction for Monitoring the Performance of an Iron Ore Concentration Plant

Authors: Maryam Sadeghi, Claude Bazin, Daniel Hodouin, Laura Perez Barnuevo

Abstract:

The performance of a separation process is usually evaluated using performance indices calculated from elemental assays readily available from the chemical analysis laboratory. However, the separation process performance is essentially related to the properties of the minerals that carry the elements, not those of the elements themselves. Since elements or metals can be carried by both valuable and gangue minerals in the ore, and each mineral responds differently to a mineral processing method, the use of elemental assays alone could lead to erroneous or uncertain conclusions about process performance. This paper discusses the advantages of using performance indices calculated from mineral content, such as mineral recovery, for process performance assessment. A method is presented that uses elemental assays to estimate the mineral content of the solids in various process streams. The method combines the stoichiometric composition of the minerals with mass conservation constraints for the minerals through the concentration process to estimate mineral content from elemental assays. The advantage of assessing a concentration process using mineral-based performance indices is illustrated for an iron ore concentration circuit.
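The estimation step can be sketched as a least-squares inversion of the stoichiometric matrix; the two-mineral hematite/quartz example below is illustrative and omits the data-reconciliation constraints the paper adds across process streams:

```python
def minerals_from_assays(A, b):
    """Least-squares mineral fractions x from elemental assays b = A x,
    where A[i][j] is the mass fraction of element i in mineral j.
    Normal equations (A^T A) x = A^T b, solved here for the 2-mineral case."""
    m, n = len(A), len(A[0])
    ata = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)] for i in range(n)]
    atb = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x0 = (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det
    x1 = (ata[0][0] * atb[1] - ata[1][0] * atb[0]) / det
    return [x0, x1]

# hematite (Fe2O3, 69.94% Fe) and quartz (SiO2, 46.74% Si)
A = [[0.6994, 0.0],      # Fe content of each mineral
     [0.0, 0.4674]]      # Si content of each mineral
b = [0.48958, 0.14022]   # hypothetical measured Fe and Si assays of a stream
x = minerals_from_assays(A, b)   # mineral mass fractions: 70% hematite, 30% quartz
```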

Keywords: data reconciliation, iron ore concentration, mineral composition, process performance assessment

Procedia PDF Downloads 211
15443 Machine Learning-Assisted Selective Emitter Design for Solar Thermophotovoltaic System

Authors: Ambali Alade Odebowale, Andargachew Mekonnen Berhe, Haroldo T. Hattori, Andrey E. Miroshnichenko

Abstract:

Solar thermophotovoltaic systems (STPV) have emerged as a promising solution to overcome the Shockley-Queisser limit, a significant impediment in the direct conversion of solar radiation into electricity using conventional solar cells. The STPV system comprises essential components such as an optical concentrator, selective emitter, and a thermophotovoltaic (TPV) cell. The pivotal element in achieving high efficiency in an STPV system lies in the design of a spectrally selective emitter or absorber. Traditional methods for designing and optimizing selective emitters are often time-consuming and may not yield highly selective emitters, posing a challenge to the overall system performance. In recent years, the application of machine learning techniques in various scientific disciplines has demonstrated significant advantages. This paper proposes a novel nanostructure composed of four-layered materials (SiC/W/SiO2/W) to function as a selective emitter in the energy conversion process of an STPV system. Unlike conventional approaches widely adopted by researchers, this study employs a machine learning-based approach for the design and optimization of the selective emitter. Specifically, a random forest algorithm (RFA) is employed for the design of the selective emitter, while the optimization process is executed using genetic algorithms. This innovative methodology holds promise in addressing the challenges posed by traditional methods, offering a more efficient and streamlined approach to selective emitter design. The utilization of a machine learning approach brings several advantages to the design and optimization of a selective emitter within the STPV system. Machine learning algorithms, such as the random forest algorithm, have the capability to analyze complex datasets and identify intricate patterns that may not be apparent through traditional methods. 
This allows for a more comprehensive exploration of the design space, potentially leading to highly efficient emitter configurations. Moreover, the application of genetic algorithms in the optimization process enhances the adaptability and efficiency of the overall system. Genetic algorithms mimic the principles of natural selection, enabling the exploration of a diverse range of emitter configurations and facilitating the identification of optimal solutions. This not only accelerates the design and optimization process but also increases the likelihood of discovering configurations that exhibit superior performance compared to traditional methods. In conclusion, the integration of machine learning techniques in the design and optimization of a selective emitter for solar thermophotovoltaic systems represents a groundbreaking approach. This innovative methodology not only addresses the limitations of traditional methods but also holds the potential to significantly improve the overall performance of STPV systems, paving the way for enhanced solar energy conversion efficiency.
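A minimal genetic-algorithm loop of the kind used for the emitter optimization is sketched below; the toy fitness function stands in for the electromagnetic evaluation of the SiC/W/SiO2/W stack (and, in the paper, for the random-forest surrogate), so all values are illustrative:

```python
import random

def fitness(layers, target=0.75):
    """Toy selectivity merit: peaks when the mean layer thickness hits target.
    A hypothetical stand-in for a full spectral emissivity calculation."""
    mean_t = sum(layers) / len(layers)
    return -(mean_t - target) ** 2

def genetic_optimize(n_layers=4, pop_size=30, generations=60, seed=1):
    """Elitist GA with one-point crossover and Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0.0, 1.5) for _ in range(n_layers)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)   # best individuals first
        survivors = pop[: pop_size // 2]      # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_layers)  # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:            # occasional mutation
                i = rng.randrange(n_layers)
                child[i] = min(1.5, max(0.0, child[i] + rng.gauss(0.0, 0.05)))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = genetic_optimize()
```

In the paper's workflow, `fitness` would instead query the trained random-forest model, so each generation costs surrogate predictions rather than full simulations.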

Keywords: emitter, genetic algorithm, radiation, random forest, thermophotovoltaic

Procedia PDF Downloads 59
15442 Optimization of Alkali Silicate Glass Heat Treatment for the Improvement of Thermal Expansion and Flexural Strength

Authors: Stephanie Guerra-Arias, Stephani Nevarez, Calvin Stewart, Rachel Grodsky, Denis Eichorst

Abstract:

The objective of this study is to describe the framework for optimizing the heat treatment of alkali silicate glasses to enhance the performance of hermetic seals in extreme environments. When connectors are exposed to elevated temperatures, residual stresses develop due to the mismatch of thermal expansions between the glass, metal pin, and metal shell. Excessive thermal expansion mismatch compromises the reliability of hermetic seals. In this study, a series of heat treatment schedules will be performed on two commercial sealing glasses (one conventional sealing glass and one crystallizable sealing glass) using a design of experiments (DOE) approach. The coefficient of thermal expansion (CTE) will be measured pre- and post-heat treatment using thermomechanical analysis (TMA). Afterwards, the flexural strength of the specimens will be measured using a four-point bend fixture mounted in a static universal testing machine. The measured material properties will be statistically analyzed using Minitab software to determine which factors of the heat treatment process have a strong correlation with the coefficient of thermal expansion and/or flexural strength. Finally, a heat treatment schedule will be designed and tested to ensure the optimal performance of the hermetic seals in connectors.

Keywords: glass-ceramics, design of experiment, hermetic connectors, material characterization

Procedia PDF Downloads 146
15441 Study the Effect of Friction on Barreling Behavior during Upsetting Process Using Anand Model

Authors: H. Mohammadi Majd, M. Jalali Azizpour, V. Tavaf, A. Jaderi

Abstract:

In upsetting processes, contact friction significantly influences metal flow, the stress-strain state, and process parameters. Furthermore, tribological conditions influence workpiece deformation and its dimensional precision. A viscoplastic constitutive law, the Anand model, was applied to represent the inelastic deformation behavior in the upsetting process. This paper presents research results on the influence of the contact friction coefficient on workpiece deformation in upsetting, obtained from finite element simulations. The technique was tested in simulations of the upsetting of three different specimens and the corresponding materials, and can be successfully employed to predict the deformation of the upsetting process.
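The Anand flow rule at the heart of the constitutive law can be written directly as eps_dot = A exp(-Q/(R T)) [sinh(xi sigma / s)]^(1/m); the parameter values below are purely illustrative and not fitted to the paper's material:

```python
import math

def anand_strain_rate(sigma, s, T, A=1e7, Q=1.5e5, R=8.314, xi=7.0, m=0.3):
    """Anand viscoplastic flow rule.
    sigma: equivalent stress [MPa], s: deformation resistance [MPa], T: [K].
    A [1/s], Q [J/mol], xi, m are material parameters (illustrative values)."""
    return A * math.exp(-Q / (R * T)) * math.sinh(xi * sigma / s) ** (1.0 / m)
```

In a full implementation, the deformation resistance s itself evolves with accumulated inelastic strain; here it is held as an input for clarity.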

Keywords: friction, upsetting, barreling, Anand model

Procedia PDF Downloads 330
15440 Non-Dominated Sorting Genetic Algorithm (NSGA-II) for the Redistricting Problem in Mexico

Authors: Antonin Ponsich, Eric Alfredo Rincon Garcia, Roman Anselmo Mora Gutierrez, Miguel Angel Gutierrez Andrade, Sergio Gerardo De Los Cobos Silva, Pedro Lara Velzquez

Abstract:

The electoral zone design problem consists in redrawing the boundaries of legislative districts for electoral purposes in such a way that federal or state requirements are fulfilled. In Mexico, this process has historically been carried out by the National Electoral Institute (INE), by optimizing an integer nonlinear programming model in which population equality and compactness of the designed districts are considered as two conflicting objective functions, while contiguity is included as a hard constraint. The solution technique used by the INE is a Simulated Annealing (SA) based algorithm, which handles the multi-objective nature of the problem through an aggregation function. The present work represents the first attempt to apply a classical Multi-Objective Evolutionary Algorithm (MOEA), the second version of the Non-dominated Sorting Genetic Algorithm (NSGA-II), to this hard combinatorial problem. First results show that, when compared with the SA algorithm, the NSGA-II obtains promising results. The MOEA manages to produce well-distributed solutions over a wide-spread front, even though some convergence troubles for some instances constitute an issue that should be corrected in future adaptations of MOEAs to the redistricting problem.
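The fast non-dominated sorting step that gives NSGA-II its name can be sketched as follows, here on toy two-objective points (population equality and compactness would both be minimised; the values are illustrative):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(points):
    """Fast non-dominated sorting: partition points into successive Pareto fronts."""
    n = len(points)
    dominated_by = [[] for _ in range(n)]   # indices each point dominates
    dom_count = [0] * n                     # how many points dominate each point
    for i in range(n):
        for j in range(n):
            if dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif dominates(points[j], points[i]):
                dom_count[i] += 1
    fronts, current = [], [i for i in range(n) if dom_count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        current = nxt
    return fronts

# toy (population-deviation, compactness-penalty) values for five plans
pts = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
fronts = non_dominated_fronts(pts)   # first front: plans 0, 1 and 2
```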

Keywords: multi-objective optimization, NSGA-II, redistricting, zone design problem

Procedia PDF Downloads 364
15439 Finite Volume Method Simulations of GaN Growth Process in MOVPE Reactor

Authors: J. Skibinski, P. Caban, T. Wejrzanowski, K. J. Kurzydlowski

Abstract:

In the present study, numerical simulations of heat and mass transfer during the gallium nitride growth process in the Metal Organic Vapor Phase Epitaxy reactor AIX-200/4RF-S are addressed. Existing knowledge about the phenomena occurring in the MOVPE process makes it possible to produce high-quality nitride-based semiconductors. However, the process parameters of MOVPE reactors can vary within certain ranges. The main goal of this study is optimization of the process and improvement of the quality of the obtained crystal. In order to investigate this subject, a series of computer simulations has been performed. Numerical simulations of heat and mass transfer in the GaN epitaxial growth process have been performed to determine the growth rate for various mass flow rates and pressures of reagents. Given that it is impossible to determine experimentally the exact distribution of heat and mass transfer inside the reactor during the process, modeling is the only way to understand the process precisely. The main heat transfer mechanisms during the MOVPE process are convection and radiation. Correlating the modeling results with experiment allows optimal process parameters to be determined for obtaining crystals of the highest quality.
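As a minimal illustration of the finite volume method named in the title, the sketch below solves steady 1-D heat diffusion with Dirichlet boundaries via the Thomas algorithm; the real reactor model is 3-D and couples convection, radiation, and species transport, so this shows only the discretisation pattern:

```python
def fvm_1d_diffusion(n_cells, length, k, t_left, t_right):
    """Steady 1-D heat diffusion by the finite volume method, Dirichlet
    boundaries, solved with the Thomas (TDMA) algorithm."""
    dx = length / n_cells
    a = k / dx                              # conductance between interior cells
    lower = [-a] * n_cells                  # coefficient of T[i-1]
    upper = [-a] * n_cells                  # coefficient of T[i+1]
    diag = [2 * a] * n_cells
    rhs = [0.0] * n_cells
    # boundary cells: wall is half a cell away, conductance 2k/dx
    diag[0] = a + 2 * k / dx;  rhs[0] = (2 * k / dx) * t_left;  lower[0] = 0.0
    diag[-1] = a + 2 * k / dx; rhs[-1] = (2 * k / dx) * t_right; upper[-1] = 0.0
    for i in range(1, n_cells):             # forward elimination
        w = lower[i] / diag[i - 1]
        diag[i] -= w * upper[i - 1]
        rhs[i] -= w * rhs[i - 1]
    T = [0.0] * n_cells                     # back substitution
    T[-1] = rhs[-1] / diag[-1]
    for i in range(n_cells - 2, -1, -1):
        T[i] = (rhs[i] - upper[i] * T[i + 1]) / diag[i]
    return T

# 1 m slab, conductivity 2 W/(m K), 400 K and 300 K walls: linear profile
T = fvm_1d_diffusion(10, 1.0, 2.0, 400.0, 300.0)
```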

Keywords: Finite Volume Method, semiconductors, epitaxial growth, metalorganic vapor phase epitaxy, gallium nitride

Procedia PDF Downloads 392
15438 Exploratory Study to Obtain a Biolubricant Base from Transesterified Oils of Animal Fats (Tallow)

Authors: Carlos Alfredo Camargo Vila, Fredy Augusto Avellaneda Vargas, Debora Alcida Nabarlatz

Abstract:

Due to the current need to implement environmentally friendly technologies, the possibility of using renewable raw materials originating from the bovine industry, such as residual oils (tallow), to produce bioproducts such as biofuels or, in this case, biolubricant bases, has been studied. It is hypothesized that, through the study and control of the operating variables involved in the reverse transesterification method, a high-performance biolubricant base can be obtained on a laboratory scale using animal fats from the bovine industry as raw material, as an alternative for material recovery and environmental benefit. To implement this process, esterification of the crude tallow oil must first be carried out, which allows the acid value to be decreased ( > 1 mg KOH/g oil), by means of acid catalysis with sulfuric acid and methanol at a 7.5:1 methanol:tallow molar ratio and 1.75% w/w catalyst at 60°C for 150 minutes. Once this conditioning is completed, biodiesel is obtained from the improved tallow, for which an experimental design for the transesterification method is implemented, evaluating the effects of the variables involved in the process, such as the methanol:improved tallow molar ratio and the catalyst percentage (KOH), on the methyl ester content (% FAME). The highest FAME content (92.5%) was obtained with a 7.5:1 methanol:improved tallow ratio and 0.75% catalyst at 60°C for 120 minutes. Although the % FAME of the biodiesel produced does not make it suitable for commercialization, it is sufficient ( > 90%) for use as a raw material in obtaining biolubricant bases. Finally, once the biodiesel is obtained, an experimental design is carried out to obtain biolubricant bases using the reverse transesterification method, which allows the study of the effects of the biodiesel:TMP (trimethylolpropane) molar ratio and the catalyst percentage on viscosity and yield as response variables.
As a result, a biolubricant base is obtained that meets the ISO VG 32 requirements for commercial lubricant bases (viscosity and viscosity index; ISO VG is the classification for industrial lubricants according to ASTM D 2422), using a 4:1 biodiesel:TMP molar ratio and 0.51% catalyst at 120°C, at a pressure of 50 mbar, for 180 minutes. It should be highlighted that the product obtained consists of two phases, one liquid and one solid; the first was the object of study, leaving the classification and possible application of the second unknown. Therefore, more in-depth studies are recommended to characterize both phases, as well as to improve the production method by optimizing the process variables involved, in order to achieve superior results.

Keywords: biolubricant base, bovine tallow, renewable resources, reverse transesterification

Procedia PDF Downloads 112
15437 OLED Encapsulation Process Using Low Melting Point Alloy and Epoxy Mixture by Instantaneous Discharge

Authors: Kyung Min Park, Cheol Hee Moon

Abstract:

In this study, we develop a sealing process using a mixture of a low melting point alloy (LMPA) and an epoxy for an atmospheric OLED sealing process, as a substitute for the thin-film process. Electrode lines were formed on the substrates, which were covered with insulating layers and sacrificial layers. A mixture of an LMPA and an epoxy was screen printed between the two electrodes. In order to generate heat for melting the mixture, the Joule heating method was used, with an instantaneous discharge process generating the Joule heat. Experimental conditions such as voltage, time, and electrode composition were varied to optimize the heating conditions. As a result, the mixture structure of this study showed great potential for a low-cost, low-temperature, atmospheric OLED sealing process as a substitute for the thin-film process.

Keywords: organic light emitting diode, encapsulation, low melting point alloy, joule heat

Procedia PDF Downloads 542
15436 Optimization of Surface Roughness in Additive Manufacturing Processes via Taguchi Methodology

Authors: Anjian Chen, Joseph C. Chen

Abstract:

This paper studies a case in which the targeted surface roughness of a fused deposition modeling (FDM) additive manufacturing process is improved. The process is designed to reduce or eliminate defects and to improve the process capability indices Cp and Cpk for an FDM additive manufacturing process. The baseline Cp is 0.274 and Cpk is 0.654. This research utilizes the Taguchi methodology to eliminate defects and improve the process. The Taguchi method is used to optimize the additive manufacturing process and the printing parameters that affect the targeted surface roughness of FDM additive manufacturing. The Taguchi L9 orthogonal array is used to organize the effects of the parameters (four controllable and one non-controllable) on the FDM additive manufacturing process. The four controllable parameters are nozzle temperature [°C], layer thickness [mm], nozzle speed [mm/s], and extruder speed [%]. The non-controllable parameter is the environmental temperature [°C]. After optimization of the parameters, a confirmation print was produced to prove that the results can reduce the number of defects and improve the process capability index Cp from 0.274 to 1.605 and Cpk from 0.654 to 1.233 for the FDM additive manufacturing process. The final results confirmed that the Taguchi methodology is sufficient to improve the surface roughness of the FDM additive manufacturing process.
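The capability indices quoted above are computed from the specification limits and the sample statistics, Cp = (USL - LSL) / 6s and Cpk = min(USL - mean, mean - LSL) / 3s; a minimal sketch with invented roughness data, not the paper's measurements:

```python
import math

def process_capability(samples, lsl, usl):
    """Cp and Cpk from the sample mean and sample standard deviation."""
    n = len(samples)
    mean = sum(samples) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    cp = (usl - lsl) / (6 * sd)                     # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sd)    # centring-aware capability
    return cp, cpk

# hypothetical surface-roughness readings against invented spec limits
cp, cpk = process_capability([9.0, 10.0, 11.0, 10.0, 10.0], lsl=7.0, usl=13.0)
```

Cpk equals Cp only when the process is centred between the limits, which is why the paper reports both before and after optimization.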

Keywords: additive manufacturing, fused deposition modeling, surface roughness, six-sigma, Taguchi method, 3D printing

Procedia PDF Downloads 382
15435 Tectonics in Sustainable Contemporary Architecture: An Approach to the Intersection between Design and Construction in the Work of Norman Foster

Authors: Mafalda Fabiene Ferreira Pantoja, Joao Da Costa Pantoja, Rui Humberto Costa De Fernandes Povoas

Abstract:

The present paper seeks to offer a theoretical and practical reflection on examples of contemporary architecture in a world context where concerns about the planet have become prominent and increasingly necessary. First, a brief introduction is given to the conceptual principles of tectonics in architecture, in order to apply these concepts to an analysis of the intersection between design and construction in contemporary examples of Norman Foster's architecture, since his work demonstrates compositional attitudes concerned with place, technology, materials, and building life. Foster's compositions usually focus on the role of technology in the architectural design process, making his works a synthesis of place, program, construction, and formal structures. The main purpose of this paper is to reflect on the tools of theoretical and practical analysis of tectonics, optimizing the resources that allow cultural anchoring and the creation of identity, and establishing relations between resources, building life cycle, and the use of appropriate materials, in order to find out how the tectonic concept can elevate the status of contemporary architecture, making it qualitative in a more sustainable context adapted to current needs.

Keywords: contemporary architecture, norman foster, tectonic, sustainable architecture

Procedia PDF Downloads 116
15434 Challenges, Chances and Possibilities during the Change Management Process of the National Defence Academy Vienna

Authors: Georg Ebner

Abstract:

The National Defence Academy, an element of the Austrian Ministry of Defence, is undergoing a transition process leading the Academy towards a new target structure that is currently being developed. In doing so, in addition to a subject-oriented approach, an employee-oriented process was also introduced. This process was initiated by the Ministry of Defence and should lead the National Defence Academy into a new constellation. During this process, the National Defence Academy worked in specially adapted World Café sessions. The change managers dealt with very different issues: they took the data feedback from the sessions and, using this feedback and the information from the guidance, prepared the next session. In this way they gathered varied information and a many-sided picture of the academy. It was very helpful to involve most of the employees of the academy in this process and to draw on their knowledge and wisdom. The process itself started with very mixed feelings and ended with broad consent. A very interesting aspect of this process was that the commander and his deputy worked together during all of the sessions and answered all questions from the employees promptly. The adapted World Café phases were necessary to deal with the staff's input and to implement these essential data in the process. In cooperation with the responsible Headquarters, the first items resulting from the World Café phases could already be fed back to the employees and implemented. The staff-oriented process is currently supported via a point of contact, through which the staff can also contribute ideas, as well as by an active information policy on the part of the Headquarters. The described change process makes innovation possible. Previously, in change processes, staff members were entrusted only with the concrete implementation plan and tied into the process only when the respective workplaces were to be re-staffed.
The procedure described here can be seen as food for thought for further change processes. The findings are that a staff-oriented process can lead an organisation into a new era of thinking and working. This process has shown that many innovative ideas can also emerge in a ministry, and it can serve as a model for change management processes in ministries and in governmental and non-governmental organisations.

Keywords: both directions approach, change management, knowledge database, transformation process, World Cafe

Procedia PDF Downloads 189
15433 Importance of New Policies of Process Management for Internet of Things Based on Forensic Investigation

Authors: Venkata Venugopal Rao Gudlur

Abstract:

The proposed policies, referred to as standard operating procedures (SOP), for Internet of Things (IoT) based forensic investigation in process management save time and offer investigators quick solutions. The forensic investigation process has been developed over many years and has supplied the required information, yet without policies governing the investigation steps. This research shows that current IoT-based forensic investigation in process management is increasingly tied to connected devices, the latest revolution, and to the policies that govern them. Future development in real-time information gathering and monitoring will evolve with smart sensor-based technologies connected directly to the IoT. This paper presents a conceptual framework for process management. Smart devices are leading the way in the automated forensic models and frameworks established by different scholars; these models and frameworks mostly offer a roadmap for performing forensic operations but leave no policies in place. Proposing such policies would bring tremendous benefits to process management and to IoT forensic investigators, and the resulting investigation process may offer more security and reduce data losses and vulnerabilities.

Keywords: internet of things, process management, forensic investigation, M2M framework

Procedia PDF Downloads 97
15432 A Resilience Process Model of Natural Gas Pipeline Systems

Authors: Zhaoming Yang, Qi Xiang, Qian He, Michael Havbro Faber, Enrico Zio, Huai Su, Jinjun Zhang

Abstract:

Resilience is one of the key factors in system safety assessment and optimization, and resilience studies of natural gas pipeline systems (NGPS), especially regarding process descriptions, are still being explored. Based on the three main stages, namely the function loss process, the recovery process, and the waiting process, the paper builds functions and models that reflect the practical characteristics of NGPS, focusing on the characteristics of deterministic interruptions. The resilience of NGPS also considers a threshold on the system function or on users' satisfaction. The outcomes, which quantify the resilience of NGPS from different evaluation views, can be combined with max-flow and shortest-path methods. They support the optimization of extra gas supplies, gas routes, and pipeline maintenance strategies, enable quick analysis of disturbance effects, and improve the accuracy of NGPS resilience evaluation.
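The abstract does not give the exact formulas behind its quantification. A common way to quantify resilience over a disruption horizon, consistent with the loss/waiting/recovery stages it names, is the ratio of the function actually delivered to the function a fully operating system would have delivered; the sketch below (hypothetical variable names and throughput figures) illustrates this for a discretely sampled performance curve.

```python
def resilience_index(performance, target):
    """Discrete-time resilience index: delivered function divided by
    the target function over the same horizon (1.0 means no loss)."""
    if not performance:
        raise ValueError("need at least one performance sample")
    return sum(performance) / (target * len(performance))

# Hypothetical hourly throughput samples: full capacity, a function
# loss to 40%, a waiting period, then recovery back to 100%.
profile = [100, 40, 40, 70, 100]
index = resilience_index(profile, target=100)  # 350/500 = 0.7
```

A threshold, as in the abstract, could then be applied to `index` to decide whether users' satisfaction is still met.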

Keywords: natural gas pipeline system, resilience, process modeling, deterministic disturbance

Procedia PDF Downloads 119
15431 State of the Art in Software Requirement Negotiation Process Models

Authors: Shamsu Abdullahi, Nazir Yusuf, Hazrina Sofian, Abubakar Zakari, Amina Nura, Salisu Suleiman

Abstract:

Requirements negotiation process models help resolve the conflicting requirements of heterogeneous stakeholders in the software development industry, so as to achieve a shared vision of the software projects to be developed. Negotiating stakeholder agreements is a serious and difficult task in the software development process, and the research community has proposed many requirements negotiation process models that effectively negotiate such agreements. Other issues in the requirements negotiation research domain include stakeholder communication, decision-making, lack of negotiation interoperability, and managing requirement changes and analysis. This study highlights the current state of the art of existing software requirements negotiation process models and describes their issues and limitations.

Keywords: requirements, negotiation, stakeholders, agreements

Procedia PDF Downloads 192
15430 Issues on Optimizing the Structural Parameters of the Induction Converter

Authors: Marinka K. Baghdasaryan, Siranush M. Muradyan, Avgen A. Gasparyan

Abstract:

Analytical expressions for the current and angular errors, as well as the frequency characteristics, of an induction converter are obtained, describing their relation to the converter's structural parameters and to the core and winding characteristics. Based on an estimation of the dependences obtained, a mathematical problem of parametric optimization is formulated which can successfully be used for investigating and diagnosing an induction converter.

Keywords: induction converters, magnetic circuit material, current and angular errors, frequency response, mathematical formulation, structural parameters

Procedia PDF Downloads 341
15429 Fabrication of Silicon Solar Cells Using All Sputtering Process

Authors: Ching-Hua Li, Sheng-Hui Chen

Abstract:

Sputtering is a popular thin-film deposition technique with many advantages. When a hydrogenated silicon thin film is fabricated by sputtering for solar cell applications, ion bombardment during sputtering generates microstructures (voids and columnar structures) that form silicon dihydride bonds as defects. The properties of heterojunction silicon solar cells were studied using boron grains and silicon-boron targets. Finally, a solar cell efficiency of 11.7% was achieved using an all-sputtering process.

Keywords: solar cell, sputtering process, PVD, alloy target

Procedia PDF Downloads 576
15428 Compressed Natural Gas (CNG) Injector Research for Dual Fuel Engine

Authors: Adam Majczak, Grzegorz Barański, Marcin Szlachetka

Abstract:

Environmental considerations necessitate the search for new energy sources. One of the available solutions is the partial replacement of diesel fuel by compressed natural gas (CNG) in compression ignition engines. Engines of this type are used mainly in vans and trucks, and they are also gaining popularity in the passenger car market; in Europe, their share of that market reaches 50%. Diesel engines are likewise used in industry, in vehicles such as ships and locomotives. Diesel engines emit more nitrogen oxides than spark ignition engines, which can currently be limited by optimizing the combustion process and by using additional systems such as exhaust gas recirculation or AdBlue technology. Diesel combustion also emits particulate matter (PM), which is harmful to human health; its emission is limited by a particulate filter. One method of reducing toxic emissions is the use of gaseous fuels such as liquefied petroleum gas (LPG, a propane-butane mixture) or compressed natural gas (CNG). In addition to the environmental aspects, there are also economic reasons for using gaseous fuels in diesel engines. Total or partial replacement of the diesel fuel is possible. Depending on the technology used and the percentage of diesel fuel replaced, it is possible to reduce nitrogen oxides in the exhaust gas by up to 30%, particulate matter (PM) by 95%, and carbon monoxide by 20% relative to the original diesel fuel. The research object is a prototype gas injector designed for the direct injection of compressed natural gas (CNG) in compression ignition engines. The construction of the injector allows it to be positioned in the glow plug socket, so that the gas is injected directly into the combustion chamber.
The cycle analysis of the four-cylinder Andoria ADCR engine, with a capacity of 2.6 dm³, at different crankshaft rotational speeds made it possible to determine the time available for fuel injection and, from it, the mass flow rate the injector requires to replace as much of the original fuel as possible with gaseous fuel. To ensure a high flow rate inside the injector, a supply pressure of 1 MPa was applied. Such a high gas supply pressure requires large valve opening forces, so an injector with a hydraulic control system, using a pressurized liquid for the opening process, was designed. Based on air pressure measurements in the flow line downstream of the injector, the opening and closing of the valve were analyzed, and measurements of the injector's outflow mass were carried out. The results showed that the designed injector meets the requirements for supplying the ADCR engine with CNG fuel.

Keywords: CNG, diesel engine, gas flow, gas injector

Procedia PDF Downloads 483
15427 Destination Port Detection for Vessels: An Analytic Tool for Optimizing Port Authorities' Resources

Authors: Lubna Eljabu, Mohammad Etemad, Stan Matwin

Abstract:

Port authorities in congested ports face many challenges in allocating their resources to provide a safe and secure loading/unloading procedure for cargo vessels. Selecting a destination port is the decision of a vessel's master, based on many factors such as weather, wavelength, and changes of priorities. Access to a tool that leverages AIS messages to monitor vessels' movements and accurately predict their next destination port promotes an effective resource allocation process for port authorities. In this research, we propose a method, namely Reference Route of Trajectory (RRoT), to assist port authorities in predicting inflow and outflow traffic in their local environment by monitoring Automatic Identification System (AIS) messages. Our RRoT method creates a reference route based on historical AIS messages and uses well-established trajectory similarity measures to identify a vessel's destination from its recent movement. We evaluated five similarity measures: Discrete Fréchet Distance (DFD), Dynamic Time Warping (DTW), Partial Curve Mapping (PCM), Area between two curves (Area), and Curve Length (CL). Our experiments show that the method identifies the destination port with an accuracy of 98.97% and an F-measure of 99.08% using the Dynamic Time Warping (DTW) similarity measure.
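The abstract reports Dynamic Time Warping as its best-performing similarity measure. As background, a minimal textbook DTW over 1-D sequences (not the authors' implementation, which operates on AIS trajectories, where each point would be a latitude/longitude pair with a geodesic local cost) can be sketched as:

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming DTW distance
    between two 1-D sequences, using absolute difference as local cost."""
    n, m = len(a), len(b)
    inf = float("inf")
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # stretch a
                                  dp[i][j - 1],      # stretch b
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]

# Two tracks with the same shape but different sampling rates still
# score zero, which is why elastic measures like DTW suit vessel tracks.
d = dtw_distance([0, 1, 2, 3], [0, 1, 1, 2, 3])  # 0.0
```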

Keywords: spatial temporal data mining, trajectory mining, trajectory similarity, resource optimization

Procedia PDF Downloads 115
15426 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads

Authors: Gaurav Kumar Sinha

Abstract:

In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments.
It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available on the market.

Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies

Procedia PDF Downloads 63
15425 Understanding the Importance of Participation in the City Planning Process and Its Influencing Factors

Authors: Louis Nwachi

Abstract:

Urban planning systems in most countries still rely on expert-driven, top-down technocratic plan-making processes rather than a public, people-led process. This paper set out to evaluate the need for public participation in the plan-making process and to highlight the factors that affect such participation. It adopted a qualitative approach based on document review and interviews drawn from real-world phenomena, using a case study strategy with the Metropolitan Area of Abuja, the capital of Nigeria, as the study sample. The research finds that participation is an important tool in the plan-making process: public engagement contributes to the identification of key urban issues that are unique to specific local areas, thereby helping to establish priorities and, in turn, to mobilize resources to meet the identified needs. It also finds that the development of a participation model by city authorities encourages public engagement and helps build trust between those in authority and the different key stakeholder groups involved in the plan-making process.

Keywords: plan-making, participation, urban planning, city

Procedia PDF Downloads 97
15424 Quality Based Approach for Efficient Biologics Manufacturing

Authors: Takashi Kaminagayoshi, Shigeyuki Haruyama

Abstract:

To improve the manufacturing efficiency of biologics, such as antibody drugs, a quality engineering framework was designed. Within this framework, the critical steps and parameters in the manufacturing process were studied. Identifying these critical steps and parameters allows a deeper understanding of manufacturing capabilities and enables the process development department to be offered process control standards based on actual manufacturing capabilities as part of a PDCA (plan-do-check-act) cycle. This cycle can be applied to each manufacturing process so that it can be standardized, reducing the time needed to establish each new process.

Keywords: antibody drugs, biologics, manufacturing efficiency, PDCA cycle, quality engineering

Procedia PDF Downloads 340
15423 Flowsheet Development, Simulation and Optimization of Carbon-Di-Oxide Removal System at Natural Gas Reserves by Aspen–Hysys Process Simulator

Authors: Mohammad Ruhul Amin, Nusrat Jahan

Abstract:

Natural gas is a cleaner fuel than the alternatives, but it needs treatment before it can be used, so purification is an integral part of any process where natural gas serves as a raw material or fuel. Several impurities must be removed from natural gas before use, and CO2 is one of the major contaminants. In this project, CO2 was removed by the amine process using an MEA (monoethanolamine) solution. The whole amine process for CO2 removal was built and simulated in Aspen HYSYS, and the simulation gave very satisfactory results for CO2 removal with the MEA solution: the amine absorption process reduces the CO2 content of the natural gas by 58%. The HYSYS optimizer allowed a fully optimized plant to be obtained; after optimization, the profit of the existing plant increased by 2.34%. Simulation and optimization with the Aspen HYSYS simulator provide a wealth of information that will support further research in the future.

Keywords: Aspen–Hysys, CO2 removal, flowsheet development, MEA solution, natural gas optimization

Procedia PDF Downloads 493
15422 Experimental Investigation on Freeze-Concentration Process Desalting for Highly Saline Brines

Authors: H. Al-Jabli

Abstract:

The aim of this paper is to confirm the performance estimation of a treatment system that uses the freeze-melting process for disposing of high-salinity brines. A laboratory bench-scale freezing test unit was designed, constructed, and tested at the Doha Research Plant (DRP) in Kuwait. The principal unit operations considered in the laboratory study are ice crystallization, separation, washing, and melting. The applied process is characterized as secondary-refrigerant indirect freezing, which utilizes the normal freezing concept. High-salinity brine, with an average TDS of 250,000 ppm, was used as the feed water, and brine from Kuwait desalination plants was used in the experimental study to measure the performance of the proposed treatment system. The experimental analysis shows that the two-stage freeze-melting process is capable of dropping the TDS of the feed water from 249,482 ppm to 56,880 ppm, with an overall recovery of 31.11% and a salt passage and salt rejection of 19.05% and 80.95%, respectively. The freeze-melting process is therefore encouraging for the proposed application, as the results confirm its capability of removing a major portion of the dissolved salts of high-salinity brine with a reasonable recovery. The process may be competitive with other brine disposal processes.
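The rejection and passage figures above follow the usual concentration-based definitions. As a quick illustration only (not the authors' exact mass balance, whose 19.05%/80.95% figures also fold in the 31.11% recovery), the simple concentration-ratio forms can be computed as:

```python
def salt_passage(feed_tds_ppm, product_tds_ppm):
    """Fraction of feed salinity surviving into the product stream."""
    return product_tds_ppm / feed_tds_ppm

def salt_rejection(feed_tds_ppm, product_tds_ppm):
    """Rejection is the complement of passage: R = 1 - Cp/Cf."""
    return 1.0 - salt_passage(feed_tds_ppm, product_tds_ppm)

# Concentration-ratio check with the abstract's TDS values; this gives
# roughly 77%, slightly below the reported 80.95%, because the paper's
# figure comes from a mass balance that accounts for recovery as well.
r = salt_rejection(249_482, 56_880)
```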

Keywords: high saline brine, freeze-melting process, ice crystallization, brine disposal process

Procedia PDF Downloads 263
15421 Evaluation of Reliability, Availability and Maintainability for Automotive Manufacturing Process

Authors: Hamzeh Soltanali, Abbas Rohani, A. H. S. Garmabaki, Mohammad Hossein Abbaspour-Fard, Adithya Thaduri

Abstract:

Given continuous innovation and the high complexity of technological systems, the automotive manufacturing industry is under pressure to implement adequate management strategies regarding availability and productivity. In this context, evaluating a system's performance with reliability, availability, and maintainability (RAM) methodologies supports resilient operation, identifies the bottlenecks of the manufacturing process, and optimizes maintenance actions. In this paper, RAM parameters are evaluated to improve the operational performance of a fluid filling process. To evaluate the RAM factors through the behaviour of the states defined for this process, a systematic decision framework was developed. The RAM analysis revealed that improving the reliability and maintainability of the main bottlenecks at each filling workstation should be treated as a priority. The results could be useful for improving the operational performance and sustainability of the production process.
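The abstract does not spell out its RAM formulas; the standard steady-state definitions it presumably builds on, and the bottleneck ranking it describes, can be sketched as follows (workstation names and MTBF/MTTR figures are hypothetical):

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability A = MTBF / (MTBF + MTTR): reliability
    enters through MTBF, maintainability through MTTR."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def bottleneck(stations):
    """Return the workstation with the lowest availability, i.e. the
    first candidate for reliability/maintainability improvement."""
    return min(stations, key=lambda s: availability(s["mtbf"], s["mttr"]))

# Hypothetical filling-line workstations (all times in hours)
line = [
    {"name": "filler",   "mtbf": 180.0, "mttr": 4.0},
    {"name": "capper",   "mtbf": 120.0, "mttr": 6.0},
    {"name": "labeller", "mtbf": 300.0, "mttr": 2.0},
]
worst = bottleneck(line)  # the capper, at roughly 95.2% availability
```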

Keywords: automotive, performance, reliability, RAM, fluid filling process

Procedia PDF Downloads 350