Search results for: Process models
19425 Process Mining as an Ecosystem Platform to Mitigate a Deficiency of Processes Modelling
Authors: Yusra Abdulsalam Alqamati, Ahmed Alkilany
Abstract:
The teaching staff is a distinct group whose impact on the educational process is significant and which plays an important role in enhancing the quality of academic education. To improve the management effectiveness of the academy, the Teaching Staff Management System (TSMS) proposes that all teacher processes be digitized. Although the BPMN approach can accurately describe processes, it lacks a clear picture of the process flow map, something that the process mining approach provides by extracting information from event logs for discovery, monitoring, and model enhancement. Therefore, these two methodologies were combined to create the most accurate representation of system operations: data records are extracted, processes are mined and recreated in the form of a Petri net, and then generated as a BPMN model for a more in-depth view of process flow. Additionally, the TSMS processes will be orchestrated to handle all requests within a guaranteed short time thanks to the integration of the Google Cloud Platform (GCP) and the BPM engine, allowing business owners to take part throughout the entire TSMS project development lifecycle.
Keywords: process mining, BPM, business process model and notation, Petri net, teaching staff, Google Cloud Platform
Procedia PDF Downloads 141
19424 42CrMo4 Steel Flow Behavior Characterization for High Temperature Closed Dies Hot Forging in Automotive Components Applications
Authors: O. Bilbao, I. Loizaga, F. A. Girot, A. Torregaray
Abstract:
The current energy situation and the high competitiveness in industrial sectors such as the automotive industry have made the development of new manufacturing processes with lower energy and raw material consumption a real necessity. As a consequence, new forming processes related to high-temperature hot forging in closed dies have emerged in recent years as solutions to expand the possibilities of hot forging and iron casting in the automotive industry. These technologies are mid-way between hot forging and semi-solid metal processes, working at temperatures higher than hot forging but below the solidus temperature or the semi-solid range, where no liquid phase is expected. This represents an advantage compared with semi-solid forming processes such as thixoforging, since such high temperatures need not be reached for high-melting-point alloys such as steels, reducing the manufacturing costs and the difficulties associated with their semi-solid processing. Compared with hot forging, these technologies allow the production of parts with as-forged properties and more complex, near-net shapes (thinner sidewalls), enhancing the possibility of designing lightweight components. From the process viewpoint, the forging forces are significantly decreased, and significant reductions in raw material, energy consumption, and forging steps have been demonstrated. Despite the advantages mentioned, from the material behavior point of view, the expansion of these technologies has shown the necessity of developing new material flow behavior models in the process working temperature range to make the simulation or the prediction of these new forming processes feasible. Moreover, knowledge of the material flow behavior in the working temperature range also allows the design of the new closed-die concepts required.
In this work, the flow behavior in the mentioned temperature range of 42CrMo4 steel, widely used in commercial automotive components, has been studied. To that end, hot compression tests have been carried out in a thermomechanical tester over a temperature range that covers the material behavior from hot forging up to the NDT (Nil Ductility Temperature) temperature (1250 ºC, 1275 ºC, 1300 ºC, 1325 ºC, 1350 ºC, and 1375 ºC). As for the strain rates, three different orders of magnitude have been considered (0.1 s⁻¹, 1 s⁻¹, and 10 s⁻¹). The results obtained from the hot compression tests have then been processed in order to adapt or rewrite the Spittel model, widely used in commercial automotive software such as FORGE®, which restricts the existing models to temperatures up to 1250 ºC. Finally, the new flow behavior model has been validated by the process simulation of a commercial automotive component and the comparison of the simulation results with experimental tests already performed in a laboratory cell of the new technology. As a conclusion of the study, a new flow behavior model for 42CrMo4 steel in the new working temperature range and a new process simulation of its application in commercial automotive components have been achieved and will be shown.
Keywords: 42CrMo4 high temperature flow behavior, high temperature hot forging in closed dies, simulation of automotive commercial components, Spittel flow behavior model
Procedia PDF Downloads 129
19423 Kinetic Modeling of Transesterification of Triacetin Using Synthesized Ion Exchange Resin (SIERs)
Authors: Hafizuddin W. Yussof, Syamsutajri S. Bahri, Adam P. Harvey
Abstract:
Strong anion exchange resins with QN+OH- groups have the potential to be developed and employed as heterogeneous catalysts for transesterification, as they are chemically stable against leaching of the functional group. Nine different SIERs (SIER1-9) with QN+OH- were prepared by suspension polymerization of vinylbenzyl chloride-divinylbenzene (VBC-DVB) copolymers in the presence of n-heptane (a pore-forming agent). The amine group was successfully grafted onto the polymeric resin beads through functionalization with trimethylamine. These SIERs were then used as catalysts for the transesterification of triacetin with methanol. Sets of differential equations representing the Langmuir-Hinshelwood-Hougen-Watson (LHHW) and Eley-Rideal (ER) models of the transesterification reaction were developed, and both kinetic models were fitted to the experimental data. Overall, the synthesized ion exchange resin-catalyzed reaction was better described by the Eley-Rideal model than by the LHHW model, with sums of squared errors (SSE) of 0.742 and 0.996, respectively.
Keywords: anion exchange resin, Eley-Rideal, Langmuir-Hinshelwood-Hougen-Watson, transesterification
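The model discrimination step in this abstract can be sketched in a few lines: fit each candidate rate law, then prefer the one with the lower sum of squared errors. The rate expressions below are illustrative textbook ER/LHHW forms with hypothetical parameters and synthetic data, not the authors' fitted models.

```python
# Illustrative comparison of two candidate kinetic models by sum of squared
# errors (SSE), the criterion used in the abstract. Rate constants and data
# are synthetic placeholders, not the study's fitted values.

def rate_eley_rideal(cA, cB, k=1.0, K=0.5):
    # ER form: adsorbed A reacts with B arriving from solution
    return k * K * cA * cB / (1.0 + K * cA)

def rate_lhhw(cA, cB, k=1.0, KA=0.5, KB=0.3):
    # LHHW form: both species adsorbed on the catalyst surface
    return k * KA * KB * cA * cB / (1.0 + KA * cA + KB * cB) ** 2

def sse(model, data):
    return sum((model(cA, cB) - r) ** 2 for cA, cB, r in data)

# Synthetic "observations" generated from the ER form, so ER should win here.
data = [(cA, cB, rate_eley_rideal(cA, cB))
        for cA in (0.1, 0.5, 1.0, 2.0) for cB in (0.1, 0.5, 1.0)]

sse_er = sse(rate_eley_rideal, data)
sse_lhhw = sse(rate_lhhw, data)
best = "ER" if sse_er < sse_lhhw else "LHHW"
```

In practice the parameters of each model would themselves be fitted (e.g. by nonlinear least squares) before the SSE comparison; the sketch only shows the final discrimination step.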
Procedia PDF Downloads 361
19422 Automating 2D CAD to 3D Model Generation Process: Wall Pop-Ups
Authors: Mohit Gupta, Chialing Wei, Thomas Czerniawski
Abstract:
In this paper, we have built a neural network that can detect walls on 2D sheets and subsequently create a 3D model in Revit using Dynamo. The training set includes 3500 labeled images, and the detection algorithm used is YOLO. Typically, engineers and designers make concentrated efforts to convert 2D CAD drawings to 3D models, which costs a considerable amount of time and human effort. This paper contributes to automating the task of 3D wall modeling by: 1. detecting walls in 2D CAD and generating 3D pop-ups in Revit; 2. saving designers the modeling time spent drafting elements like walls from 2D CAD into a 3D representation. The object detection algorithm YOLO is used for wall detection and localization. The neural network is trained on 3500 labeled images of size 256x256x3. Then, Dynamo is interfaced with the output of the neural network to pop up 3D walls in Revit. The research uses modern technological tools like deep learning and artificial intelligence to automate the process of generating 3D walls without needing humans to model them manually, thus contributing to saving time, human effort, and money.
Keywords: neural networks, YOLO, 2D to 3D transformation, CAD object detection
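A YOLO-style detector emits many overlapping candidate boxes per wall, which are pruned before being handed to Dynamo. A minimal sketch of that standard post-processing step, non-maximum suppression, is shown below; the box coordinates and the 0.5 IoU threshold are illustrative assumptions, not values from the paper.

```python
# Minimal non-maximum suppression (NMS), the post-processing a YOLO-style
# wall detector typically applies to its raw output. Boxes are tuples of
# (x1, y1, x2, y2, score); the 0.5 IoU threshold is an assumption.

def iou(a, b):
    # intersection-over-union of two axis-aligned boxes
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / float(area_a + area_b - inter)

def nms(boxes, thresh=0.5):
    # keep highest-scoring boxes, dropping any box overlapping a kept one
    kept = []
    for box in sorted(boxes, key=lambda b: b[4], reverse=True):
        if all(iou(box, k) < thresh for k in kept):
            kept.append(box)
    return kept

detections = [(10, 10, 110, 30, 0.9),    # a wall segment
              (12, 11, 108, 29, 0.7),    # duplicate hit on the same wall
              (200, 50, 220, 150, 0.8)]  # a second, distinct wall
walls = nms(detections)
```

Each surviving box would then be mapped to a wall placement call in Dynamo/Revit.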
Procedia PDF Downloads 144
19421 Simulation of the Large Hadrons Collisions Using Monte Carlo Tools
Authors: E. Al Daoud
Abstract:
In many cases, theoretical treatments are available for models for which there is no perfect physical realization. In this situation, the only possible test of an approximate theoretical solution is to compare it with data generated from a computer simulation. In this paper, Monte Carlo tools are used to study and compare elementary particle models. All the experiments are implemented using 10000 events, and the simulated energy is 13 TeV. The mean and the curves of several variables are calculated for each model using MadAnalysis 5. Anomalies in the results can be seen in the muon masses of the minimal supersymmetric standard model and the two-Higgs-doublet model.
Keywords: Feynman rules, hadrons, Lagrangian, Monte Carlo, simulation
Procedia PDF Downloads 317
19420 The Effect of Oxidation Stability Improvement in Calophyllum Inophyllum Palm Oil Methyl Ester Production
Authors: Natalina, Hwai Chyuan Ong, W. T. Chong
Abstract:
The oxidation stability of biodiesel is very important in fuel handling, especially for remote locations of biodiesel application. The variety of feedstocks and biodiesel production processes results in wide variation in biodiesel oxidation stability. The current study investigates the impact of the fatty acid composition, as determined by nature and by the production process, of calophyllum inophyllum palm oil methyl ester, and its correlation with the improvement of biodiesel oxidation stability. Firstly, biodiesel was produced from the crude oil of palm oil, calophyllum inophyllum, and a mixture of calophyllum inophyllum and palm oil. The production process of calophyllum inophyllum palm oil methyl ester (CIPOME) was carried out both with and without a washing step. Secondly, the oxidation stability was measured for palm oil methyl ester (POME), calophyllum inophyllum methyl ester (CIME), CIPOME with washing, and CIPOME without washing. Then, in order to find the differences in fatty acid compositions, all of the biodiesels were analyzed by gas chromatography. It was found that mixing calophyllum inophyllum into palm oil increased the oxidation stability. The washing process influenced the CIPOME fatty acid composition, and omitting the washing step during production increased the oxidation stability of CIPOME significantly (from 38 h to 114 h).
Keywords: biodiesel, oxidation stability, calophyllum inophyllum, water content
Procedia PDF Downloads 270
19419 Using Machine Learning to Predict Answers to Big-Five Personality Questions
Authors: Aadityaa Singla
Abstract:
The big five personality traits are as follows: openness, conscientiousness, extraversion, agreeableness, and neuroticism. In order to get an insight into their personality, many people turn to these categories, each of which has different meanings and characteristics. This information is important not only to individuals but also to career professionals and psychologists, who can use it for candidate assessment or job recruitment. The links between AI and psychology have been well studied in cognitive science, but this is still a rather novel development. It is possible for various AI classification models to accurately predict a personality question via ten input questions. This contrasts with the hundred questions that people normally have to answer to gain a complete picture of their five personality traits. To approach this problem, various AI classification models were used on a dataset to predict what a user would answer, and each model's prediction was compared to the actual response. Normally, there are five answer choices (a 20% chance of a correct guess), and the models exceed that value to different degrees, proving their significance. An MLP classifier, a decision tree, a linear model, and K-nearest neighbors obtained test accuracies of 86.643%, 54.625%, 47.875%, and 52.125%, respectively. These approaches show that there is potential for more nuanced predictions to be made regarding personality in the future.
Keywords: machine learning, personality, big five personality traits, cognitive science
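The core experiment above, predicting one five-choice answer from ten input answers and checking that accuracy beats the 20% chance baseline, can be illustrated with a toy nearest-neighbour classifier. Everything below (the data generator, the 1-NN rule) is synthetic and illustrative; the study itself used an MLP, a decision tree, a linear model, and KNN on real questionnaire data.

```python
# Toy illustration of the abstract's setup: predict one personality answer
# (5 choices) from ten input answers, and compare accuracy to the 20% chance
# baseline. Data are synthetic; the real study used survey responses.
import random

random.seed(0)

def make_respondent():
    trait = random.randint(0, 4)  # hidden answer tendency on a 0-4 scale
    # ten noisy input answers clustered around the hidden tendency
    inputs = [min(4, max(0, trait + random.choice((-1, 0, 0, 1))))
              for _ in range(10)]
    return inputs, trait

train = [make_respondent() for _ in range(200)]
holdout = [make_respondent() for _ in range(100)]

def predict_1nn(x):
    # nearest training respondent under squared distance over the ten answers
    nearest = min(train, key=lambda t: sum((a - b) ** 2
                                           for a, b in zip(t[0], x)))
    return nearest[1]

accuracy = sum(predict_1nn(x) == y for x, y in holdout) / len(holdout)
chance = 1 / 5
```

With informative inputs, even this simple classifier clears the chance baseline, which is the significance argument the abstract makes for its models.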
Procedia PDF Downloads 145
19418 Visualization and Performance Measure to Determine Number of Topics in Twitter Data Clustering Using Hybrid Topic Modeling
Authors: Moulana Mohammed
Abstract:
Topic models have been widely used in building clusters of documents for more than a decade, yet problems remain in choosing the optimal number of topics. The main problem is the lack of a stable metric of the quality of the topics obtained during the construction of topic models. From previous works, the authors observed that most of the models used in determining the number of topics are non-parametric, with topic quality determined using perplexity and coherence measures, and concluded that these are not applicable to solving this problem. In this paper, we used a parametric method, an extension of the traditional topic model with visual access tendency for visualization of the number of topics (clusters), to complement clustering and to choose the optimal number of topics based on the results of cluster validity indices. The developed hybrid topic models are demonstrated on different Twitter datasets covering various topics, both in obtaining the optimal number of topics and in measuring the quality of the clusters. The experimental results showed that the Visual Non-negative Matrix Factorization (VNMF) topic model performs well in determining the optimal number of topics with interactive visualization and in measuring cluster quality with validity indices.
Keywords: interactive visualization, visual non-negative matrix factorization model, optimal number of topics, cluster validity indices, Twitter data clustering
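The NMF core underlying the VNMF model factors a document-term matrix V into topic matrices W and H with V ≈ WH. A minimal sketch using the classic Lee-Seung multiplicative updates is given below; the visual access tendency extension and the validity indices of the paper are not reproduced, and the tiny corpus is synthetic.

```python
# Generic NMF by Lee-Seung multiplicative updates: a nonnegative
# document-term matrix V is factored as V ~ W @ H, with k topics.
# This is only the NMF core; the paper's VNMF adds visualization on top.
import numpy as np

rng = np.random.default_rng(0)

def nmf(V, k, iters=200, eps=1e-9):
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    for _ in range(iters):
        # multiplicative updates preserve nonnegativity
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Tiny synthetic corpus: 6 "documents" mixing 2 latent topics over 4 terms.
topics = np.array([[1.0, 1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0, 1.0]])
mix = rng.random((6, 2))
V = mix @ topics

W, H = nmf(V, k=2)
error = np.linalg.norm(V - W @ H)
```

Choosing the number of topics then amounts to running this for several k and scoring each factorization, which is where the paper's cluster validity indices come in.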
Procedia PDF Downloads 134
19417 Timely Detection and Identification of Abnormalities for Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
The detection and identification of abnormalities in multivariate manufacturing processes are quite important for maintaining good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality, so they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The method is effective in representing fault patterns of process data and is robust to measurement noise, so reliable outcomes can be obtained. To evaluate its performance, a simulation process was utilized, and the effect of adopting linear and nonlinear methods for detection and identification was tested with different simulation data sets. The use of a nonlinear technique produced more satisfactory and more robust results for the simulation data sets. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line, real-time basis.
Keywords: detection, monitoring, identification, measurement data, multivariate techniques
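The abstract does not name its statistic, so as an illustration only, the sketch below uses Hotelling's T², a common linear baseline for multivariate process monitoring: learn the in-control mean and covariance, set an empirical control limit, and flag samples whose T² exceeds it.

```python
# Hotelling's T2, a common linear multivariate detection statistic. This is
# an illustrative baseline of the kind such monitoring frameworks build on,
# not the (unnamed) method of the paper. Data are simulated.
import numpy as np

rng = np.random.default_rng(1)

# In-control training data: 200 samples of 3 correlated process variables.
normal = rng.normal(size=(200, 3)) @ np.array([[1.0, 0.3, 0.0],
                                               [0.0, 1.0, 0.2],
                                               [0.0, 0.0, 1.0]])
mean = normal.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal, rowvar=False))

def t2(x):
    # Mahalanobis-type distance of a sample from the in-control centre
    d = x - mean
    return float(d @ cov_inv @ d)

# Empirical 99% control limit from the training data.
limit = np.quantile([t2(x) for x in normal], 0.99)

in_control = t2(mean)                          # sample at the centre
fault = t2(mean + np.array([5.0, 5.0, 5.0]))   # strongly shifted sample
```

A nonlinear variant (as the paper favors) would replace the linear covariance model with, e.g., a kernelized feature space, but the flag-when-above-limit logic stays the same.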
Procedia PDF Downloads 236
19416 Efficiency-Based Model for Solar Urban Planning
Authors: M. F. Amado, A. Amado, F. Poggi, J. Correia de Freitas
Abstract:
Today it is widely understood that global energy consumption patterns are directly related to the ongoing urban expansion and development process. This expansion is based on the natural growth of human activities and has left most urban areas totally dependent on external energy inputs derived from fossil fuels. This status quo of production, transportation, storage, and consumption of energy has become inefficient and is set to become even more so when the continuous increases in energy demand are factored in. The territorial management of land use and related activities is a central component in the search for more efficient models of energy use, models that can meet current and future regional, national, and European goals. In this paper, a methodology is developed and discussed with the aim of improving energy efficiency at the municipal level. The methodology is based on monitoring energy consumption and its use patterns resulting from the natural dynamism of human activities in the territory, and it can be utilized to assess sustainability at the local scale. A set of parameters and indicators is defined with the objective of constructing a systemic model based on the optimization, adaptation, and innovation of the current energy framework and the associated energy consumption patterns. The model will enable local governments to strike the necessary balance between human activities, economic development, and the local and global environment while safeguarding fairness in the energy sector.
Keywords: solar urban planning, solar smart city, urban development, energy efficiency
Procedia PDF Downloads 328
19415 Component-Based Approach in Assessing Sewer Manholes
Authors: Khalid Kaddoura, Tarek Zayed
Abstract:
Sewer networks are constructed to protect communities and the environment from any contact with sewer media. Pipelines, whether laterals or sewer mains, and manholes form a huge underground infrastructure in every urban city. Owing to the importance of sewer networks, the infrastructure asset management field has advanced extensively in condition assessment and rehabilitation decision models. However, most of the focus has been devoted to pipelines, with little attention given to manhole condition assessment. Recent studies have started to emerge in this area to preserve manholes from malfunction. Therefore, the main objective of this study is to propose a condition assessment model for sewer manholes. The model divides the manhole into several components and determines the relative importance weight of each component using the Analytic Network Process (ANP) decision-making method. The condition of the manhole is then computed by aggregating the condition of each component with its corresponding weight. Accordingly, the proposed assessment model provides decision-makers with a final index suggesting the overall condition of the manhole, together with a backward analysis to check the condition of each component. Consequently, better decisions can be made regarding maintenance, rehabilitation, and replacement actions.
Keywords: Analytic Network Process (ANP), condition assessment, decision-making, manholes
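The aggregation step described above is a simple weighted sum: once ANP has produced a relative importance weight per component, the manhole index is each component's condition scaled by its weight. The component names, weights, and scores below are illustrative placeholders, not the ANP results of the paper.

```python
# Weighted aggregation step of a component-based manhole condition model:
# overall index = sum over components of (weight x component condition).
# Component names, weights, and scores are hypothetical examples; the paper
# derives its weights with ANP.

weights = {"cover": 0.15, "frame": 0.10, "chimney": 0.20,
           "wall": 0.30, "bench": 0.10, "channel": 0.15}

def manhole_condition(component_scores):
    # component_scores: condition per component on a 0 (failed) to
    # 10 (excellent) scale
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights normalised
    return sum(weights[c] * component_scores[c] for c in weights)

scores = {"cover": 8, "frame": 9, "chimney": 6, "wall": 4,
          "bench": 7, "channel": 5}
index = manhole_condition(scores)
```

The "backward analysis" the abstract mentions is then just inspecting the per-component scores behind a poor overall index (here the wall drags the index down).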
Procedia PDF Downloads 354
19414 Modeling Palm Oil Quality During the Ripening Process of Fresh Fruits
Authors: Afshin Keshvadi, Johari Endan, Haniff Harun, Desa Ahmad, Farah Saleena
Abstract:
Experiments were conducted to develop a model for analyzing the ripening process of oil palm fresh fruits in relation to the yield and quality of the palm oil produced. This research was carried out on 8-year-old Tenera (Dura × Pisifera) palms planted in 2003 at the Malaysian Palm Oil Board Research Station. Fresh fruit bunches were harvested from designated palms from January to May 2010. The bunches were divided into three regions (top, middle, and bottom), and fruits from the outer and inner layers were randomly sampled for analysis at 8, 12, 16, and 20 weeks after anthesis to establish relationships between maturity and oil development in the mesocarp and kernel. Computations on data related to ripening time, oil content, and oil quality were performed using several computer software programs (MSTAT-C, SAS, and Microsoft Excel). Nine nonlinear mathematical models were fitted to the collected data using MATLAB. The results showed that mean mesocarp oil content increased from 1.24% at 8 weeks after anthesis to 29.6% at 20 weeks after anthesis. Fruits from the top part of the bunch had the highest mesocarp oil content, at 10.09%. The lowest kernel oil content, 0.03%, was recorded at 12 weeks after anthesis. Palmitic acid and oleic acid comprised more than 73% of total mesocarp fatty acids at 8 weeks after anthesis, increasing to more than 80% at fruit maturity at 20 weeks. The logistic model, with the highest R² and the lowest root mean square error, was found to be the best-fit model.
Keywords: oil palm, oil yield, ripening process, anthesis, fatty acids, modeling
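The model selection criterion above (highest R², lowest RMSE) can be sketched by scoring a logistic ripening curve against observations. Only the 8-week (1.24%) and 20-week (29.6%) oil contents come from the abstract; the logistic parameters and the two intermediate observations below are illustrative assumptions.

```python
# Scoring a candidate ripening model by R^2 and RMSE, the criteria the study
# used to select the logistic model. Parameters K, r, t0 and the middle two
# observations are hypothetical; only the endpoints echo the abstract.
import math

def logistic(t, K=30.0, r=0.55, t0=15.0):
    # K: asymptotic mesocarp oil %, r: growth rate, t0: inflection week
    return K / (1.0 + math.exp(-r * (t - t0)))

weeks = [8, 12, 16, 20]
observed = [1.24, 5.8, 18.5, 29.6]  # middle two values are hypothetical

pred = [logistic(t) for t in weeks]
mean_obs = sum(observed) / len(observed)
ss_res = sum((o - p) ** 2 for o, p in zip(observed, pred))
ss_tot = sum((o - mean_obs) ** 2 for o in observed)
r_squared = 1 - ss_res / ss_tot
rmse = math.sqrt(ss_res / len(observed))
```

Each of the nine candidate models would be scored this way, and the one with the best (R², RMSE) pair selected.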
Procedia PDF Downloads 312
19413 Decision Analysis Module for Excel
Authors: Radomir Perzina, Jaroslav Ramik
Abstract:
The Analytic Hierarchy Process is a frequently used approach for solving decision-making problems, and a wide range of software programs utilize it. Their main disadvantages are that they are relatively expensive and do not display intermediate calculations. This work introduces a Microsoft Excel add-in called DAME – Decision Analysis Module for Excel. Compared with other computer programs, DAME is free, can work with scenarios or multiple decision makers, and displays intermediate calculations. Users can structure their decision models into three levels – scenarios/users, criteria, and variants. Items on all levels can be evaluated either by weights or by pair-wise comparisons. Three different methods are provided for evaluating the weights of the criteria, the variants, and the scenarios – Saaty's Method, the Geometric Mean Method, and Fuller's Triangle Method. Multiplicative and additive syntheses are supported. The proposed software package is demonstrated on a couple of illustrative examples of real-life decision problems.
Keywords: analytic hierarchy process, multi-criteria decision making, pair-wise comparisons, Microsoft Excel, scenarios
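Of the three weighting methods DAME offers, the Geometric Mean Method is the simplest to sketch: each criterion's weight is the geometric mean of its row in the pairwise comparison matrix, normalized to sum to one. The 3-criterion matrix below is an illustrative example on Saaty's 1-9 scale, not from the paper.

```python
# Geometric Mean Method for deriving priority weights from a pairwise
# comparison matrix, one of the three methods DAME provides. The example
# matrix is illustrative.
import math

# A[i][j] = how many times criterion i is preferred to criterion j
# (reciprocal matrix on Saaty's 1-9 scale).
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]

def geometric_mean_weights(matrix):
    # geometric mean of each row, then normalise to a weight vector
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

weights = geometric_mean_weights(A)
```

Saaty's Method would instead take the principal eigenvector of A; for consistent matrices the two coincide.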
Procedia PDF Downloads 452
19412 Methodologies, Systems Development Life Cycle and Modeling Languages in Agile Software Development
Authors: I. D. Arroyo
Abstract:
This article seeks to integrate different concepts from contemporary software engineering with an agile development approach. We seek to clarify some definitions and uses; we distinguish between the Systems Development Life Cycle (SDLC) and methodologies; and we differentiate types of frameworks, such as methodological, philosophical, and behavioral frameworks, standards, and documentation. We define relationships based on the documentation of the development process through formal and ad hoc models, and we describe the usefulness of using DevOps and Agile Modeling as integrative methodologies of principles and best practices.
Keywords: methodologies, modeling languages, agile modeling, UML
Procedia PDF Downloads 185
19411 Modeling, Analysis, and Optimization of Process Parameters of Metal Spinning
Authors: B. Ravi Kumar, S. Gajanana, K. Hemachandra Reddy, K. Udayani
Abstract:
Metal spinning deforms sheet metal physically into various shapes and sizes under the effect of externally applied forces. The spinning process is an advanced plastic working technology and is frequently used for manufacturing axisymmetric shapes. Over the last few decades, sheet metal spinning has developed significantly, and spun products are widely used in various industries. Nowadays the process has expanded to new horizons in industry, given the tendency to minimize tool and equipment costs and to use lower forces while still obtaining excellent surface quality and good mechanical properties. The automation of the process is of great importance due to its wide range of applications, such as decorative household goods, rocket nose cones, and gas cylinders. This paper aims to gain insight into the conventional spinning process by employing experimental and numerical methods. The present work proposes an approach for optimizing the process parameters, namely mandrel speed (rpm), roller nose radius (mm), and sheet thickness (mm), in the spinning of aluminum (2024-T3), with forming force, surface roughness, and strain as the responses, using DOE-Response Surface Methodology (RSM) and analysis of variance (ANOVA). FEA software is used for modeling and analysis with the process parameters considered in the experimentation.
Keywords: FEA, RSM, process parameters, sheet metal spinning
Procedia PDF Downloads 319
19410 Algorithms Inspired from Human Behavior Applied to Optimization of a Complex Process
Authors: S. Curteanu, F. Leon, M. Gavrilescu, S. A. Floria
Abstract:
Optimization algorithms inspired by human behavior, associated with neural network models, were applied in this approach. The algorithms belong to two classes: human behaviors of learning and cooperation, and human competitive behavior. For the first class, the main strategies include random learning, individual learning, and social learning, and the selected algorithms are simplified human learning optimization (SHLO), social learning optimization (SLO), and teaching-learning based optimization (TLBO). For the second class, the concept of learning is associated with competitiveness, and the selected algorithms are sports-inspired algorithms (the Football Game Algorithm, FGA, and Volleyball Premier League, VPL) and the Imperialist Competitive Algorithm (ICA). A real process, the synthesis of polyacrylamide-based multicomponent hydrogels, where some parameters are difficult to obtain experimentally, is considered as a case study. Reaction yield and swelling degree are predicted as functions of the reaction conditions (acrylamide concentration, initiator concentration, crosslinking agent concentration, temperature, reaction time, and amount of inclusion polymer, which could be starch, poly(vinyl alcohol), or gelatin). The experimental results comprise 175 data points. Artificial neural networks were obtained in optimal form with the biologically inspired algorithms, the optimization being performed at two levels: structural and parametric. Feedforward neural networks with one or two hidden layers and no more than 25 neurons in the intermediate layers were obtained, with correlation coefficients in the validation phase over 0.90. The best results were obtained with the TLBO algorithm, the correlation coefficient being 0.94 for an MLP(6:9:20:2), a feedforward neural network with two hidden layers of 9 and 20 intermediate neurons, respectively. The good results obtained prove the efficiency of the optimization algorithms.
Beyond the good results themselves, what is important in this approach is the simulation methodology, combining neural networks with biologically inspired optimization algorithms, which provides satisfactory results. In addition, the methodology developed in this approach is general and flexible, so it can easily be adapted to other processes in association with different types of models.
Keywords: artificial neural networks, human behaviors of learning and cooperation, human competitive behavior, optimization algorithms
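TLBO, the best-performing algorithm in the study, has a simple two-phase structure that can be sketched in full: a teacher phase pulls learners toward the current best solution and away from the population mean, and a learner phase lets pairs of learners improve from each other. The sketch below minimizes a toy sphere function rather than the hydrogel neural-network objective of the paper.

```python
# Minimal teaching-learning-based optimization (TLBO) on a toy sphere
# function. The study applied TLBO to neural-network training for hydrogel
# synthesis; this sketch only shows the algorithm's two phases.
import random

random.seed(0)
DIM, POP, ITERS = 5, 20, 100

def sphere(x):
    return sum(v * v for v in x)

def clip(v):
    return max(-5.0, min(5.0, v))

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]

for _ in range(ITERS):
    # Teacher phase: move each learner toward the best solution,
    # shifted by the (scaled) population mean.
    teacher = min(pop, key=sphere)
    mean = [sum(x[d] for x in pop) / POP for d in range(DIM)]
    for i, x in enumerate(pop):
        tf = random.choice((1, 2))  # teaching factor
        cand = [clip(x[d] + random.random() * (teacher[d] - tf * mean[d]))
                for d in range(DIM)]
        if sphere(cand) < sphere(x):  # greedy acceptance
            pop[i] = cand
    # Learner phase: each learner interacts with a random classmate,
    # moving toward it if worse, away from it if better.
    for i, x in enumerate(pop):
        j = random.randrange(POP)
        if j == i:
            continue
        sign = 1 if sphere(x) < sphere(pop[j]) else -1
        cand = [clip(x[d] + sign * random.random() * (x[d] - pop[j][d]))
                for d in range(DIM)]
        if sphere(cand) < sphere(x):
            pop[i] = cand

best = min(pop, key=sphere)
```

A notable design feature, relevant to its good showing here, is that TLBO has no algorithm-specific tuning parameters beyond population size and iteration count.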
Procedia PDF Downloads 107
19409 3D Point Cloud Model Color Adjustment by Combining Terrestrial Laser Scanner and Close Range Photogrammetry Datasets
Authors: M. Pepe, S. Ackermann, L. Fregonese, C. Achille
Abstract:
3D models obtained with advanced survey techniques such as close-range photogrammetry and laser scanning are nowadays particularly appreciated in the Cultural Heritage and Archaeology fields. In order to produce high-quality models representing archaeological evidence and anthropological artifacts, the appearance of the model (i.e., its color), beyond the geometric accuracy, is not a negligible aspect. The integration of close-range photogrammetry survey techniques with the laser scanner is still a topic of study and research. Combining point cloud data sets of the same object generated with both technologies, or with the same technology but registered at different moments and/or under different natural light conditions, can produce a final point cloud with accentuated color dissimilarities. In this paper, a methodology to make the different data sets uniform, to improve the chromatic quality, and to highlight further details by balancing the point color will be presented.
Keywords: color models, cultural heritage, laser scanner, photogrammetry
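A common baseline for removing the global chromatic offset between two such data sets is per-channel statistics transfer: shift and scale each color channel of one cloud so its mean and spread match the reference cloud. This is only an illustrative baseline, not the balancing methodology of the paper; the RGB values below are made up.

```python
# Per-channel mean/std colour transfer, a simple baseline for harmonising
# colour between two point-cloud data sets (e.g. scanner vs. photogrammetry).
# The paper's balancing method is more elaborate; data here are illustrative.

def channel_stats(colors, ch):
    vals = [c[ch] for c in colors]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean, var ** 0.5

def match_colors(source, reference):
    # remap each source channel to the reference channel's mean and spread
    out = []
    for r, g, b in source:
        pixel = []
        for ch, v in enumerate((r, g, b)):
            ms, ss = channel_stats(source, ch)
            mr, sr = channel_stats(reference, ch)
            scale = sr / ss if ss > 1e-12 else 1.0
            pixel.append((v - ms) * scale + mr)
        out.append(tuple(pixel))
    return out

scanner = [(100, 90, 80), (120, 110, 100), (80, 70, 60)]     # darker cloud
photo = [(140, 130, 120), (160, 150, 140), (120, 110, 100)]  # brighter cloud
adjusted = match_colors(scanner, photo)
```

After transfer, the adjusted cloud's per-channel means match the reference, so the merged model no longer shows a global brightness seam between the two surveys.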
Procedia PDF Downloads 280
19408 Effects of Machining Parameters on the Surface Roughness and Vibration of the Milling Tool
Authors: Yung C. Lin, Kung D. Wu, Wei C. Shih, Jui P. Hung
Abstract:
High-speed and high-precision machining have become the most important technologies in the manufacturing industry. The surface roughness of high-precision components is regarded as an important characteristic of product quality. However, machining chatter can damage the machined surface and limits process efficiency; therefore, the selection of appropriate cutting conditions is important to prevent the occurrence of chatter. In addition, vibration of the spindle tool also affects the surface quality, which implies that surface precision can be controlled by monitoring the vibration of the spindle tool. Based on this concept, this study investigates the influence of the machining conditions on the surface roughness and the vibration of the spindle tool. To this end, a series of machining tests was conducted on aluminum alloy. In the tests, the vibration of the spindle tool was measured using acceleration sensors, and the surface roughness of the machined parts was examined using a white light interferometer. The response surface methodology (RSM) was employed to establish mathematical models for predicting surface finish and tool vibration, respectively. The correlation between surface roughness and spindle tool vibration was also analyzed by ANOVA. In the machining tests, machined surfaces with or without chatter were marked on the lobe diagram as verification of the machining conditions. Using multivariable regression analysis, the mathematical models for predicting the surface roughness and tool vibration were developed from the machining parameters: cutting depth (a), feed rate (f), and spindle speed (s). The predicted roughness agrees well with the measured roughness, with an average error of 10%. The average error of the tool vibrations between the measurements and the predictions of the mathematical model is about 7.39%.
In addition, the tool vibration under various machining conditions was found to correlate positively with the surface roughness (r = 0.78). In conclusion, mathematical models were successfully developed for predicting the surface roughness and the vibration level of the spindle tool under different cutting conditions, which can help to select appropriate cutting parameters and to monitor machining conditions so as to achieve high surface quality in milling operations.
Keywords: machining parameters, machining stability, regression analysis, surface roughness
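The multivariable regression step above can be sketched directly: assemble a design matrix from cutting depth a, feed rate f, and spindle speed s, solve by least squares, and report the average percentage error between predicted and "measured" roughness. The data and coefficients below are synthetic stand-ins for the study's measured milling data.

```python
# Fitting a multivariable regression model for surface roughness from cutting
# depth a, feed rate f and spindle speed s, as in the study. Data here are
# synthetic; the paper fitted RSM models to measured milling results.
import numpy as np

rng = np.random.default_rng(2)

n = 30
a = rng.uniform(0.5, 3.0, n)     # cutting depth (mm)
f = rng.uniform(0.05, 0.3, n)    # feed rate (mm/tooth)
s = rng.uniform(4000, 12000, n)  # spindle speed (rpm)

# Synthetic "measured" roughness: rises with depth and feed, falls with speed.
ra = 0.4 + 0.25 * a + 3.0 * f - 2e-5 * s + rng.normal(0, 0.01, n)

X = np.column_stack([np.ones(n), a, f, s])     # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, ra, rcond=None)  # least-squares fit
pred = X @ coef
mean_err = np.mean(np.abs(pred - ra) / ra) * 100  # average % error
```

A full RSM model would add quadratic and interaction terms (a², af, fs, ...) as extra columns of X; the fitting step is identical.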
Procedia PDF Downloads 231
19407 Prediction of Permeability of Frozen Unsaturated Soil Using Van Genuchten Model and Fredlund-Xing Model in Soil Vision
Authors: Bhavita S. Dave, Jaimin Vaidya, Chandresh H. Solanki, Atul K.
Abstract:
To measure the permeability of a soil specimen, one of the basic assumptions of Darcy's law is that the soil sample is saturated. Unlike for saturated soils, the permeability of unsaturated soils cannot be found using conventional methods, as it does not follow Darcy's law. Many empirical models, such as the Van Genuchten model and the Fredlund-Xing model, have been suggested to predict permeability values for unsaturated soil. Such models use data from the soil-freezing characteristic curve to find fitting parameters for frozen unsaturated soils. In this study, soil specimens were subjected to 0, 1, 3, and 5 freezing-thawing (F-T) cycles at different degrees of saturation, so as to cover a wide range of suction, and soil-freezing characteristic curves were formulated for all F-T cycles. Changes in the fitting parameters and in relative permeability with subsequent F-T cycles are presented in this paper for both models.
Keywords: frozen unsaturated soil, Fredlund-Xing model, soil-freezing characteristic curve, Van Genuchten model
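The Van Genuchten model named above has a closed form worth writing out: effective saturation Se = [1 + (α|h|)ⁿ]⁻ᵐ with m = 1 − 1/n, and (via Mualem's model) relative permeability kr = Se^½ [1 − (1 − Se^(1/m))ᵐ]². The α and n values below are illustrative, not the parameters fitted to the tested specimens.

```python
# Van Genuchten retention curve and the Van Genuchten-Mualem closed form for
# relative permeability of unsaturated soil. Parameters alpha and n are
# illustrative placeholders, not the study's fitted values.

def effective_saturation(suction, alpha=0.05, n=2.0):
    # Se = [1 + (alpha*|h|)^n]^(-m), with m = 1 - 1/n
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * abs(suction)) ** n) ** (-m)

def relative_permeability(suction, alpha=0.05, n=2.0):
    # Mualem form: kr = Se^0.5 * [1 - (1 - Se^(1/m))^m]^2
    m = 1.0 - 1.0 / n
    se = effective_saturation(suction, alpha, n)
    return se ** 0.5 * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

kr_saturated = relative_permeability(0.0)  # zero suction: fully saturated
kr_dry = relative_permeability(500.0)      # high suction: kr drops sharply
```

Fitting α and n to the soil-freezing characteristic curve, as the study does per F-T cycle, then yields the full permeability-suction relationship for each cycle.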
Procedia PDF Downloads 189
19406 Comparison of Solar Radiation Models
Authors: O. Behar, A. Khellaf, K. Mohammedi, S. Ait Kaci
Abstract:
Up to now, most validation studies have been based on the MBE and RMSE and have therefore focused only on long- and short-term performance to test and classify solar radiation models. This traditional analysis does not take into account the quality of modeling or linearity. In our analysis, we have tested 22 solar radiation models that are capable of providing instantaneous direct and global radiation at any given location worldwide. We introduce a new indicator, which we have named the Global Accuracy Indicator (GAI), to examine the linear relationship between the measured and predicted values and the quality of modeling, in addition to long- and short-term performance. Note that the quality of a model is represented by the t-statistical test, the model linearity is given by the correlation coefficient, and the long- and short-term performance are given by the MBE and RMSE, respectively. An important finding of this research is that the use of GAI avoids faulty validation when using the traditional methodology, which might result in erroneous prediction of the performance of solar power conversion systems.
Keywords: solar radiation model, parametric model, performance analysis, Global Accuracy Indicator (GAI)
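The four ingredients the GAI combines are standard and easy to compute from measured/predicted pairs: MBE and RMSE for long- and short-term performance, the correlation coefficient for linearity, and a t-statistic for modeling quality. The irradiance values below are illustrative, and how GAI aggregates the four values is defined in the paper, not reproduced here; the t-statistic form shown (Stone's) is a common choice in solar-model validation, assumed rather than taken from the abstract.

```python
# The four statistics behind a GAI-style validation of a solar radiation
# model: MBE, RMSE, correlation coefficient r, and a t-statistic. Data are
# illustrative measured/predicted irradiance pairs (W/m^2).
import math

measured = [310.0, 480.0, 650.0, 720.0, 590.0, 400.0]
predicted = [295.0, 470.0, 660.0, 700.0, 605.0, 410.0]
n = len(measured)

errors = [p - m for p, m in zip(predicted, measured)]
mbe = sum(errors) / n                              # long-term bias
rmse = math.sqrt(sum(e * e for e in errors) / n)   # short-term scatter

mean_m = sum(measured) / n
mean_p = sum(predicted) / n
cov = sum((m - mean_m) * (p - mean_p) for m, p in zip(measured, predicted))
var_m = sum((m - mean_m) ** 2 for m in measured)
var_p = sum((p - mean_p) ** 2 for p in predicted)
r = cov / math.sqrt(var_m * var_p)                 # linearity

# Stone's t-statistic, a common model-quality test in this literature.
t_stat = math.sqrt((n - 1) * mbe ** 2 / (rmse ** 2 - mbe ** 2))
```

A model can look good on MBE alone (positive and negative errors cancel) while failing on RMSE, r, or the t-test, which is exactly the blind spot a combined indicator addresses.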
Procedia PDF Downloads 350
19405 Advancements in Laser Welding Process: A Comprehensive Model for Predictive Geometrical, Metallurgical, and Mechanical Characteristics
Authors: Seyedeh Fatemeh Nabavi, Hamid Dalir, Anooshiravan Farshidianfar
Abstract:
Laser welding is pivotal in modern manufacturing, offering unmatched precision, speed, and efficiency. Its versatility in minimizing heat-affected zones, seamlessly joining dissimilar materials, and working with various metals makes it indispensable for crafting intricate automotive components. Integration into automated systems ensures consistent delivery of high-quality welds, thereby enhancing overall production efficiency. Noteworthy are the safety benefits of laser welding, including reduced fumes and consumable materials, which align with industry standards and environmental sustainability goals. As the automotive sector increasingly demands advanced materials and stringent safety and quality standards, laser welding emerges as a cornerstone technology. A comprehensive model encompassing thermal-dynamic and characteristic sub-models accurately predicts the geometrical, metallurgical, and mechanical aspects of the laser beam welding process. Notably, Model 2 showcases exceptional accuracy, achieving remarkably low error rates in predicting primary and secondary dendrite arm spacing (PDAS and SDAS). These findings underscore the model's reliability and effectiveness, providing invaluable insights and predictive capabilities crucial for optimizing welding processes and ensuring superior productivity, efficiency, and quality in the automotive industry.
Keywords: laser welding process, geometrical characteristics, mechanical characteristics, metallurgical characteristics, comprehensive model, thermal dynamic
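For context on the dendrite-arm-spacing quantities the model predicts, the classical empirical solidification relation SDAS = A * R^(-n) (spacing shrinks as the cooling rate R grows) can be sketched; this is a generic textbook relation with placeholder coefficients, not the paper's Model 2:

```python
def dendrite_arm_spacing(cooling_rate, coeff=50.0, exponent=0.33):
    """Empirical relation SDAS = A * R^(-n): secondary dendrite arm
    spacing (in um) decreases as the cooling rate R (K/s) increases.
    coeff and exponent are illustrative placeholders, not fitted values."""
    return coeff * cooling_rate ** (-exponent)
```

A predictive welding model effectively has to capture this kind of cooling-rate dependence from its thermal sub-model.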
Procedia PDF Downloads 48
19404 Neuroevolution Based on Adaptive Ensembles of Biologically Inspired Optimization Algorithms Applied for Modeling a Chemical Engineering Process
Authors: Sabina-Adriana Floria, Marius Gavrilescu, Florin Leon, Silvia Curteanu, Costel Anton
Abstract:
Neuroevolution is a subfield of artificial intelligence used to solve various problems in different application areas. Specifically, neuroevolution is a technique that applies biologically inspired methods to generate neural network architectures and optimize their parameters automatically. In this paper, we use different biologically inspired optimization algorithms in an ensemble strategy with the aim of training multilayer perceptron neural networks, resulting in regression models used to simulate the industrial chemical process of obtaining bricks from silicone-based materials. Installations in the raw ceramics (brick) industry are characterized by significant energy consumption and large quantities of emissions. In addition, the initial conditions taken into account during the design and commissioning of the installation can change over time, which leads to the need to add new mixes to adjust the operating conditions for the desired purpose, e.g., material properties and energy saving. The present approach studies, by simulation, the process of obtaining bricks from silicone-based materials, i.e., the modeling and optimization of the process. Optimization aims to determine the working conditions that minimize the emissions, represented by nitrogen monoxide. We first use a search procedure to find the best values for the parameters of the various biologically inspired optimization algorithms. Then, we propose an adaptive ensemble strategy that uses only a subset of the best algorithms identified in the search stage. The adaptive ensemble strategy combines the results of the selected algorithms and automatically assigns more processing capacity to the more efficient algorithms. Their efficiency may also vary at different stages of the optimization process. In a given ensemble iteration, the most efficient algorithms aim to maintain good convergence, while the less efficient algorithms can improve population diversity.
The proposed adaptive ensemble strategy outperforms the individual optimizers and the non-adaptive ensemble strategy in convergence speed, and the obtained results yield lower error values.
Keywords: optimization, biologically inspired algorithm, neuroevolution, ensembles, bricks, emission minimization
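The capacity-assignment idea behind the adaptive ensemble can be sketched as follows; the proportional allocation rule and its parameters are illustrative assumptions, not the authors' exact scheme:

```python
def allocate_capacity(improvements, budget, floor=1):
    """Split a fixed evaluation budget among optimizers in proportion to
    their recent fitness improvement, keeping a minimum share so weaker
    optimizers still contribute population diversity."""
    n = len(improvements)
    total = sum(improvements)
    if total == 0:  # no progress anywhere: split the budget evenly
        shares = [budget // n] * n
        shares[0] += budget - sum(shares)
        return shares
    shares = [max(floor, int(budget * imp / total)) for imp in improvements]
    # hand any leftover budget to the currently most efficient optimizer
    best = improvements.index(max(improvements))
    shares[best] += budget - sum(shares)
    return shares

# E.g., three optimizers with recent improvements 5.0, 3.0, 0.0
# sharing 100 evaluations per ensemble iteration:
print(allocate_capacity([5.0, 3.0, 0.0], 100))
```

Re-running the allocation each ensemble iteration lets the capacity follow the optimizers' time-varying efficiency, as described in the abstract.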
Procedia PDF Downloads 116
19403 Statistical Inferences for the GQARCH-Itô-Jumps Model Based on the Realized Range Volatility
Authors: Fu Jinyu, Lin Jinguan
Abstract:
This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion used to model high-frequency data, and the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the “GQARCH-Itô-Jumps model.” We adopt realized range-based threshold estimation for high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for the parametric estimate. The asymptotic theories are established mainly for the proposed estimators in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite-sample performance of the proposed methodology. Specifically, we demonstrate how our proposed approaches can be practically applied to some financial data.
Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate
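A realized range-based estimator with a jump-filtering threshold can be sketched roughly as follows; this is a generic Parkinson-scaled illustration of the idea, not the paper's exact threshold estimator:

```python
import math

def realized_range(highs, lows, threshold=float("inf")):
    """Daily realized range volatility: sum of squared log high-low ranges
    over intraday intervals, scaled by 4*log(2) (Parkinson scaling).
    Intervals whose squared range exceeds `threshold` are discarded as
    jump candidates. An illustrative sketch only."""
    total = 0.0
    for h, l in zip(highs, lows):
        r2 = math.log(h / l) ** 2
        if r2 <= threshold:  # threshold filters suspected jump intervals
            total += r2
    return total / (4.0 * math.log(2.0))
```

Unlike a return-based estimator built only on interval open/close prices, the high-low range retains intra-day information on the price path.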
Procedia PDF Downloads 155
19402 Analysis of the Interference from Risk-Determining Factors of Cooperative and Conventional Construction Contracts
Authors: E. Harrer, M. Mauerhofer, T. Werginz
Abstract:
As a result of intensive competition, the building sector suffers from a high degree of rivalry. Furthermore, an unbalanced distribution of project risks can be observed: clients aim to shift their own risks into the sphere of the constructors or planners. The consequence is that the number of conflicts between the involved parties is inordinately high and even increasing; cooperative project forms are an alternative approach in the construction sector to counter these developments. This research compares conventional contract models with models based on partnering agreements to examine the influence of an early integration of the involved parties on project risks. The goal is to show deviations in the different project stages, from the design phase to the project transfer phase. These deviations are evaluated through a survey of experts from the three spheres: clients, contractors, and planners. By rating the influence of the participants on specific risk factors, it is possible to identify factors that are relevant for a smooth project execution.
Keywords: building projects, contract models, partnering, project risks
Procedia PDF Downloads 274
19401 Characteristics of Business Models of Industrial-Internet-of-Things Platforms
Authors: Peter Kress, Alexander Pflaum, Ulrich Loewen
Abstract:
The number of Internet-of-Things (IoT) platforms is steadily increasing across various industries, especially for smart factories, smart homes, and smart mobility. In the manufacturing industry, too, the number of Industrial-IoT platforms is growing. IT players, start-ups, and increasingly also established industry players and small and medium-sized enterprises introduce offerings for the connection of industrial equipment on platforms, enabled by advanced information and communication technology. Besides the offered functionalities, the established ecosystem of partners around a platform is one of the key differentiators for generating a competitive advantage. The key question is how platform operators design the business model around their platform to attract a high number of customers and partners to co-create value for the entire ecosystem. The present research tries to answer this question by determining the key characteristics of the business models of successful platforms in the manufacturing industry. To achieve this, the authors selected an explorative qualitative research approach and created an inductive comparative case study. The authors generated valuable descriptive insights into the business model elements (e.g., value proposition, pricing model, or partnering model) of various established platforms. Furthermore, patterns across the various cases were identified to derive propositions for the successful design of business models of platforms in the manufacturing industry.
Keywords: industrial-internet-of-things, business models, platforms, ecosystems, case study
Procedia PDF Downloads 243
19400 Modelling Social Influence and Cultural Variation in Global Low-Carbon Vehicle Transitions
Authors: Hazel Pettifor, Charlie Wilson, David Mccollum, Oreane Edelenbosch
Abstract:
Vehicle purchase is a technology adoption decision that will strongly influence future energy and emission outcomes. Global integrated assessment models (IAMs) provide valuable insights into the medium- and long-term effects of socio-economic development, technological change, and climate policy. In this paper, we present a unique and transparent approach for improving the behavioural representation of these models by incorporating social influence effects to represent consumer choice more accurately. This work draws together strong conceptual thinking and robust empirical evidence to introduce heterogeneous and interconnected consumers who vary in their aversion to new technologies. Focusing on vehicle choice, we conduct novel empirical research to parameterise consumer risk aversion and how it is shaped by social and cultural influences. We find robust evidence for social influence effects, and variation between countries as a function of cultural differences. We then formulate an approach to modelling social influence that is implementable in both simulation- and optimisation-type models. We use two global integrated assessment models (IMAGE and MESSAGE) to analyse four scenarios that introduce social influence and cultural differences between regions. These scenarios allow us to explore the interactions between consumer preferences and social influence. We find that incorporating social influence effects into global models accelerates the early deployment of electric vehicles and stimulates more widespread deployment across adopter groups. Incorporating cultural variation leads to significant differences in deployment between culturally divergent regions such as the USA and China. Our analysis significantly extends the ability of global integrated assessment models to provide policy-relevant analysis grounded in real-world processes.
Keywords: behavioural realism, electric vehicles, social influence, vehicle choice
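The social influence mechanism can be illustrated with a stylized threshold-adoption sketch: heterogeneous consumers adopt once the adoption share around them exceeds their individual aversion. This is a toy model of the mechanism only, not the actual IMAGE/MESSAGE implementation:

```python
def simulate_adoption(aversions, initial_adopters, steps):
    """Threshold model of social influence: a consumer adopts once the
    population adoption share exceeds their individual risk aversion.
    Returns the final adoption share."""
    n = len(aversions)
    adopted = [i in initial_adopters for i in range(n)]
    for _ in range(steps):
        share = sum(adopted) / n
        adopted = [a or (share > av) for a, av in zip(adopted, aversions)]
    return sum(adopted) / n

# Five consumers with heterogeneous aversion thresholds and one early
# adopter: adoption cascades through the less risk-averse consumers first.
final_share = simulate_adoption([0.05, 0.1, 0.2, 0.4, 0.9], {0}, steps=3)
```

The distribution of aversion thresholds is where cultural variation between regions would enter such a sketch.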
Procedia PDF Downloads 187
19399 Description of Decision Inconsistency in Intertemporal Choices and Representation of Impatience as a Reflection of Irrationality: Consequences in the Field of Personalized Behavioral Finance
Authors: Roberta Martino, Viviana Ventre
Abstract:
Empirical evidence has, over time, confirmed that the behavior of individuals is inconsistent with the descriptions provided by the Discounted Utility Model, an essential reference for calculating the utility of intertemporal prospects. The model assumes that individuals calculate the utility of an intertemporal prospect by adding up the values of all outcomes, each obtained by multiplying the cardinal utility of the outcome by the discount function evaluated at the time the outcome is received. The shape of the discount function is crucial for the preferences of the decision maker because it represents the perception of the future and determines whether preferences are temporally consistent or temporally inconsistent. In particular, because different formulations of the discount function lead to different conclusions in predicting choice, the descriptive ability of models with a hyperbolic trend is greater than that of linear or exponential models. Suboptimal choices, from any temporal point of view, are the consequence of this mechanism, whose psychological factors are encapsulated in the trend of the discount rate. In addition, analyzing the decision-making process from a psychological perspective, there is an equivalence between the selection of dominated prospects and a degree of impatience that decreases over time. The first part of the paper describes and investigates the anomalies of the Discounted Utility Model by relating the cognitive distortions of the decision maker to the emotional factors generated during the evaluation and selection of alternatives. Specifically, by studying the degree to which impatience decreases, it is possible to quantify how the psychological and emotional mechanisms of the decision maker result in a lack of decision persistence. In addition, this description presents inconsistency as the consequence of an inconsistent attitude towards time-delayed choices.
The second part of the paper presents an experimental phase in which we show the relationship between inconsistency and impatience in different contexts. Analysis of the degree to which impatience decreases confirms the influence of the decision maker's emotional impulses for each anomaly of the utility model discussed in the first part of the paper. This work provides an application in the field of personalized behavioral finance. Indeed, the numerous behavioral diversities, evident even in the degrees of decrease in impatience observed in the experimental phase, support the idea that optimal strategies may not satisfy individuals in the same way. With the aim of homogenizing the categories of investors and providing a personalized approach to advice, the results proven in the experimental phase are used, in a complementary way, with information from the field of behavioral finance to implement the Analytic Hierarchy Process model in intertemporal choices, useful for strategic personalization. In the construction of the Analytic Hierarchy Process, the degree of decrease in impatience is understood as reflecting irrationality in decision-making and is therefore used for the construction of weights between anomalies and behavioral traits.
Keywords: analytic hierarchy process, behavioral finance, financial anomalies, impatience, time inconsistency
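The preference reversal that hyperbolic (but not exponential) discounting produces can be checked with a short worked example; the payoffs and discount parameter below are illustrative, not taken from the experimental phase:

```python
def exponential(delta, t):
    """Exponential discount factor delta**t: the ratio between two dated
    rewards is constant, so preferences never reverse over time."""
    return delta ** t

def hyperbolic(k, t):
    """Hyperbolic discount factor 1/(1+k*t): discounting is steeper for
    short delays, so rankings can flip as a reward draws near."""
    return 1.0 / (1.0 + k * t)

k = 1.0
# Planning far ahead: 100 in 10 periods vs. 120 in 11 periods.
far = (100 * hyperbolic(k, 10), 120 * hyperbolic(k, 11))
# When the moment arrives: 100 now vs. 120 in 1 period.
near = (100 * hyperbolic(k, 0), 120 * hyperbolic(k, 1))
# The larger-later reward wins from a distance, the smaller-sooner one
# wins up close: a temporally inconsistent preference.
```

This reversal, the ranking flipping as the delay shrinks, is exactly the decreasing impatience the abstract treats as a marker of inconsistency.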
Procedia PDF Downloads 68
19398 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana
Authors: Gautier Viaud, Paul-Henry Cournède
Abstract:
Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take into account genotype-by-environment interactions. One way of achieving such an objective is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model describes the phenotype of the plant as a function of individual parameters; the second level describes how these individual parameters are distributed within a plant population; the third level corresponds to the attribution of priors on the population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters makes it possible to derive analytical expressions for the full conditional distributions of these population parameters. As plant growth models are of a nonlinear nature, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed. This allows for the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state-space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax, metaprogramming capabilities, and high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale GreenLab model for the latter is thus presented, in which the surface area of each individual leaf can be simulated. It is assumed that the error made in the measurement of leaf areas is proportional to the leaf area itself; multiplicative normal noises for the observations are therefore used.
Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days, using a two-step segmentation and tracking algorithm that notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, there is no need for the data for a single individual to be available at all times, nor for the times at which data are available to be the same for all the different individuals. This makes it possible to discard data from image analysis when they are not considered reliable enough, thereby providing low-biased data in large quantity for leaf areas. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana's growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects.
Keywords: bayesian, genotypic differentiation, hierarchical models, plant growth models
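The Metropolis step used inside the Gibbs sweep for the non-conjugate individual parameters can be sketched generically as follows; this is a minimal random-walk version in Python for illustration, not the authors' Julia implementation:

```python
import math
import random

def metropolis_step(theta, log_post, scale=0.5):
    """One random-walk Metropolis update for an individual parameter whose
    full conditional (here `log_post`, the log-density up to a constant)
    cannot be sampled directly, as with nonlinear growth models. Used as
    the Metropolis component of a hybrid Gibbs-Metropolis sampler."""
    proposal = theta + random.gauss(0.0, scale)
    # accept with probability min(1, posterior ratio)
    if math.log(random.random()) < log_post(proposal) - log_post(theta):
        return proposal
    return theta
```

In the hierarchical sampler, steps like this for the individual parameters alternate with exact Gibbs draws of the population parameters from their analytical full conditionals.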
Procedia PDF Downloads 303
19397 Application of Data Mining Techniques for Tourism Knowledge Discovery
Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee
Abstract:
Five implementations of three data mining classification techniques were tested for extracting important insights from tourism data. The aim was to find the best-performing algorithm among the compared ones for tourism knowledge discovery. The knowledge-discovery-from-data process was used as the process model. The 10-fold cross-validation method was used for testing purposes. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models of the selected algorithms were built with different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was Random Forest (76%) before applying information-gain-based attribute selection, and J48 (C4.5) (75%) after selecting the attributes most relevant to the class (target) attribute. In terms of model-building time, attribute selection improves the efficiency of all algorithms. The Artificial Neural Network (multilayer perceptron) showed the highest improvement (90%). The rules extracted from the decision tree model are presented; they show intricate, non-trivial knowledge/insights that would otherwise not be discovered by simple statistical analysis, even though the accuracy achieved by the classification algorithms is mediocre.
Keywords: classification algorithms, data mining, knowledge discovery, tourism
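The 10-fold cross-validation used for testing can be sketched without any library dependency as a simple index-splitting routine (a minimal sketch of the procedure, not the tool the authors used):

```python
def kfold_indices(n, k=10):
    """Split n sample indices into k roughly equal, disjoint test folds.
    Returns (train_indices, test_indices) pairs; each sample appears in
    exactly one test fold, so every model is evaluated on held-out data."""
    folds = [[] for _ in range(k)]
    for i in range(n):
        folds[i % k].append(i)  # round-robin assignment to folds
    return [(sorted(set(range(n)) - set(fold)), fold) for fold in folds]
```

Averaging a classifier's accuracy over the k held-out folds gives scores like the 76% and 75% reported for Random Forest and J48.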
Procedia PDF Downloads 295
19396 Quantitative Structure-Property Relationship Study of Base Dissociation Constants of Some Benzimidazoles
Authors: Sanja O. Podunavac-Kuzmanović, Lidija R. Jevrić, Strahinja Z. Kovačević
Abstract:
Benzimidazoles are a group of compounds with significant antibacterial, antifungal, and anticancer activity. The studied compounds consist of the main benzimidazole structure with different combinations of substituents. This study is based on two-dimensional and three-dimensional molecular modeling and the calculation of molecular descriptors (physicochemical and lipophilicity descriptors) of structurally diverse benzimidazoles. Molecular modeling was carried out using ChemBio3D Ultra version 14.0 software. The obtained 3D models were subjected to energy minimization using the molecular mechanics force field method (MM2). The cutoff for structure optimization was set at a gradient of 0.1 kcal/(Å·mol). The obtained set of molecular descriptors was used in a principal component analysis (PCA) of possible similarities and dissimilarities among the studied derivatives. After the molecular modeling, quantitative structure-property relationship (QSPR) analysis was applied in order to obtain mathematical models that can be used to predict the pKb values of structurally similar benzimidazoles. The obtained models are based on statistically valid multiple linear regression (MLR) equations. The calculated cross-validation parameters indicate the high prediction ability of the established QSPR models. This study is financially supported by COST action CM1306 and by project No. 114-451-347/2015-02, financially supported by the Provincial Secretariat for Science and Technological Development of Vojvodina.
Keywords: benzimidazoles, chemometrics, molecular modeling, molecular descriptors, QSPR
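The MLR backbone of such QSPR models (pKb regressed on molecular descriptors) can be sketched via the normal equations; the example below uses synthetic data, not the benzimidazole descriptors from the study:

```python
def fit_mlr(X, y):
    """Ordinary least squares for a QSPR-style model
    y = b0 + b1*x1 + ... + bp*xp, solved via the normal equations
    X'X b = X'y with Gaussian elimination (plain-Python sketch)."""
    rows = [[1.0] + list(x) for x in X]  # prepend intercept column
    p = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    # forward elimination with partial pivoting
    for i in range(p):
        piv = max(range(i, p), key=lambda k: abs(A[k][i]))
        A[i], A[piv] = A[piv], A[i]
        c[i], c[piv] = c[piv], c[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for j in range(i, p):
                A[r][j] -= f * A[i][j]
            c[r] -= f * c[i]
    # back substitution
    b = [0.0] * p
    for i in range(p - 1, -1, -1):
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, p))) / A[i][i]
    return b
```

In a QSPR workflow, the rows of X would hold the selected physicochemical and lipophilicity descriptors and y the experimental pKb values, with cross-validation statistics computed on the fitted coefficients.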
Procedia PDF Downloads 287