Search results for: Performance-based models.
2404 Generation Expansion Planning Strategies on Power System: A Review
Authors: V. Phupha, T. Lantharthong, N. Rugthaicharoencheep
Abstract:
The problem of generation expansion planning (GEP) has been extensively studied for many years. This paper presents three topics in GEP as follows: the statistical model, models for generation expansion, and the expansion problem. In the topic of the statistical model, the main stages of statistical modeling are briefly explained. Selected works on models for GEP are reviewed in the topic of models for generation expansion. Finally, for the topic of the expansion problem, the major issues in the development of a long-term expansion plan are summarized.
Keywords: Generation expansion planning, strategies, power system
2403 Estimation of Missing or Incomplete Data in Road Performance Measurement Systems
Authors: Kristjan Kuhi, Kati K. Kaare, Ott Koppel
Abstract:
Modern management in most fields is performance based; both planning and implementation of maintenance and operational activities are driven by appropriately defined performance indicators. Continuous real-time data collection for management is becoming feasible due to technological advancements. Outdated and insufficient input data may result in incorrect decisions. When deterministic models are used, the uncertainty of the object state is not visible; thus, deterministic models are more likely to give a false diagnosis. Constructing structured probabilistic models of the performance indicators, taking into consideration the surrounding indicator environment, makes it possible to estimate the trustworthiness of the indicator values. It also helps fill gaps in the data to improve the quality of performance analysis and management decisions. In this paper, the authors discuss the application of probabilistic graphical models in road performance measurement and propose a high-level conceptual model that enables analyzing and predicting future pavement deterioration more precisely based on road utilization.
Keywords: Probabilistic graphical models, performance indicators, road performance management, data collection
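As a quick illustration of the inference idea behind the abstract above, here is a minimal sketch of estimating a missing indicator from a related one via Bayes' rule. The indicators ("traffic_load", "rut_depth") and the probability tables are hypothetical and are not the authors' road-performance model.

```python
# Illustrative two-node discrete probabilistic model: estimate a missing indicator
# from a related, observed indicator via Bayes' rule. All names and numbers are
# assumptions for the sketch, not values from the paper.
import numpy as np

# Prior P(traffic_load) and conditional table P(rut_depth | traffic_load),
# e.g. learned from historical measurement records.
p_load = np.array([0.6, 0.4])                  # [low, high]
p_rut_given_load = np.array([[0.8, 0.2],       # load=low  -> [ok, poor]
                             [0.3, 0.7]])      # load=high -> [ok, poor]

def infer_load(rut_observation: int) -> np.ndarray:
    """Posterior P(traffic_load | rut_depth) when the load indicator is missing."""
    joint = p_load * p_rut_given_load[:, rut_observation]
    return joint / joint.sum()

print(infer_load(rut_observation=1))  # posterior after observing a poor rut depth
```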
2402 Numerical Simulation of a Single Air Bubble Rising in Water with Various Models of Surface Tension Force
Authors: Afshin Ahmadi Nadooshan, Ebrahim Shirani
Abstract:
Different numerical methods have been employed and developed for simulating interfacial flows. A large range of applications belongs to this group, e.g. two-phase flows of air bubbles in water or water drops in air. In such problems, surface tension effects often play a dominant role. In this paper, various models of the surface tension force for interfacial flows (the CSF, CSS, PCIL and SGIP models) have been applied to simulate the motion of small air bubbles in water, and the results are compared and reviewed. It is shown that by using the SGIP or PCIL models we are able to simulate bubble rise and obtain results in close agreement with the experimental data.
Keywords: Volume-of-Fluid, Bubble Rising, SGIP model, CSS model, CSF model, PCIL model, interface, surface tension force.
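For reference, a generic finite-difference sketch of the CSF (continuum surface force) idea named above: the body force is F = σκ∇C with curvature κ = -∇·(∇C/|∇C|) computed from a VOF colour function. This is only an illustration under those standard definitions, not the solvers compared in the paper.

```python
# Minimal 2-D sketch of the CSF surface tension force on a VOF colour function C.
import numpy as np

def csf_force(C, sigma, dx, dy, eps=1e-12):
    """Return (Fx, Fy) body-force fields from a VOF colour function C."""
    Cy, Cx = np.gradient(C, dy, dx)            # grad(C); np.gradient is axis-ordered (rows, cols)
    norm = np.sqrt(Cx**2 + Cy**2) + eps
    nx, ny = Cx / norm, Cy / norm              # unit interface normal
    dnxdy, dnxdx = np.gradient(nx, dy, dx)
    dnydy, dnydx = np.gradient(ny, dy, dx)
    kappa = -(dnxdx + dnydy)                   # curvature = -div(n)
    return sigma * kappa * Cx, sigma * kappa * Cy

# Example: circular "bubble" of radius 0.25 in a unit box, sigma = 0.072 N/m (air-water).
x = np.linspace(0.0, 1.0, 101)
X, Y = np.meshgrid(x, x)
C = ((X - 0.5)**2 + (Y - 0.5)**2 < 0.25**2).astype(float)
Fx, Fy = csf_force(C, sigma=0.072, dx=x[1] - x[0], dy=x[1] - x[0])
```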
2401 A Computational Stochastic Modeling Formalism for Biological Networks
Authors: Werner Sandmann, Verena Wolf
Abstract:
Stochastic models of biological networks are well established in systems biology, where the computational treatment of such models is often focused on the solution of the so-called chemical master equation via stochastic simulation algorithms. In contrast, the development of storage-efficient model representations that are directly suitable for computer implementation has received significantly less attention. Instead, a model is usually described in terms of a stochastic process or a "higher-level paradigm" with a graphical representation, such as a stochastic Petri net. A serious problem then arises due to the exponential growth of the model's state space, which is in fact a main reason for the popularity of stochastic simulation, since simulation suffers less from state space explosion than non-simulative numerical solution techniques. In this paper we present transition class models for the representation of biological network models, a compact mathematical formalism that circumvents state space explosion. Transition class models can also serve as an interface between different higher-level modeling paradigms, stochastic processes and the implementation coded in a programming language. Moreover, the compact model representation provides the opportunity to apply non-simulative solution techniques while preserving the possible use of stochastic simulation. Illustrative examples of transition class representations are given for an enzyme-catalyzed substrate conversion and a part of the bacteriophage λ lysis/lysogeny pathway.
Keywords: Computational Modeling, Biological Networks, Stochastic Models, Markov Chains, Transition Class Models.
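A small sketch in the spirit of the formalism above: each transition class is a pair (state-change vector, propensity function), here for the enzyme-catalyzed conversion E + S ⇌ ES → E + P mentioned in the abstract, simulated with Gillespie's direct method. Rate constants and initial counts are illustrative, not taken from the paper.

```python
# Transition-class style representation of E + S <-> ES -> E + P,
# simulated with Gillespie's direct method. State = (E, S, ES, P).
import numpy as np

rng = np.random.default_rng(0)
classes = [
    (np.array([-1, -1, +1, 0]), lambda x: 1e-3 * x[0] * x[1]),  # binding
    (np.array([+1, +1, -1, 0]), lambda x: 1e-1 * x[2]),         # unbinding
    (np.array([+1, 0, -1, +1]), lambda x: 1e-1 * x[2]),         # catalysis
]

def ssa(x0, t_end):
    t, x, path = 0.0, np.array(x0, dtype=int), []
    while t < t_end:
        rates = np.array([a(x) for _, a in classes])
        total = rates.sum()
        if total == 0:
            break
        t += rng.exponential(1.0 / total)                        # time to next event
        change, _ = classes[rng.choice(len(classes), p=rates / total)]
        x = x + change
        path.append((t, x.copy()))
    return path

trajectory = ssa(x0=[100, 500, 0, 0], t_end=50.0)
```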
2400 A New Divide and Conquer Software Process Model
Authors: Hina Gull, Farooque Azam, Wasi Haider Butt, Sardar Zafar Iqbal
Abstract:
A software system goes through a number of stages during its life, and a software process model gives a standard format for planning, organizing and running a project. The article presents a new software development process model named the “Divide and Conquer Process Model”, based on the idea of first dividing the work into simple parts and then gathering them to get the whole work done. The article begins with the background of different software process models and the problems in these models. This is followed by the new divide and conquer process model, an explanation of its different stages, and finally its edge over other models.
Keywords: Process Model, Waterfall, divide and conquer, Requirements.
2399 A Comparison of the Sum of Squares in Linear and Partial Linear Regression Models
Authors: Dursun Aydın
Abstract:
In this paper, the linear regression model is estimated by the ordinary least squares method and the partially linear regression model is estimated by the penalized least squares method using a smoothing spline. Then, the differences and similarities between the sums of squares of the linear regression and partial linear regression (semi-parametric regression) models are investigated. It is shown that the sums of squares in linear regression reduce to the sums of squares in partial linear regression models. Furthermore, we indicate that the various sums of squares in linear regression correspond to different deviance statistics in partial linear regression. In addition, the coefficient of determination derived for the linear regression model is easily generalized to the coefficient of determination of the partial linear regression model. To support these claims, two applications are presented: a simulated and a real data set are considered.
Keywords: Partial Linear Regression Model, Linear Regression Model, Residuals, Deviance, Smoothing Spline.
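As a quick reminder of the quantities compared above, here is a minimal numpy sketch of the sums of squares for the parametric (ordinary least squares) special case on synthetic data; the smoothing-spline estimation of the partially linear model is not reproduced.

```python
# Sums of squares and R^2 for ordinary least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 0.5 * x + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), x])           # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimate
y_hat = X @ beta

sst = np.sum((y - y.mean())**2)                # total sum of squares
ssr = np.sum((y_hat - y.mean())**2)            # regression (model) sum of squares
sse = np.sum((y - y_hat)**2)                   # residual (error) sum of squares
r2 = ssr / sst                                 # coefficient of determination

print(f"SST={sst:.1f}  SSR={ssr:.1f}  SSE={sse:.1f}  SST-SSR-SSE={sst-ssr-sse:.2e}  R^2={r2:.3f}")
```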
2398 Fast Facial Feature Extraction and Matching with Artificial Face Models
Authors: Y. H. Tsai, Y. W. Chen
Abstract:
Facial features are frequently used to represent local properties of a human face image in computer vision applications. In this paper, we present a fast algorithm that can extract the facial features online so that they give a satisfactory representation of a face image. It includes one step for a coarse detection of each facial feature by AdaBoost and another to increase the accuracy of the found points by Active Shape Models (ASM) in the regions of interest. The resulting facial features are evaluated by matching against artificial face models in physiognomy applications. The distance between the features and those in the face models from the database is measured by means of the Hausdorff distance. In the experiments, the proposed method shows efficient performance in facial feature extraction and in an online physiognomy system.
Keywords: Facial feature extraction, AdaBoost, Active shape model, Hausdorff distance
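The matching step relies on the standard (undirected) Hausdorff distance between two point sets; a short sketch follows. The landmark coordinates are made up, and the AdaBoost/ASM extraction stage itself is not reproduced here.

```python
# Undirected Hausdorff distance between two 2-D landmark sets.
import numpy as np

def hausdorff(A: np.ndarray, B: np.ndarray) -> float:
    """A, B: arrays of shape (n, 2) and (m, 2) containing landmark coordinates."""
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=-1)  # pairwise distances
    h_ab = d.min(axis=1).max()   # directed distance A -> B
    h_ba = d.min(axis=0).max()   # directed distance B -> A
    return max(h_ab, h_ba)

features = np.array([[10.0, 12.0], [25.0, 11.5], [17.0, 30.0]])   # extracted landmarks
template = np.array([[11.0, 12.5], [24.0, 12.0], [18.0, 28.5]])   # artificial face model
print(hausdorff(features, template))
```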
2397 Implied Adjusted Volatility by Leland Option Pricing Models: Evidence from Australian Index Options
Authors: Mimi Hafizah Abdullah, Hanani Farhah Harun, Nik Ruzni Nik Idris
Abstract:
Implied volatility is an important factor in financial decision-making, in particular in option pricing, and the pricing biases of Leland option pricing models are related to the implied volatility structure of the options. This study therefore examines the implied adjusted volatility smile patterns and term structures in S&P/ASX 200 index options using different Leland option pricing models. The examination of the implied adjusted volatility smiles and term structures in the Australian index options market covers the global financial crisis that began in mid-2007. The implied adjusted volatility was found to escalate to approximately triple the pre-crisis rate.
Keywords: Implied adjusted volatility, Financial crisis, Leland option pricing models.
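For context, the Leland (1985) adjustment is commonly written as σ_adj² = σ²(1 + Λ) with the Leland number Λ = √(2/π)·k/(σ√δt), where k is the round-trip proportional transaction cost and δt the rehedging interval. The sketch below implements that commonly quoted form with illustrative parameter values; the paper uses several Leland-model variants, so this should be checked against the original sources.

```python
# Commonly quoted Leland volatility adjustment; parameter values are illustrative only.
import math

def leland_adjusted_vol(sigma: float, k: float, dt: float) -> float:
    leland_number = math.sqrt(2.0 / math.pi) * k / (sigma * math.sqrt(dt))
    return sigma * math.sqrt(1.0 + leland_number)

print(leland_adjusted_vol(sigma=0.20, k=0.005, dt=1.0 / 52))  # weekly rehedging
```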
2396 Dental Students’ Attitude towards Problem-Based Learning before and after Implementing 3D Electronic Dental Models
Authors: Hai Ming Wong, Kuen Wai Ma, Lavender Yu Xin Yang, Yanqi Yang
Abstract:
Objectives: In recent years, the Faculty of Dentistry of the University of Hong Kong has extended the implementation of 3D electronic models (e-models) into problem-based learning (PBL) of the Bachelor of Dental Surgery (BDS) curriculum, aiming at mutual enhancement of PBL teaching quality and the students’ skills in using e-models. This study focuses on the effectiveness of e-models serving as a tool to enhance the students’ skills and competences in PBL. Methods: Questionnaire surveys were conducted to measure the attitude change of 50 fourth-year BDS students between the beginning and the end of the blended PBL tutorials. The response rate of the survey was 100%. Results: The results show the students’ agreement that their learning experience was enhanced after e-model implementation and their expectation of more blended PBL courses in the future. The potential of e-models in cultivating students’ self-learning skills reduces their dependence on others, while improving their communication skills when arguing about the pros and cons of different treatment options. The students’ independent thinking ability and problem-solving skills are promoted by e-model implementation, resulting in better decision-making in treatment planning. Conclusion: It is important for future dental education curriculum planning to address the students’ needs and to offer support in the form of software, hardware and facilitators’ assistance for better e-model implementation.
Keywords: Problem-Based learning, curriculum, dental education, 3-D electronic models.
2395 Comparative Study of Experimental and Theoretical Convective, Evaporative for Two Model Distiller
Authors: Khaoula Hidouri, Ali Benhmidene, Bechir Chouachi
Abstract:
The purification of brackish seawater is becoming a necessity rather than a choice in the face of demographic and industrial growth, especially in third world countries. Two models are used in this work: a simple solar still and a simple solar still coupled with a heat pump. In this research, the water productivity of the Simple Solar Distiller (SSD) and the Simple Solar Distiller with Hybrid Heat Pump (SSDHP) was determined by the orientation, the use of the heat pump, and the single or double glass cover. The productivity can exceed 1.2 L/m²h for the SSDHP model and 0.5 L/m²h for the SSD model. The global efficiency obtained for the two models, SSD and SSDHP, is 30% and 50%, respectively. The internal efficiency reached 35% for the SSD model and 60% for the SSDHP model. The convective heat transfer coefficient reached 2.5 W/m²°C and 0.5 W/m²°C for the SSDHP and SSD models, respectively.
Keywords: Productivity, efficiency, convective heat coefficient, SSD model, SSDHP model.
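For orientation only, a Dunkle-type correlation is frequently used in the solar-still literature to estimate the water-to-glass convective heat transfer coefficient; a hedged sketch is given below. The constants are as commonly quoted and should be verified against the original reference; this is not necessarily the correlation used by the authors.

```python
# Dunkle-type convective heat transfer coefficient for a solar still (hedged sketch;
# constants as commonly quoted in the literature, to be checked against the source).
import math

def sat_vapour_pressure(T_c: float) -> float:
    """Approximate saturation vapour pressure (N/m^2) at temperature T_c in deg C."""
    return math.exp(25.317 - 5144.0 / (T_c + 273.15))

def h_convective(T_water: float, T_glass: float) -> float:
    """Convective heat transfer coefficient water -> glass cover (W/m^2.K)."""
    p_w, p_g = sat_vapour_pressure(T_water), sat_vapour_pressure(T_glass)
    dT_eff = (T_water - T_glass) + (p_w - p_g) * (T_water + 273.15) / (268.9e3 - p_w)
    return 0.884 * dT_eff ** (1.0 / 3.0)

print(h_convective(T_water=65.0, T_glass=45.0))
```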
2394 Simple and Advanced Models for Calculating Single-Phase Diode Rectifier Line-Side Harmonics
Authors: Hussein A. Kazem, Abdulhakeem Abdullah Albaloshi, Ali Said Ali Al-Jabri, Khamis Humaid AlSaidi
Abstract:
This paper proposes different methods for estimating the harmonic currents of a single-phase diode bridge rectifier. Both simple and advanced methods are compared, and the models are put into a context of practical use for calculating the harmonic distortion in a typical application. Finally, the different models are compared with measurements from a real application, and convincing results are achieved.
Keywords: Single-phase rectifier, line-side harmonics
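As a baseline for the "simple" end of the model spectrum, the sketch below uses the textbook idealization of a stiff DC-side current, under which the line current approaches a square wave with odd harmonics I_n = I_1/n, and computes the resulting THD. It is only an illustration of the harmonic bookkeeping, not the advanced models of the paper.

```python
# Idealized square-wave line current: odd harmonics I_n = I_1 / n, plus THD.
import math

def square_wave_harmonics(I1: float, n_max: int = 39):
    """Return {harmonic order: rms amplitude} for an idealized square-wave line current."""
    return {n: I1 / n for n in range(1, n_max + 1, 2)}

def thd(harmonics: dict) -> float:
    I1 = harmonics[1]
    distortion = sum(i**2 for n, i in harmonics.items() if n > 1)
    return math.sqrt(distortion) / I1

h = square_wave_harmonics(I1=10.0)
print(f"THD ≈ {100 * thd(h):.1f} %")   # tends to ~48.3 % as n_max grows
```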
2393 The Impact of Semantic Web on E-Commerce
Authors: Karim Heidari
Abstract:
Semantic Web technologies enable machines to interpret data published in a machine-interpretable form on the web. At present, only human beings are able to understand the product information published online. The emerging Semantic Web technologies have the potential to deeply influence the further development of the Internet economy. In this paper we propose a scenario-based research approach to predict the effects of these new technologies on electronic markets and on the business models of traders, intermediaries and customers. Over 300 million searches are conducted every day on the Internet by people trying to find what they need. A majority of these searches are in the domain of consumer e-commerce, where a web user is looking for something to buy. This represents a huge cost in terms of person-hours and an enormous drain of resources. Agent-enabled semantic search will have a dramatic impact on the precision of these searches. It will reduce and possibly eliminate the information asymmetry in which a better-informed buyer gets the best value. By impacting this key determinant of market prices, the Semantic Web will foster the evolution of different business and economic models. We submit that there is a need for developing these futuristic models based on our current understanding of e-commerce models and nascent Semantic Web technologies. We believe these business models will encourage mainstream web developers and businesses to join the “semantic web revolution”.
Keywords: E-Commerce, E-Business, Semantic Web, XML.
2392 Travel Time Model for Cylinder Type Parking System
Authors: Jing Zhang, Jie Chen
Abstract:
In this paper, we analyze an automated parking system in which the storage and retrieval requests are performed by a tower crane. In this parking system, the S/R crane, located at the center of the bottom of the cylindrical parking area, can rotate both clockwise and counterclockwise, and three kinds of movements can be performed simultaneously. We develop mathematical travel time models for the single command cycle under random storage assignment, using the characteristics of this system. Finally, we compare these travel time models with the discrete case and show that they display satisfactory performance.
Keywords: Parking system, travel time model, tower crane.
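Purely as an illustration of the kind of quantity such models estimate, the Monte Carlo sketch below assumes a Chebyshev-type single-command cycle in which rotation, lifting and the radial shuttle motion run simultaneously, so the cycle time is the maximum of the three components under random storage assignment. The geometry and speed parameters are hypothetical, and this is not the authors' closed-form model.

```python
# Hypothetical Monte Carlo estimate of the expected single-command travel time,
# assuming the cycle time is max(rotation, lift, radial shuttle) under random storage.
import numpy as np

rng = np.random.default_rng(42)
n_tiers, n_columns = 8, 30             # vertical levels and angular positions (assumed)
omega = 0.5                            # rad/s rotation speed (assumed)
v_lift = 2.0                           # m/s lifting speed (assumed)
tier_height = 2.2                      # m (assumed)
t_radial = 6.0                         # s, fixed shuttle in/out time (assumed)

def single_command_time(n_samples=100_000):
    tier = rng.integers(1, n_tiers + 1, n_samples)
    col = rng.integers(0, n_columns, n_samples)
    angle = np.minimum(col, n_columns - col) * (2 * np.pi / n_columns)  # shorter way round
    t_rot, t_lift = angle / omega, tier * tier_height / v_lift
    return np.mean(np.maximum.reduce([t_rot, t_lift, np.full(n_samples, t_radial)]))

print(f"expected single-command travel time ≈ {single_command_time():.1f} s")
```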
2391 Object-Oriented Simulation of Simulating Anticipatory Systems
Authors: Eugene Kindler
Abstract:
The present paper addresses problems in the simulation of anticipatory systems, namely those that use simulation models to aid anticipation. A certain analogy between the use of simulation and imagining is applied to make the explanation more comprehensible. The paper is completed by notes on open problems and on some existing applications. The problems stem from the fact that simulating the mentioned anticipatory systems is, in the end, simulation of simulating systems, i.e. computer models handling two or more modeled time axes that should be mapped to the real time flow in a non-descending manner. Languages oriented to objects, processes and blocks can be used to surmount these problems.
Keywords: Anticipatory systems, Nested computer models, Discrete event simulation, Simula.
2390 Comparative Analysis of the Stochastic and Parsimonious Interest Rates Models on Croatian Government Market
Authors: Zdravka Aljinović, Branka Marasović, Blanka Škrabić
Abstract:
The paper provides a discussion of the most relevant aspects of yield curve modeling. Two classes of models are considered: stochastic and parsimonious function-based, through the approaches developed by Vasicek (1977) and Nelson and Siegel (1987). Yield curve estimates for Croatia are presented and their dynamics analyzed; finally, a comparative analysis of the models is conducted.
Keywords: the term structure of interest rates, Vasicek model, Nelson-Siegel model, Croatian Government market.
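For reference, the parsimonious class above is built on the Nelson-Siegel (1987) yield function y(τ) = β₀ + β₁(1-e^{-τ/λ})/(τ/λ) + β₂[(1-e^{-τ/λ})/(τ/λ) - e^{-τ/λ}]; a short sketch follows. The parameter values are illustrative, not the Croatian estimates from the paper.

```python
# Nelson-Siegel parsimonious yield curve; parameter values are illustrative only.
import numpy as np

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Zero-coupon yield y(tau) for maturities tau (in years)."""
    tau = np.asarray(tau, dtype=float)
    x = tau / lam
    loading = (1 - np.exp(-x)) / x
    return beta0 + beta1 * loading + beta2 * (loading - np.exp(-x))

maturities = np.array([0.25, 1, 2, 5, 10, 20])
print(nelson_siegel(maturities, beta0=0.05, beta1=-0.02, beta2=0.01, lam=1.8))
```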
2389 Using Exponential Lévy Models to Study Implied Volatility Patterns for Electricity Options
Authors: Pinho C., Madaleno M.
Abstract:
European options on German electricity futures are examined using Lévy processes for the underlying asset. Implied volatility evolution under each of the considered models is discussed after calibrating the Merton jump diffusion (MJD), variance gamma (VG), normal inverse Gaussian (NIG), Carr-Geman-Madan-Yor (CGMY) and Black-Scholes (B&S) models. Implied volatility is examined for the entire sample period, revealing some curious features about market evolution, and the data-fitting performances of the five models are compared. It is shown that variance gamma processes provide relatively better results and that implied volatility shows significant differences through time, having increased over the period. Volatility changes with changing uncertainty, or else with increasing futures prices, and there is evidence of the need to account for seasonality when modelling both electricity spot/futures prices and volatility.
Keywords: Calibration, Electricity Markets, Implied Volatility, Lévy Models, Options on Futures, Pricing
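A building block common to studies like this is the Black-Scholes pricer and its implied-volatility inversion; a minimal sketch using bisection is shown below. It does not reproduce the Lévy-model calibration performed in the paper, and all input values are made up.

```python
# Black-Scholes call price and implied volatility by bisection on the monotone price-vol map.
import math

def bs_call(S, K, T, r, sigma):
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))   # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) > price:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

print(implied_vol(price=7.5, S=100.0, K=100.0, T=0.5, r=0.02))
```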
2388 Project Management Maturity Models and Organizational Project Management Maturity Model (OPM3®): A Critical Morphological Evaluation
Authors: Farrokh J., Azhar K. Mansur
Abstract:
There exists a strong correlation between efficient project management and competitive advantage for organizations. Therefore, organizations are striving to standardize and assess the rigor of their project management processes and capabilities, i.e. their project management maturity. Researchers and standardization bodies have developed several project management maturity models (PMMMs) to assess the project management maturity of organizations. This study presents a critical evaluation of some of the leading PMMMs against OPM3® in a multitude of ways, to determine which PMMM is the most comprehensive model, i.e. which could assess most aspects of an organization and also help organizations gain competitive advantage over competitors. After a detailed morphological analysis of the models, it is concluded that OPM3® is the most promising maturity model and can really provide a competitive advantage to organizations due to its unique approach to assessment and improvement strategies.
Keywords: Project management maturity, project management maturity models, competitive advantage.
2387 Generalization of SGIP Surface Tension Force Model in Three-Dimensional Flows and Compare to Other Models in Interfacial Flows
Authors: Afshin Ahmadi Nadooshan, Ebrahim Shirani
Abstract:
In this paper, the two-dimensional stagger grid interface pressure (SGIP) model has been generalized and presented in three-dimensional form. For this purpose, various models of the surface tension force for interfacial flows have been investigated and compared with each other. The VOF method has been used for tracking the interface. To show the ability of the SGIP model for three-dimensional flows in comparison with other models, pressure contours, maximum spurious velocities, norm spurious flow velocities and the pressure jump error for a motionless liquid drop and a gas bubble are calculated using the different models. It is pointed out that the SGIP model produces the smallest maximum and norm spurious velocities in comparison with the CSF, CSS and PCIL models. Additionally, the new model produces more accurate results in calculating the pressure jump across the interface, which is generated by the surface tension force, for a motionless liquid drop and a gas bubble.
2386 Software Effort Estimation Using Soft Computing Techniques
Authors: Parvinder S. Sandhu, Porush Bassi, Amanpreet Singh Brar
Abstract:
Various models have been derived by studying a large number of completed software projects from various organizations and applications to explore how project size maps into project effort. However, there is still a need to improve the prediction accuracy of these models. Since a neuro-fuzzy based system is able to approximate non-linear functions with more precision, a neuro-fuzzy system is used here as a soft computing approach to generate a model by formulating the relationship based on its training. In this paper, the neuro-fuzzy technique is used for software effort estimation modeling on NASA software project data, and the performance of the developed models is compared with the Halstead, Walston-Felix, Bailey-Basili and Doty models mentioned in the literature.
Keywords: Effort Estimation, Neural-Fuzzy Model, Halstead Model, Walston-Felix Model, Bailey-Basili Model, Doty Model.
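The four benchmark models named above are simple size-to-effort curves; the sketch below uses coefficient values as commonly quoted in the effort-estimation literature (effort in person-months, size in KLOC). The exact constants vary between sources, so treat them as illustrative and check the originals; the neuro-fuzzy model itself is not reproduced here.

```python
# Classic effort-estimation curves with commonly quoted (illustrative) coefficients.
def halstead(kloc):       return 0.7 * kloc ** 1.50
def walston_felix(kloc):  return 5.2 * kloc ** 0.91
def bailey_basili(kloc):  return 5.5 + 0.73 * kloc ** 1.16
def doty(kloc):           return 5.288 * kloc ** 1.047      # form quoted for KLOC > 9

for size in (10, 50, 100):
    print(size, round(halstead(size), 1), round(walston_felix(size), 1),
          round(bailey_basili(size), 1), round(doty(size), 1))
```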
2385 An Approach for Optimization of Functions and Reducing the Value of the Product by Using Virtual Models
Authors: A. Bocevska, G. Todorov, T. Neshkov
Abstract:
A newly developed approach for Functional Cost Analysis (FCA), based on virtual prototyping (VP) models in a CAD/CAE environment and applicable and necessary in developing new products, is presented. It is an instrument for improving the value of the product while maintaining costs and/or reducing the costs of the product without reducing value. Five broad classes of VP methods are identified. Efficient use of prototypes in FCA is a vital activity that can make the difference between successful and unsuccessful entry of new products into the competitive world market. Successful realization of this approach is illustrated for a specific example using a press joint power tool.
Keywords: CAD/CAE environment, Functional Cost Analysis (FCA), Virtual prototyping (VP) models.
2384 Geopotential Models Evaluation in Algeria Using Stochastic Method, GPS/Leveling and Topographic Data
Authors: M. A. Meslem
Abstract:
For precise geoid determination, a reference field is used to subtract the long and medium wavelengths of the gravity field from the observation data when applying the remove-compute-restore technique. Therefore, a comparison study between the considered models should be made in order to select the optimal reference gravity field to be used. In this context, two recent global geopotential models have been selected to perform this comparison over northern Algeria: the Earth Gravitational Model (EGM2008) and the Global Gravity Model (GECO), the latter conceived as a combination of the former with the anomalous potential derived from a GOCE satellite-only global model. Free-air gravity anomalies in the area under study have been used to compute residual data using both gravity field models, and a Digital Terrain Model (DTM) has been used to subtract the residual terrain effect from the gravity observations. The residual data were then used to generate local empirical covariance functions and to fit them to a closed form in order to compare their statistical behavior in both cases. Finally, height anomalies were computed from both geopotential models and compared to a set of GPS-levelled points on benchmarks using least squares adjustment. The results described in detail in this paper point to a slight advantage of the GECO global model, both through the comparison of error degree variances and through the ground-truth evaluation.
Keywords: Quasigeoid, gravity anomalies, covariance, GGM.
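A common way to carry out the ground-truth step mentioned above is to adjust a bias-and-tilt corrector surface between GPS/levelling height anomalies and the model-derived ones by least squares and then inspect the residuals. The sketch below illustrates only that step; all coordinates and values are synthetic, not the Algerian benchmark data.

```python
# Bias-and-tilt least squares fit between GPS/levelling and model height anomalies.
import numpy as np

rng = np.random.default_rng(7)
n = 40
lat = rng.uniform(34.0, 37.0, n)                  # benchmark latitudes (deg), synthetic
lon = rng.uniform(0.0, 8.0, n)                    # benchmark longitudes (deg), synthetic
zeta_gps = rng.normal(38.0, 0.5, n)               # h - H from GPS/levelling (m), synthetic
zeta_model = zeta_gps - (0.30 + 0.02 * (lat - 35.5) - 0.01 * (lon - 4.0)) \
             + rng.normal(0.0, 0.05, n)           # model values with bias, tilt and noise

A = np.column_stack([np.ones(n), lat - lat.mean(), lon - lon.mean()])
d = zeta_gps - zeta_model                         # misfit to be modelled
params, *_ = np.linalg.lstsq(A, d, rcond=None)    # bias + north/east tilt
residuals = d - A @ params

print("bias/tilt parameters:", np.round(params, 3))
print("RMS before fit: %.3f m, after fit: %.3f m" % (d.std(), residuals.std()))
```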
2383 New Multi-Solid Thermodynamic Model for the Prediction of Wax Formation
Authors: Ehsan Ghanaei, Feridun Esmaeilzadeh, Jamshid Fathi Kaljahi
Abstract:
In the previous multi-solid models, the φ approach is used for the calculation of fugacity in the liquid phase. For the first time, in the proposed multi-solid thermodynamic model, the γ approach has been used for the calculation of fugacity in the liquid mixture. Therefore, several activity coefficient models have been studied, and the results show that the predictive Wilson model is more appropriate than the others. The results demonstrate that the γ approach using the predictive Wilson model is in better agreement with experimental data than the previous multi-solid models. This method also generates a new approach for presenting stability analysis in phase equilibrium calculations. Meanwhile, the run time of the γ approach is less than that of the previous methods using the φ approach. The results of the new model show an average absolute deviation (AAD) of 0.75% from the experimental data, which is clearly less than the error of the previous multi-solid models.
Keywords: Multi-solid thermodynamic model, Predictive Wilson model, Wax formation.
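For reference, the standard Wilson activity-coefficient equation on which the γ approach builds is ln γ_i = 1 - ln(Σ_j x_j Λ_ij) - Σ_k x_k Λ_ki / (Σ_j x_j Λ_kj); a short numpy sketch is given below. The predictive Wilson variant used in the paper derives the Λ_ij parameters from pure-component properties, which is not reproduced here; the parameter values shown are illustrative.

```python
# Standard Wilson activity-coefficient model with given (illustrative) parameters.
import numpy as np

def wilson_gamma(x, L):
    """x: mole fractions (n,), L: Wilson parameter matrix (n, n) with L[i, i] = 1."""
    x, L = np.asarray(x, float), np.asarray(L, float)
    s = L @ x                                   # s[k] = sum_j L[k, j] * x[j]
    ln_gamma = 1.0 - np.log(s) - (L.T @ (x / s))
    return np.exp(ln_gamma)

x = np.array([0.3, 0.7])
L = np.array([[1.00, 0.45],
              [0.85, 1.00]])                    # illustrative binary parameters
print(wilson_gamma(x, L))
```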
2382 A Model for Estimation of Efforts in Development of Software Systems
Authors: Parvinder S. Sandhu, Manisha Prashar, Pourush Bassi, Atul Bisht
Abstract:
Software effort estimation is the process of predicting the most realistic amount of effort required to develop or maintain software based on incomplete, uncertain and/or noisy input. Effort estimates may be used as input to project plans, iteration plans and budgets. Various models, such as the Halstead, Walston-Felix, Bailey-Basili, Doty and GA-based models, have already been used to estimate software effort for projects. In this study, statistical models, Fuzzy-GA and Neuro-Fuzzy (NF) inference systems are evaluated for estimating software project effort. The performance of the developed models was tested on NASA software project datasets, and the results are compared with the Halstead, Walston-Felix, Bailey-Basili, Doty and genetic-algorithm-based models mentioned in the literature. The results show that the NF model has the lowest MMRE and RMSE values and thus performs best compared with the Fuzzy-GA based hybrid inference system and the other existing models used for effort prediction.
Keywords: Neuro-Fuzzy Model, Halstead Model, Walston-Felix Model, Bailey-Basili Model, Doty Model, GA Based Model, Genetic Algorithm.
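The two accuracy measures used for the comparison above are straightforward to compute; a minimal sketch follows, with made-up effort values purely for illustration.

```python
# MMRE and RMSE on actual vs. predicted effort (values are illustrative only).
import numpy as np

def mmre(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.mean(np.abs(actual - predicted) / actual)   # mean magnitude of relative error

def rmse(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return np.sqrt(np.mean((actual - predicted) ** 2))    # root mean squared error

actual    = [24.0, 62.0, 11.0, 170.0]    # person-months
predicted = [30.0, 58.0, 15.0, 150.0]
print(f"MMRE = {mmre(actual, predicted):.3f}, RMSE = {rmse(actual, predicted):.2f}")
```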
2381 Circular Economy Maturity Models: A Systematic Literature Review
Authors: D. Kreutzer, S. Müller-Abdelrazeq, I. Isenhardt
Abstract:
Resource scarcity, energy transition and the planned climate neutrality pose enormous challenges for manufacturing companies. In order to achieve these goals and a holistic sustainable development, the European Union has listed the circular economy as part of the Circular Economy Action Plan. In addition to a reduction in resource consumption, reduced emissions of greenhouse gases and a reduced volume of waste, the principles of the circular economy also offer enormous economic potential for companies, such as the generation of new circular business models. However, many manufacturing companies, especially small and medium-sized enterprises, do not have the necessary capacity to plan their transformation. They need support and strategies on the path to circular transformation because this change affects not only production but also the entire company. Maturity models offer an approach to determine the current status of companies’ transformation processes. In addition, companies can use the models to identify transformation strategies and thus promote the transformation process. While maturity models are established in other areas, e.g., IT or project management, only a few circular economy maturity models can be found in the scientific literature. The aim of this paper is to analyze the identified maturity models of the circular economy through a systematic literature review (SLR) and, besides other aspects, to check their completeness as well as their quality. For this purpose, circular economy maturity models at the company's (micro) level were identified from the literature, compared, and analyzed with regard to their theoretical and methodological structure. A specific focus was placed, on the one hand, on the analysis of the business units considered in the respective models and, on the other hand, on the underlying metrics and indicators in order to determine the individual maturity level of the entire company. The results of the literature review show, for instance, a significant difference in the number and types of indicators as well as their metrics. For example, most models use subjective indicators and very few objective indicators in their surveys. It was also found that there are rarely well-founded thresholds between the levels. Based on the generated results, concrete ideas and proposals for a research agenda in the field of circular economy maturity models are made.
Keywords: Circular economy, maturity model, maturity assessment, systematic literature review.
2380 Evaluation of Environmental, Technical, and Economic Indicators of a Fused Deposition Modeling Process
Authors: M. Yosofi, S. Ezeddini, A. Ollivier, V. Lavaste, C. Mayousse
Abstract:
Additive manufacturing processes have changed significantly in a wide range of industries, and their application has progressed from rapid prototyping to the production of end-use products. However, their environmental impact is still a rather open question. In order to support the growth of this technology in the industrial sector, environmental aspects should be considered, and predictive models may help monitor and reduce the environmental footprint of the processes. This work presents predictive models based on a previously developed methodology for environmental impact evaluation combined with a technical and economic assessment. Here we apply the methodology to the Fused Deposition Modeling process. First, we present the predictive models relative to different types of machines. Then, we present a decision-making tool designed to identify the optimum manufacturing strategy with regard to technical, economic, and environmental criteria.
Keywords: Additive manufacturing, decision-making, environmental impact, predictive models.
2379 Performance of Heterogeneous Autoregressive Models of Realized Volatility: Evidence from U.S. Stock Market
Authors: Petr Seďa
Abstract:
This paper deals with heterogeneous autoregressive models of realized volatility (HAR-RV models) applied to high-frequency data on stock indices in the USA. Its aim is to capture the behavior of three groups of market participants trading on a daily, weekly and monthly basis and to assess their role in predicting daily realized volatility. The benefit of this work lies mainly in the application of heterogeneous autoregressive models of realized volatility to stock indices in the USA, with a special aim of analyzing the impact of the global financial crisis on the forecasting performance of the applied models. We use three data sets: the first from the period before the global financial crisis, 2006-2007; the second from 2008-2009, when the global financial crisis fully hit the U.S. financial market; and the last defined over 2010-2011. The model output indicates that estimated realized volatility in the market is largely determined by daily traders and in some cases excludes the impact of those market participants who trade on a monthly basis.
Keywords: Global financial crisis, heterogeneous autoregressive model, in-sample forecast, realized volatility, U.S. stock market.
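The HAR-RV regression behind the paper, in the spirit of Corsi's model, regresses tomorrow's realized volatility on today's RV and its weekly (5-day) and monthly (22-day) averages by OLS; a minimal sketch on a simulated series follows. The series is a placeholder only, not the U.S. index data used in the study.

```python
# HAR-RV regression: RV_{t+1} on daily, weekly (5-day) and monthly (22-day) components.
import numpy as np

rng = np.random.default_rng(3)
rv = np.abs(rng.normal(1.0, 0.3, 1500))        # placeholder daily realized volatility

def rolling_mean(x, w):
    return np.convolve(x, np.ones(w) / w, mode="valid")

d = rv[21:-1]                                  # daily component RV_t
w = rolling_mean(rv, 5)[17:-1]                 # weekly component, aligned to day t
m = rolling_mean(rv, 22)[:-1]                  # monthly component, aligned to day t
y = rv[22:]                                    # target RV_{t+1}

X = np.column_stack([np.ones_like(d), d, w, m])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS estimate of the HAR coefficients
print("c, beta_d, beta_w, beta_m =", np.round(beta, 3))
```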
2378 An Improved Variable Tolerance RSM with a Proportion Threshold
Authors: Chen Wu, Youquan Xu, Dandan Li, Ronghua Yang, Lijuan Wang
Abstract:
In rough set models, the tolerance relation, similarity relation and limited tolerance relation address different situations in incomplete information systems, in which missing values occur. If two objects share only a few known attributes and have many unknown attributes, these relations cannot distinguish them well. In order to solve this problem, we present two improved limited-tolerance and variable-precision rough set models: one symmetric, the other non-symmetric. Both use a more stringent condition to separate two objects that are equivalent only with small probability into different classes. The two models need further detailed study. In the present paper, we form object classes from a different perspective compared to the first suggested model, and we overcome the disadvantages of non-symmetry in the second suggested model. We discuss the relationships between and among several models and also perform rule generation. The results obtained by applying the second model are more accurate and reasonable.
Keywords: Incomplete information system, rough set, symmetry, variable precision.
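To illustrate the proportion-threshold idea in general terms, the sketch below computes β-lower and β-upper approximations in the style of variable-precision rough sets: a block enters the lower approximation only if the fraction of the block inside the target set reaches the threshold β. The blocks and target set are toy data, and this does not reproduce the paper's specific tolerance relations.

```python
# Generic variable-precision (proportion threshold) lower/upper approximations.
def vprs_approximations(blocks, target, beta=0.8):
    """blocks: list of object sets covering the universe; target: set of objects."""
    lower, upper = set(), set()
    for block in blocks:
        inclusion = len(block & target) / len(block)   # proportion of the block in the target
        if inclusion >= beta:
            lower |= block
        if inclusion > 1.0 - beta:
            upper |= block
    return lower, upper

blocks = [{1, 2, 3, 4, 5}, {6, 7, 8}, {9, 10}]
target = {1, 2, 3, 4, 7, 9, 10}
print(vprs_approximations(blocks, target, beta=0.8))
```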
2377 Bridging the Gap between Different Interfaces for Business Process Modeling
Authors: Katalina Grigorova, Kaloyan Mironov
Abstract:
The paper focuses on the benefits of business process modeling. Although this discipline has been developing for many years, there is still a need to create new opportunities to meet ever-increasing users’ needs. Because one of these needs is related to the conversion of business process models from one standard to another, the authors have developed a converter between the BPMN and EPC standards using workflow patterns as an intermediate tool. Nowadays there are many systems for business process modeling, and the variety of output formats is almost as great as that of the systems themselves. This diversity additionally hampers the conversion of the models. The presented study is aimed at discussing problems due to differences in the output formats of various modeling environments.
Keywords: Business process modeling, business process modeling standards, workflow patterns, converting models.
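The converter architecture described above, i.e. translating through a neutral workflow-pattern layer rather than directly between notations, can be sketched as two mapping tables. The element and pattern names below are standard BPMN/EPC/workflow-pattern terms, but the mapping itself is purely hypothetical and is not the authors' actual conversion rules.

```python
# Hypothetical two-stage mapping: BPMN element -> workflow pattern -> EPC element.
BPMN_TO_PATTERN = {
    "exclusiveGateway": "exclusive_choice",
    "parallelGateway":  "parallel_split",
    "task":             "activity",
}
PATTERN_TO_EPC = {
    "exclusive_choice": "XOR connector",
    "parallel_split":   "AND connector",
    "activity":         "Function",
}

def bpmn_to_epc(elements):
    """Convert a list of BPMN element types into EPC element types via workflow patterns."""
    return [PATTERN_TO_EPC[BPMN_TO_PATTERN[e]] for e in elements]

print(bpmn_to_epc(["task", "exclusiveGateway", "task", "parallelGateway"]))
```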
2376 Automatic Generation of Ontology from Data Source Directed by Meta Models
Authors: Widad Jakjoud, Mohamed Bahaj, Jamal Bakkas
Abstract:
In this paper we present a method for the automatic generation of an ontological model from any data source using Model Driven Architecture (MDA); this generation is dedicated to the cooperation of knowledge engineering and software engineering. Indeed, reverse engineering of a data source generates a software model (a schema of the data) that then undergoes transformations to generate the ontological model. The method uses meta-models to validate the software and ontological models.
Keywords: Meta model, model, ontology, data source.
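A rough sketch of the schema-to-ontology step is shown below using rdflib: every table becomes an owl:Class and every column a datatype property. The table and column names and the namespace URI are hypothetical, and the MDA transformation chain and meta-model validation described in the abstract are not reproduced.

```python
# Naive relational-schema-to-ontology sketch with rdflib (hypothetical schema and namespace).
from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL, XSD

schema = {"Customer": ["name", "email"], "Order": ["date", "total"]}  # reverse-engineered schema

EX = Namespace("http://example.org/onto#")
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

for table, columns in schema.items():
    g.add((EX[table], RDF.type, OWL.Class))                 # each table -> owl:Class
    for col in columns:
        prop = EX[f"{table.lower()}_{col}"]
        g.add((prop, RDF.type, OWL.DatatypeProperty))        # each column -> datatype property
        g.add((prop, RDFS.domain, EX[table]))
        g.add((prop, RDFS.range, XSD.string))                # naive type mapping for the sketch

print(g.serialize(format="turtle"))
```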
2375 Validating Condition-Based Maintenance Algorithms Through Simulation
Authors: Marcel Chevalier, Léo Dupont, Sylvain Marié, Frédérique Roffet, Elena Stolyarova, William Templier, Costin Vasile
Abstract:
Industrial end users are currently facing an increasing need to reduce the risk of unexpected failures and to optimize their maintenance. This calls for both short-term analysis and long-term ageing anticipation. At Schneider Electric, we tackle these two issues using both machine learning and first-principles models. Machine learning models are incrementally trained from normal data to predict expected values and detect statistically significant short-term deviations. Ageing models are constructed by breaking down physical systems into sub-assemblies, then determining the relevant degradation modes and associating each one with the right kinetic law. Validating such anomaly detection and maintenance models is challenging, both because actual incident and ageing data are rare and distorted by human interventions, and because incremental learning depends on human feedback. To overcome these difficulties, we propose to simulate physics, systems and humans, including asset maintenance operations, in order to validate the overall approaches in accelerated time and possibly choose between algorithmic alternatives.
Keywords: Degradation models, ageing, anomaly detection, soft sensor, incremental learning.
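As a generic stand-in for the "incrementally trained from normal data" idea above, the sketch below keeps a running mean and variance (Welford's algorithm) updated online and flags statistically significant deviations with a z-score test. It does not reproduce the machine-learning or ageing models described in the abstract; thresholds and readings are illustrative.

```python
# Online (incremental) anomaly detector: running mean/variance plus a z-score test.
import math

class OnlineAnomalyDetector:
    def __init__(self, z_threshold: float = 4.0):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0
        self.z_threshold = z_threshold

    def update(self, x: float) -> bool:
        """Return True if x looks anomalous, then fold it into the running statistics."""
        anomalous = False
        if self.n > 10:                                   # wait for a minimal history
            std = math.sqrt(self.m2 / (self.n - 1))
            anomalous = std > 0 and abs(x - self.mean) / std > self.z_threshold
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)                # Welford's variance update
        return anomalous

det = OnlineAnomalyDetector()
readings = [20.1, 20.3, 19.8, 20.0, 20.2, 19.9, 20.1, 20.0, 20.3, 19.7, 20.1, 35.0]
print([det.update(r) for r in readings])                  # the last reading is flagged
```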