Search results for: Process models
19724 Reduction of Energy Consumption of Distillation Process by Recovering the Heat from Exit Streams
Authors: Apichit Svang-Ariyaskul, Thanapat Chaireongsirikul, Pawit Tangviroon
Abstract:
Distillation consumes an enormous quantity of energy. This work proposed a process to recover the energy from exit streams during the distillation process of three consecutive columns. There are several novel techniques to recover heat within the distillation system; however, they require a complex control system. This work proposed a simpler technique of exchanging heat between streams without interrupting the internal distillation process, which might otherwise cause a serious control problem. The proposed process is executed by using a heat exchanger network with pinch analysis to maximize the process heat recovery. The test model is the distillation of butane, pentane, hexane, and heptane, which is a common mixture in petroleum refineries. The proposed process reduced the hot and cold utility consumption by 29% and 27%, respectively, which is considered significant. Therefore, recovering heat from the exit streams of a distillation process proves to be effective for energy saving.
Keywords: distillation, heat exchanger, network pinch analysis, chemical engineering
Procedia PDF Downloads 369
19723 Modeling of Surge Corona Using Type94 in Overhead Power Lines
Authors: Zahira Anane, Abdelhafid Bayadi
Abstract:
Corona in HV overhead transmission lines is an important source of attenuation and distortion of overvoltage surges. This phenomenon of distortion, which is superimposed on the distortion caused by the skin effect, is due to the dissipation of energy by the injection of space charges around the conductor; this process takes place as soon as the instantaneous voltage exceeds the corona threshold voltage of the conductors. This paper presents a mathematical model to determine the corona inception voltage, the critical electric field, and the corona radius, in order to predict the capacitive changes at the conductor of a transmission line due to corona. This model has been incorporated into the Alternative Transients Program version of the Electromagnetic Transients Program (ATP/EMTP) as a user-defined component, using the MODELS interface with the NORTON TYPE94 component of this program and a foreign subroutine. To obtain the displacement of the corona charge shell, the dichotomy (bisection) method is used for this computation. The present corona model can be used for computing the distortion and attenuation of transient overvoltage waves propagating along a very high voltage transmission line.
Keywords: high voltage, corona, Type94 NORTON, dichotomy, ATP/EMTP, MODELS, distortion, foreign model
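The abstract does not reproduce the underlying equations. As a rough, hedged illustration of how a corona inception voltage can be estimated for a single cylindrical conductor, the sketch below uses the classical Peek onset-gradient formula; the roughness factor, radius, and height values are placeholders, not values from the paper, and this is not the model implemented by the authors.

```python
import math

def corona_inception_voltage(r_cm, h_cm, m=0.85, delta=1.0):
    """Rough estimate of corona inception voltage (kV, peak) for a single
    conductor above ground, using Peek's empirical onset gradient.
    r_cm: conductor radius [cm]; h_cm: height above ground [cm];
    m: surface roughness factor; delta: relative air density.
    Illustrative only -- not the corona model described in the paper."""
    # Peek's visual corona onset gradient [kV/cm, peak]
    e_onset = 30.0 * m * delta * (1.0 + 0.301 / math.sqrt(delta * r_cm))
    # Inception voltage for a conductor at height h over a ground plane
    return e_onset * r_cm * math.log(2.0 * h_cm / r_cm)

print(corona_inception_voltage(r_cm=1.5, h_cm=1200.0))
```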
Procedia PDF Downloads 625
19722 Evolving Software Assessment and Certification Models Using Ant Colony Optimization Algorithm
Authors: Saad M. Darwish
Abstract:
Recently, software quality issues have come to be seen as an important subject as we see an enormous growth of agencies involved in the software industry. However, these agencies cannot guarantee the quality of their products, thus leaving users in uncertainty. Software certification extends quality assurance in the sense that quality needs to be measured prior to the certification granting process. This research contributes to solving the problem of software assessment by proposing a model for the assessment and certification of software products that uses a fuzzy inference engine to integrate both process-driven and application-driven quality assurance strategies. The key idea of the model at hand is to improve the compactness and the interpretability of the model's fuzzy rules by employing an ant colony optimization algorithm (ACO), which tries to find a good rule description by building compound rules from rules initially expressed as traditional single rules. The model has been tested on a case study, and the results have demonstrated the feasibility and practicability of the model in a real environment.
Keywords: software quality, quality assurance, software certification model, software assessment
Procedia PDF Downloads 523
19721 UniFi: Universal Filter Model for Image Enhancement
Authors: Aleksei Samarin, Artyom Nazarenko, Valentin Malykh
Abstract:
Image enhancement is becoming more and more popular, especially on mobile devices. Nowadays, a common approach is to enhance an image using a convolutional neural network (CNN). Such a network has to be of significant size; otherwise, the likelihood of artifacts occurring grows. The existing large CNNs are computationally expensive, which can be crucial for mobile devices. Another important flaw of such models is that they are poorly interpretable. There is another approach to image enhancement, namely, the use of predefined filters in combination with a prediction of their applicability. We present an approach following this paradigm, which outperforms both existing CNN-based and filter-based approaches in the image enhancement task. It is easily adaptable for mobile devices since it has only 47 thousand parameters. It achieves the best SSIM (0.919) on RANDOM250 (MIT Adobe FiveK) among small models and is three times faster than previous models.
Keywords: universal filter, image enhancement, neural networks, computer vision
Procedia PDF Downloads 101
19720 Analysis of Hard Turning Process of AISI D3-Thermal Aspects
Authors: B. Varaprasad, C. Srinivasa Rao
Abstract:
In the manufacturing sector, hard turning has emerged as a vital machining process for cutting hardened steels. Besides its many advantages, the hard turning operation has to be implemented so as to achieve close tolerances in terms of surface finish, high product quality, reduced machining time, low operating cost, and environmentally friendly characteristics. In the present study, a three-dimensional CAE (Computer Aided Engineering) based simulation of hard turning using the commercial software DEFORM 3D has been compared to experimental results for stresses, temperatures, and tool forces in the machining of AISI D3 steel using mixed ceramic inserts (CC6050). In the present analysis, orthogonal cutting models are proposed, considering several processing parameters such as cutting speed, feed, and depth of cut. An exhaustive friction modeling at the tool-work interfaces is carried out. Work material flow around the cutting edge is carefully modeled with an adaptive re-meshing simulation capability. In the process simulations, feed rate and cutting speed are kept constant (0.075 mm/rev and 155 m/min, respectively), and the analysis is focused on stresses, forces, and temperatures during machining. Close agreement is observed between the CAE simulation and experimental values.
Keywords: hard turning, computer aided engineering, computational machining, finite element method
Procedia PDF Downloads 454
19719 Organisational Change: The Impact on Employees and Organisational Development
Authors: Maureen Royce, Joshi Jariwala, Sally Kah
Abstract:
Change is inevitable, but the change process is progressive. Organisational change is the process in which an organisation changes strategies, operational methods, systems, culture, and structure to effect something different in the organisation. This process can be continuous or developed over a period and driven by internal and external factors. Organisational change is essential if organisations are to survive in dynamic and uncertain environments. However, evidence from research shows that many change initiatives fail, leading to severe consequences for organisations and their resources. The complex models of third sector organisations, i.e., social enterprises, compound the levels of change in these organisations. Interestingly, innovation is associated with change in social enterprises due to the hybridity of product and service development. Furthermore, the creation of social intervention has offered a new process and outcomes to the lifecycle of change. Therefore, different forms of organisational innovation are developed, i.e., total, evolutionary, expansionary, and developmental, which affect the interventions of social enterprises. This raises both theoretical and business concerns on how the competing hybrid nature of social enterprises changes, how change is managed, and the impact on these organisations. These perspectives present critical questions for further investigation. In this study, we investigate the impact of organisational change on employees and organisational development at DaDaFest, a disability arts organisation with a social focus based in Liverpool. The three main objectives are to explore the drivers of change and the implementation process; to examine the impact of organisational change on employees; and to identify barriers to organisational change and development. To address the preceding research objectives, a qualitative research design is adopted using semi-structured interviews. Data are analysed using a six-step thematic analysis framework, which enables the study to develop themes depicting the impact of change on employees and organisational development. This study presents theoretical and practical contributions for academics and practitioners. The knowledge contributions encapsulate the evolution of change and the change cycle in a social enterprise. The practical implications provide critical insights into the change management process and the impact of change on employees and organisational development.
Keywords: organisational change, change management, organisational change system, social enterprise
Procedia PDF Downloads 126
19718 Characteristics of Inclusive Circular Business Models in Social Entrepreneurship
Authors: Svitlana Yermak, Olubukola Aluko
Abstract:
The purpose of this study was to review the literature on social entrepreneurship, including new trends and best practices, and to study existing inclusive business models and their interaction with the principles of the circular economy, for possible implementation in Ukrainian practice in war and post-war times under conditions of scarce resources. Thus, three research questions were identified and substantiated: to determine the characteristics of social entrepreneurship and consider its features in Ukraine and the UK; to highlight the criteria for inclusion in social entrepreneurship and its legal support; and to explore examples of existing inclusive circular business models to illustrate how the two concepts may be combined. A detailed review of the literature selected from the Scopus and Web of Science databases was carried out. The study revealed the signs of social entrepreneurship, the main ones being doing business and making a profit, as well as the social orientation of the business, which is prescribed in the constituent documents of the enterprise immediately upon its creation. The characteristics of social entrepreneurship in the UK and Ukraine are considered. It has been established that in the UK, social entrepreneurship is clearly regulated by the state and there are special legislative norms and support programs, in contrast to Ukraine, where these processes are only partially regulated. The study identified the main criteria for inclusion in inclusive circular business models: economic (sustainability and efficiency, job creation and economic growth, promotion of local development), social (accessibility, equity and fairness, inclusion and participation), and resource criteria, in their interconnection. It is substantiated that the resource criterion is especially important for this type of business model, as it provides for the efficient and sustainable use of resources, as well as the cyclical nature of resource use. It was concluded that the principles of the circular economy not only do not contradict but, on the contrary, complement and expand the inclusive business models on which social entrepreneurship is based.
Keywords: social entrepreneurship, inclusive business models, circular economy, inclusion criteria
Procedia PDF Downloads 101
19717 Comparison of Machine Learning Models for the Prediction of System Marginal Price of Greek Energy Market
Authors: Ioannis P. Panapakidis, Marios N. Moschakis
Abstract:
The Greek Energy Market is structured as a mandatory pool where the producers make their bid offers on a day-ahead basis. The System Operator solves an optimization routine aiming at the minimization of the cost of produced electricity. The solution of the optimization problem leads to the calculation of the System Marginal Price (SMP). Accurate forecasts of the SMP can lead to increased profits and more efficient portfolio management from the producer's perspective. The aim of this study is to provide a comparative analysis of various machine learning models, such as artificial neural networks and neuro-fuzzy models, for the prediction of the SMP of the Greek market. Machine learning algorithms are favored in prediction problems since they can capture and simulate the volatilities of complex time series.
Keywords: deregulated energy market, forecasting, machine learning, system marginal price
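The abstract does not specify the exact network architectures. As a hedged illustration of the kind of model that could be trained for day-ahead SMP forecasting, the sketch below fits a small multilayer perceptron on lagged SMP values with scikit-learn; the file name, lag depth, and layer sizes are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def make_lagged(series, n_lags=24):
    """Build a supervised dataset: predict SMP at hour t from the previous n_lags hours."""
    X = np.array([series[i - n_lags:i] for i in range(n_lags, len(series))])
    y = series[n_lags:]
    return X, y

smp = np.loadtxt("smp_hourly.csv")          # placeholder path: hourly SMP history
X, y = make_lagged(smp, n_lags=24)
split = int(0.8 * len(X))                   # simple chronological train/test split

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=2000))
model.fit(X[:split], y[:split])
print("test R^2:", model.score(X[split:], y[split:]))
```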
Procedia PDF Downloads 215
19716 Dynamic Modeling of Advanced Wastewater Treatment Plants Using BioWin
Authors: Komal Rathore, Aydin Sunol, Gita Iranipour, Luke Mulford
Abstract:
Advanced wastewater treatment plants have complex biological kinetics, time-variant influent flow rates, and long processing times. Due to these factors, the modeling and operational control of advanced wastewater treatment plants become complicated. However, the development of a robust model for advanced wastewater treatment plants has become necessary in order to increase the efficiency of the plants, reduce energy costs, and meet the discharge limits set by the government. A dynamic model was designed using BioWin, a platform software by EnviroSim (Canada), for several wastewater treatment plants in Hillsborough County, Florida. Proper control strategies for various parameters such as mixed liquor suspended solids, recycle activated sludge, and waste activated sludge were developed for the models to match the plant performance. The models were tuned using both the influent and effluent data from the plants and their laboratories. The plant SCADA was used to predict the influent wastewater rates and concentration profiles as a function of time. The kinetic parameters were tuned based on sensitivity analysis and trial-and-error methods. The dynamic models were validated by using experimental data for influent and effluent parameters. Dissolved oxygen measurements were taken to validate the model by coupling them with Computational Fluid Dynamics (CFD) models. The BioWin models were able to accurately mimic the plant performance and predict effluent behavior for extended periods. The models are useful for plant engineers and operators, as they can make decisions beforehand by predicting the plant performance with the use of the BioWin models. One of the important findings from the model was the effect of recycle and wastage ratios on the mixed liquor suspended solids. The model was also useful in determining the significant kinetic parameters for biological wastewater treatment systems.
Keywords: BioWin, kinetic modeling, flowsheet simulation, dynamic modeling
Procedia PDF Downloads 154
19715 Mission Driven Enterprises in Ecosystems as Drivers for Sustainable System Change
Authors: Monique de Ritter, Annemieke Roobeek
Abstract:
This study takes a holistic, multi-layered systems approach to entrepreneurship, innovation, and sustainability. Concretely, we looked at how mission-driven entrepreneurs (level 1) employ new business models and launch innovative products and/or ideas in their enterprises (level 2), which operate in entrepreneurial ecosystems (level 3), and how these in turn may generate higher-level sustainable change (level 4). We employed a qualitative grounded research approach with the aim of contributing to theory. Fourteen in-depth semi-structured interviews were conducted with mission-driven entrepreneurs in the Netherlands, in which their individual drives, business models, and ecosystems were discussed. Interview transcripts were systematically coded and analysed, and the ecosystems were visually mapped. The most important patterns include: 1) entrepreneurs have a clear sustainable mission and regard this mission as the raison d'être of their enterprise; 2) entrepreneurs employ new business models with a focus on collaboration for innovation, and the business model supports or enhances the sustainable mission of the enterprise; 3) entrepreneurs collaborate in ecosystems in which a) they also regard suppliers as partners for innovation and clients as ambassadors for the sustainable mission, b) they would like to improve their relationships with financial institutions, which in the entrepreneurs' perspective often lag behind with their innovative ideas and models, c) they collaborate for knowledge and innovation with several parties, d) personal informal connections are very important, and e) the higher sustainable mission is not a point of competition but of collaboration.
Keywords: sustainability, entrepreneurship, innovation, ecosystem, business models
Procedia PDF Downloads 374
19714 A Fractional Derivative Model to Quantify Non-Darcy Flow in Porous and Fractured Media
Authors: Golden J. Zhang, Dongbao Zhou
Abstract:
Darcy's law is a fundamental theory in fluid dynamics and engineering applications. Although Darcy linearity was found to be valid for slow, viscous flow, non-linear and non-Darcian flow has been well documented at both small and large flow velocities. Various classical models were proposed and used widely to quantify non-Darcian flow, including the well-known Forchheimer, Izbash, and Swartzendruber models. Applications, however, revealed limitations of these models. Here we propose a general model built upon the Caputo fractional derivative to quantify non-Darcian flow for various flow regimes (laminar to turbulent). Real-world applications and model comparisons showed that the new fractional-derivative model, which extends the fractional model proposed recently by Zhou and Yang (2018), can capture non-Darcian flow at relatively small velocities in low-permeability deposits and at relatively high velocities in high-permeability sand. A scale effect was also identified for non-Darcian flow in fractured rocks. Therefore, fractional calculus may provide an efficient tool to improve classical models for quantifying fluid dynamics in aquatic environments.
Keywords: fractional derivative, Darcy's law, non-Darcian flow, fluid dynamics
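The abstract does not state the governing equation. For reference, the Caputo fractional derivative of order $\alpha \in (0,1)$ on which the model is built is defined as

\[
{}^{C}D_t^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \int_0^t \frac{f'(\tau)}{(t-\tau)^{\alpha}}\, d\tau .
\]

A fractional generalization of Darcy's law replaces an integer-order derivative in the classical flux-gradient relation with this operator of order $\alpha$; the exact form used by the authors is not given in the abstract, so only the standard operator definition is reproduced here.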
Procedia PDF Downloads 126
19713 GIS Based Spatial Modeling for Selecting New Hospital Sites Using APH, Entropy-MAUT and CRITIC-MAUT: A Study in Rural West Bengal, India
Authors: Alokananda Ghosh, Shraban Sarkar
Abstract:
The study aims to identify suitable sites for new hospitals with critical obstetric care facilities in Birbhum, one of the vulnerable and underserved districts of Eastern India, considering six main criteria and 14 sub-criteria, using a GIS-based Analytic Hierarchy Process (AHP) and Multi-Attribute Utility Theory (MAUT) approach. The criteria were identified through field surveys and previous literature. After collecting expert decisions, a pairwise comparison matrix was prepared using the Saaty scale to calculate the weights through AHP. In contrast, objective weighting methods, i.e., Entropy and Criteria Importance Through Intercriteria Correlation (CRITIC), were used to perform the MAUT. Finally, suitability maps were prepared by weighted sum analysis. Sensitivity analyses of the AHP were performed to explore the effect of dominant criteria. Results from the AHP reveal that 'maternal death in transit', followed by 'accessibility and connectivity' and 'maternal health care service (MHCS) coverage gap', were the three important criteria with comparatively higher weighted values, whereas 'accessibility and connectivity' and 'maternal death in transit' were observed to have more imprint in Entropy and CRITIC, respectively. When comparing the predicted suitability classes of these three models with the layer of existing hospitals, all except Entropy-MAUT point towards the left-over underserved areas of the existing facilities. Only 43%-67% of existing hospitals were in the moderate to lower suitability classes. Therefore, the results of the predictive models might bring valuable input to future planning.
Keywords: hospital site suitability, analytic hierarchy process, multi-attribute utility theory, entropy, criteria importance through intercriteria correlation, multi-criteria decision analysis
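As a hedged illustration of the AHP step described above (not the authors' code or their actual pairwise judgments), the sketch below derives criterion weights from a Saaty-scale pairwise comparison matrix via the principal eigenvector and checks the consistency ratio.

```python
import numpy as np

# Saaty random-index values for the consistency ratio, indexed by matrix size n
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(pairwise):
    """Return (weights, consistency_ratio) for a reciprocal pairwise comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                              # normalized priority vector
    ci = (eigvals[k].real - n) / (n - 1)         # consistency index
    return w, ci / RI[n]

# Hypothetical 3x3 example (placeholder judgments, not from the study)
example = [[1, 3, 5],
           [1/3, 1, 2],
           [1/5, 1/2, 1]]
weights, cr = ahp_weights(example)
print(weights, "CR =", round(cr, 3))
```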
Procedia PDF Downloads 66
19712 The Impact of Governance Criteria in the Supplier Selection Process of Large German Companies
Authors: Christoph Köster
Abstract:
Supplier selection is one of the key challenges in supply chain management and can be considered a multi-criteria decision-making (MCDM) problem. It has evolved from considering only economic criteria, such as price, quality, and performance, in the 1960s to including environmental and social criteria nowadays. Although it has received considerable attention from scholars and practitioners over the past decades, existing research has not considered governance criteria so far. This is, however, surprising, as ESG (environmental, social, and governance) criteria have gained considerable attention. In order to complement the ESG criteria in the supplier selection process, this study investigates German DAX and MDAX companies and evaluates the impact of governance criteria along their supplier selection process. Moreover, it proposes a set of criteria for the respective process steps. Specifically, eleven criteria for the first process step and five criteria for the second process step are identified. This paper contributes to a better understanding of the supplier selection process by elucidating the relevance of governance criteria and providing a set of empirically developed governance criteria. These results can be applied by practitioners to complement the criteria set in the supplier selection process and thus balance economic, environmental, social, and governance targets.
Keywords: ESG, governance, sustainable supplier selection, sustainability
Procedia PDF Downloads 118
19711 Software Engineering Inspired Cost Estimation for Process Modelling
Authors: Felix Baumann, Aleksandar Milutinovic, Dieter Roller
Abstract:
Up to this point, business process management projects in general, and business process modelling projects in particular, could not rely on a practical and scientifically validated method to estimate cost and effort. Especially the model development phase is not covered by a cost estimation method or model. Further phases of business process modelling, starting with implementation, are covered by initial solutions which are discussed in the literature. This article proposes a method of filling this gap by deriving a cost estimation method from available methods in a similar domain, namely software development or software engineering. Software development is regarded as closely similar to process modelling, as we show. After the proposition of this method, different ideas for further analysis and validation of the method are proposed. We derive this method from COCOMO II and Function Point, which are established methods of effort estimation in the domain of software development. For this, we lay out similarities between the software development process and the process of process modelling, which is a phase of the Business Process Management life-cycle.
Keywords: COCOMO II, business process modeling, cost estimation method, BPM COCOMO
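For orientation, the post-architecture COCOMO II effort equation from which the proposed method is derived has the general form sketched below. The constants are the published COCOMO II.2000 calibration values; the translation of size and drivers to process-modelling measures is the paper's contribution and is not shown here, so treat this as an illustrative sketch rather than the authors' method.

```python
def cocomo2_effort(size_ksloc, effort_multipliers, scale_factors, A=2.94, B=0.91):
    """Nominal COCOMO II post-architecture effort in person-months.
    size_ksloc: size in thousands of source lines (or an analogous model-size measure);
    effort_multipliers: list of EM_i cost-driver ratings (17 in the standard model);
    scale_factors: list of SF_j exponent drivers (5 in the standard model).
    A and B are the COCOMO II.2000 calibration constants."""
    E = B + 0.01 * sum(scale_factors)        # size exponent
    prod_em = 1.0
    for em in effort_multipliers:
        prod_em *= em
    return A * size_ksloc ** E * prod_em

# Hypothetical example: 10 KSLOC-equivalent model with nominal drivers
print(cocomo2_effort(10, effort_multipliers=[1.0] * 17, scale_factors=[3.72] * 5))
```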
Procedia PDF Downloads 440
19710 Using Arellano-Bover/Blundell-Bond Estimator in Dynamic Panel Data Analysis – Case of Finnish Housing Price Dynamics
Authors: Janne Engblom, Elias Oikarinen
Abstract:
A panel dataset is one that follows a given sample of individuals over time and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models, which form a wide range of linear models. A special class of panel data models is dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of the estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Arellano-Bover/Blundell-Bond Generalized Method of Moments (GMM) estimator, which is an extension of the Arellano-Bond model in which past values and different transformations of past values of the potentially problematic independent variable are used as instruments together with other instrumental variables. The Arellano-Bover/Blundell-Bond estimator augments Arellano-Bond by making the additional assumption that first differences of the instrument variables are uncorrelated with the fixed effects. This allows the introduction of more instruments and can dramatically improve efficiency. It builds a system of two equations (the original equation and the transformed one) and is also known as system GMM. In this study, Finnish housing price dynamics were examined empirically by using the Arellano-Bover/Blundell-Bond estimation technique together with ordinary OLS. The aim of the analysis was to provide a comparison between conventional fixed-effects panel data models and dynamic panel data models. The Arellano-Bover/Blundell-Bond estimator is suitable for this analysis for a number of reasons: it is a general estimator designed for situations with 1) a linear functional relationship; 2) one left-hand-side variable that is dynamic, depending on its own past realizations; 3) independent variables that are not strictly exogenous, meaning they are correlated with past and possibly current realizations of the error; 4) fixed individual effects; and 5) heteroskedasticity and autocorrelation within individuals but not across them. Based on data from 14 Finnish cities over 1988-2012, estimates of short-run housing price dynamics varied considerably when different models and instruments were used. In particular, the use of different instrumental variables caused variation in the model estimates and their statistical significance. This was particularly clear when comparing OLS estimates with those of different dynamic panel data models. Estimates provided by dynamic panel data models were more in line with the theory of housing price dynamics.
Keywords: dynamic model, fixed effects, panel data, price dynamics
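The dynamic panel specification underlying this estimator, written here in generic form rather than with the paper's exact housing-price variables, is

\[
y_{it} = \gamma\, y_{i,t-1} + x_{it}'\beta + \mu_i + \varepsilon_{it},
\]

where $\mu_i$ is the fixed individual effect. Because $y_{i,t-1}$ is correlated with $\mu_i$, OLS is biased; difference GMM (Arellano-Bond) instruments the first-differenced equation with lagged levels $y_{i,t-2}, y_{i,t-3}, \dots$, while system GMM (Arellano-Bover/Blundell-Bond) adds the level equation instrumented with lagged first differences $\Delta y_{i,t-1}$ under the assumption that these differences are uncorrelated with $\mu_i$.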
Procedia PDF Downloads 1508
19709 Research on the Evaluation and Delineation of Value Units of New Industrial Parks Based on Implementation-Orientation
Authors: Chengfang Wang, Zichao Wu, Jianying Zhou
Abstract:
At present, much attention is paid to the development of new industrial parks in the era of inventory planning. Generally speaking, there are two types of development models: incremental development models and stock development models. The former relies on key projects to build a value innovation park, and the latter relies on the iterative update of the park to build a value innovation park. Taking the Baiyun Western Digital Park as an example and considering the growth model of value units, the evaluation target is determined. Based on a GIS platform, comprehensive land-use status, regulatory detailed planning, land use planning, the blue-green ecological base, the rail transit system, the road network system, the distribution of industrial parks, public service facilities, and other factors are used to carry out a multi-factor overlay comprehensive evaluation of land within the planning area, construct a value unit evaluation system, and delineate value units based on implementation orientation, combining the two different development models. The research hopes to provide a reference for the planning and construction of new domestic industrial parks.
Keywords: value units, GIS, multi-factor evaluation, implementation orientation
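As a hedged sketch of the multi-factor overlay evaluation (the factor names and weights here are placeholders, not values from the study), a GIS-style weighted sum over normalized raster layers can be expressed as follows.

```python
import numpy as np

def weighted_overlay(layers, weights):
    """Combine normalized factor rasters (each scaled to 0-1, same shape)
    into a single value/suitability surface by a weighted sum."""
    stack = np.stack(layers)                       # shape: (n_factors, rows, cols)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                                # normalize weights to sum to 1
    return np.tensordot(w, stack, axes=1)          # weighted sum over the factor axis

# Placeholder rasters for three illustrative factors on a 100x100 grid
rng = np.random.default_rng(0)
accessibility, ecology, facilities = (rng.random((100, 100)) for _ in range(3))
value_surface = weighted_overlay([accessibility, ecology, facilities], [0.5, 0.3, 0.2])
print(value_surface.shape, value_surface.min(), value_surface.max())
```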
Procedia PDF Downloads 188
19708 Data Modeling and Calibration of In-Line Pultrusion and Laser Ablation Machine Processes
Authors: David F. Nettleton, Christian Wasiak, Jonas Dorissen, David Gillen, Alexandr Tretyak, Elodie Bugnicourt, Alejandro Rosales
Abstract:
In this work, preliminary results are given for the modeling and calibration of two in-line processes, pultrusion and laser ablation, using machine learning techniques. The end product of the processes is the core of a medical guidewire, manufactured to comply with a user specification of diameter and flexibility. An ensemble approach is followed, which requires training several models. Two state-of-the-art machine learning algorithms are benchmarked: Kernel Recursive Least Squares (KRLS) and Support Vector Regression (SVR). The final objective is to build a precise digital model of the pultrusion and laser ablation processes in order to calibrate the resulting diameter and flexibility of the medical guidewire, which is the end product, while taking into account the friction on the forming die. The result is an ensemble of models whose output is within a strict required tolerance and which covers the required range of diameter and flexibility of the guidewire end product. The modeling and automatic calibration of complex in-line industrial processes is a key aspect of the Industry 4.0 movement for cyber-physical systems.
Keywords: calibration, data modeling, industrial processes, machine learning
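A minimal sketch of one ensemble member using SVR follows (KRLS usually requires a custom or third-party implementation and is omitted); the file names and feature/target choices are illustrative assumptions, not the project's actual process variables.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Placeholder process data: columns might be pull speed, die temperature, laser power, ...
X = np.loadtxt("process_settings.csv", delimiter=",")   # hypothetical file
y = np.loadtxt("measured_diameter.csv")                 # hypothetical target: core diameter

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
scores = cross_val_score(svr, X, y, cv=5, scoring="neg_mean_absolute_error")
print("MAE per fold:", -scores)

svr.fit(X, y)   # fit the final model on all data for use in the calibration ensemble
```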
Procedia PDF Downloads 297
19707 Engaging Students in Learning through Visual Demonstration Models in Engineering Education
Authors: Afsha Shaikh, Mohammed Azizur Rahman, Ibrahim Hassan, Mayur Pal
Abstract:
Student engagement in learning is directly affected by the learning resources available to students, such as videos showing applications of a concept or a practical demonstration. Specific to the engineering discipline, there exist numerous challenging concepts that can be simplified when they are connected to real-world scenarios. For this study, the concept of heat exchangers was used, as it is a part of multiple engineering disciplines. To make the learning experience enjoyable and impactful, 3-D printed heat exchanger models were created for students to use while working on in-class activities and assignments. Students were encouraged to use the 3-D printed heat exchanger models to enhance their understanding of the theoretical concepts associated with their applications. To assess the effectiveness of the method, feedback was received from students pursuing undergraduate engineering via an anonymous electronic survey. To make the feedback more realistic, unbiased, and genuine, students spent nearly two to three weeks using the models in their in-class assignments. The impact of these tools on their learning was assessed through their performance in ungraded assignments as well as their interactive discussions with peers. 'Having to apply the theory learned in class whilst discussing with peers on a class assignment creates a relaxed and stress-free learning environment in classrooms': this feedback was received from more than half of the students who took the survey and found the 3-D printed heat exchanger models very easy to use. Amongst the many ways to enhance learning and make students more engaged through interactive models, this study sheds light on the importance of physical tools that help create a lasting mental representation in the minds of students. Moreover, in this technologically enhanced era, the concept of augmented reality was considered in this research. The E-drawings application was recommended to enhance the vision of engineering students so they can see multiple views of detailed 3-D models and cut through their different sides and angles to visualize them properly. E-drawings could be the next tool to implement in classrooms to enhance students' understanding of engineering concepts.
Keywords: student engagement, life-long-learning, visual demonstration, 3-D printed models, engineering education
Procedia PDF Downloads 115
19706 Suitability of Black Box Approaches for the Reliability Assessment of Component-Based Software
Authors: Anjushi Verma, Tirthankar Gayen
Abstract:
Although reliability is an important attribute of quality, especially for mission-critical systems, there does not yet exist any versatile model for the reliability assessment of component-based software. The existing Black Box models are found to make various assumptions which may not always be realistic and may be quite contrary to the actual behaviour of software. They focus on observing the manner in which the system behaves without considering the structure of the system, the components composing the system, their interconnections, dependencies, usage frequencies, etc. As a result, the entropy (uncertainty) in assessments using these models is very high. Though some models are based on the operational profile, it sometimes becomes extremely difficult to obtain the exact operational profile for a given operation. This paper discusses the drawbacks, deficiencies, and limitations of Black Box approaches from the perspective of various authors and finally proposes a conceptual model for the reliability assessment of software.
Keywords: black box, faults, failure, software reliability
Procedia PDF Downloads 443
19705 A Study of Hamilton-Jacobi-Bellman Equation Systems Arising in Differential Game Models of Changing Society
Authors: Weihua Ruan, Kuan-Chou Chen
Abstract:
This paper is concerned with a system of Hamilton-Jacobi-Bellman equations coupled with an autonomous dynamical system. The mathematical system arises in the differential game formulation of political economy models as an infinite-horizon, continuous-time differential game with discounted instantaneous payoff rates and continuously and discretely varying state variables. The existence of a weak solution of the PDE system is proven, and a computational scheme for the approximate solution is developed for a class of such systems. A model of democratization is mathematically analyzed as an illustration of the application.
Keywords: Hamilton-Jacobi-Bellman equations, infinite-horizon differential games, continuous and discrete state variables, political-economy models
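For a single player, the discounted infinite-horizon value function referred to above satisfies an HJB equation of the generic form

\[
\rho V(x) = \max_{u}\Big\{ r(x,u) + \nabla V(x)\cdot f(x,u) \Big\},
\]

where $\rho$ is the discount rate, $r$ the instantaneous payoff rate, and $\dot{x}=f(x,u)$ the state dynamics. In the differential game studied in the paper, one such equation per player is coupled through the shared state, giving the system of HJB equations; the notation here is generic and not taken from the paper.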
Procedia PDF Downloads 377
19704 Machine Learning Techniques to Develop Traffic Accident Frequency Prediction Models
Authors: Rodrigo Aguiar, Adelino Ferreira
Abstract:
Road traffic accidents are the leading cause of unnatural deaths and injuries worldwide, representing a significant road safety problem. In this context, the use of artificial intelligence with advanced machine learning techniques has gained prominence as a promising approach to predicting traffic accidents. This article investigates the application of machine learning algorithms to develop traffic accident frequency prediction models. The models are evaluated based on performance metrics, making it possible to carry out a comparative analysis with traditional prediction approaches. The results suggest that machine learning can provide a powerful tool for accident prediction, which will contribute to making more informed decisions regarding road safety.
Keywords: machine learning, artificial intelligence, frequency of accidents, road safety
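Accident frequency is count data, so a common traditional baseline against which machine learning models are compared is a Poisson regression. The sketch below shows such a baseline with scikit-learn; the file names and road-segment features are illustrative assumptions, not the paper's variables.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Hypothetical road-segment features: traffic volume, speed limit, curvature, lane count
X = np.loadtxt("segments.csv", delimiter=",")
y = np.loadtxt("accident_counts.csv")            # accidents per segment per year

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(StandardScaler(), PoissonRegressor(alpha=1e-3, max_iter=1000))
model.fit(X_tr, y_tr)
print("test D^2 (deviance explained):", model.score(X_te, y_te))
```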
Procedia PDF Downloads 89
19703 Operations Research Applications in Audit Planning and Scheduling
Authors: Abdel-Aziz M. Mohamed
Abstract:
This paper presents a state-of-the-art survey of the operations research models developed for internal audit planning. Two alternative approaches have been followed in the literature for audit planning: (1) identifying the optimal audit frequency; and (2) determining the optimal audit resource allocation. The first approach identifies the elapsed time between two successive audits, which can be presented as the optimal number of audits in a given planning horizon, or the optimal number of transactions after which an audit should be performed. It also includes the optimal audit schedule. The second approach determines the optimal allocation of audit frequency among all auditable units in the firm. In our review, we discuss both the deterministic and probabilistic models developed for audit planning. In addition, game theory models are reviewed to find the optimal auditing strategy based on the interactions between the auditors and the clients.
Keywords: operations research applications, audit frequency, audit-staff scheduling, audit planning
Procedia PDF Downloads 815
19702 Second Order Cone Optimization Approach to Two-stage Network DEA
Authors: K. Asanimoghadam, M. Salahi, A. Jamalian
Abstract:
Data envelopment analysis is an approach to measuring the efficiency of decision-making units with multiple inputs and outputs. Many decision-making units also contain decision-making subunits that are not considered in most data envelopment analysis models. Also, the inputs and outputs of decision-making units are usually considered desirable, while in some real-world problems the nature of some inputs or outputs is undesirable. In this study, we evaluate the efficiency of two-stage decision-making units, where some outputs are undesirable, using two non-radial models, the SBM and ASBM models. We formulate the nonlinear ASBM model as a second-order cone optimization problem. Finally, we compare the two models under both external and internal evaluation approaches for two real-world examples in the presence of undesirable outputs. The results show that, in both external and internal evaluations, the overall efficiency of the ASBM model is greater than or equal to the overall efficiency value of the SBM model, and in internal evaluation, the ASBM model is more flexible than the SBM model.
Keywords: network DEA, conic optimization, undesirable output, SBM
Procedia PDF Downloads 194
19701 Robust Variable Selection Based on Schwarz Information Criterion for Linear Regression Models
Authors: Shokrya Saleh A. Alshqaq, Abdullah Ali H. Ahmadini
Abstract:
The Schwarz information criterion (SIC) is a popular tool for selecting the best variables in regression datasets. However, SIC is defined using an unbounded estimator, namely the least-squares (LS) estimator, which is highly sensitive to outlying observations, especially bad leverage points. A method for robust variable selection based on SIC for linear regression models is thus needed. This study investigates the robustness properties of SIC by deriving its influence function and proposes a robust SIC based on the MM-estimation scale. The aim of this study is to produce a criterion that can effectively select accurate models in the presence of vertical outliers and high leverage points. The advantages of the proposed robust SIC are demonstrated through a simulation study and an analysis of a real dataset.
Keywords: influence function, robust variable selection, robust regression, Schwarz information criterion
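For reference, the classical criterion being robustified can be written, for a candidate model with $p$ parameters fitted to $n$ observations, as

\[
\mathrm{SIC}(p) = n \log \hat{\sigma}^2_{p} + p \log n ,
\]

where $\hat{\sigma}^2_{p}$ is the least-squares residual variance. The robust version described above replaces $\hat{\sigma}_{p}$ with a bounded MM-estimate of scale, so that single outlying observations cannot dominate the model ranking; the exact penalty form used by the authors may differ slightly from this textbook version.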
Procedia PDF Downloads 139
19700 Predicting Loss of Containment in Surface Pipeline using Computational Fluid Dynamics and Supervised Machine Learning Model to Improve Process Safety in Oil and Gas Operations
Authors: Muhammmad Riandhy Anindika Yudhy, Harry Patria, Ramadhani Santoso
Abstract:
Loss of containment is the primary hazard that process safety management is concerned with in the oil and gas industry. Escalation to more serious consequences all begins with loss of containment: oil and gas released by leakage or spillage from the primary containment can result in a pool fire, jet fire, and even an explosion when it meets the various ignition sources present in operations. Therefore, the heart of process safety management is avoiding loss of containment and mitigating its impact through the implementation of safeguards. The most effective safeguard in this case is an early detection system that alerts Operations to take action before a potential loss of containment. The value of a detection system increases when it is applied to a long surface pipeline, which is naturally difficult to monitor at all times and is exposed to multiple causes of loss of containment, from natural corrosion to illegal tapping. Based on prior research and studies, detecting loss of containment accurately in a surface pipeline is difficult. The trade-off between cost-effectiveness and high accuracy has been the main issue when selecting the traditional detection method. The current best-performing method, the Real-Time Transient Model (RTTM), requires the analysis of closely positioned pressure, flow, and temperature (PVT) points in the pipeline to be accurate. Having multiple adjacent PVT sensors along the pipeline is expensive, and hence generally not a viable alternative from an economic standpoint. A conceptual approach that combines mathematical modeling using computational fluid dynamics with a supervised machine learning model has shown promising results for predicting leakage in the pipeline. Mathematical modeling is used to generate simulation data, which are then used to train the leak detection and localization models. Mathematical models and simulation software have also been shown to provide results comparable with experimental data at very high levels of accuracy. While the supervised machine learning model requires a large training dataset for the development of accurate models, mathematical modeling has been shown to be able to generate the required datasets, justifying the application of data analytics for the development of model-based leak detection systems for petroleum pipelines. This paper presents a review of key leak detection strategies for oil and gas pipelines, with a specific focus on crude oil applications, and presents the opportunities for the use of data analytics tools and mathematical modeling for the development of a robust real-time leak detection and localization system for surface pipelines. A case study is also presented.
Keywords: pipeline, leakage, detection, AI
Procedia PDF Downloads 191
19699 MITOS-RCNN: Mitotic Figure Detection in Breast Cancer Histopathology Images Using Region Based Convolutional Neural Networks
Authors: Siddhant Rao
Abstract:
Studies estimate that there will be 266,120 new cases of invasive breast cancer and 40,920 breast cancer induced deaths in the year 2018 alone. Despite the pervasiveness of this affliction, the current process for obtaining an accurate breast cancer prognosis is tedious and time-consuming. It usually requires a trained pathologist to manually examine histopathological images and identify the features that characterize various cancer severity levels. We propose MITOS-RCNN: a region based convolutional neural network (RCNN) geared towards small object detection, to accurately grade one of the three factors that characterize tumor belligerence described by the Nottingham Grading System: the mitotic count. Other computational approaches to mitotic figure counting and detection do not demonstrate ample recall or precision to be clinically viable. Our models outperformed all previous participants in the ICPR 2012 challenge, the AMIDA 2013 challenge, and the MITOS-ATYPIA-14 challenge, along with recently published works. Our model achieved an F-measure score of 0.955, a 6.11% improvement in accuracy over the most accurate of the previously proposed models.
Keywords: breast cancer, mitotic count, machine learning, convolutional neural networks
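The abstract does not give architectural details of MITOS-RCNN. As a hedged illustration of a generic region-based detector set up for a single "mitotic figure" class (not the authors' actual network or training configuration), a torchvision Faster R-CNN can be adapted as follows.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Generic Faster R-CNN backbone; MITOS-RCNN itself adapts the detector for small objects.
# (Older torchvision versions use pretrained=True instead of weights="DEFAULT".)
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")

# Replace the box head: 2 classes = background + mitotic figure
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)

model.eval()
dummy_patch = [torch.rand(3, 512, 512)]          # one RGB histopathology patch (placeholder)
with torch.no_grad():
    detections = model(dummy_patch)              # list of dicts with boxes, labels, scores
print(detections[0]["boxes"].shape)
```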
Procedia PDF Downloads 223
19698 Removal of Cr (VI) from Water through Adsorption Process Using GO/PVA as Nanosorbent
Authors: Syed Hadi Hasan, Devendra Kumar Singh, Viyaj Kumar
Abstract:
Cr (VI) is a known toxic heavy metal and has been considered a priority pollutant in water. The effluents of various industries, including electroplating, anodizing baths, leather tanning, steel making, and chromium-based catalysts, are the major sources of Cr (VI) contamination in the aquatic environment. Cr (VI) shows high mobility in the environment and can easily penetrate the cell membranes of living tissues to exert noxious effects. Cr (VI) contamination of drinking water causes various hazardous health effects, such as cancer, skin and stomach irritation or ulceration, dermatitis, and damage to the liver, kidneys, circulation, and nerve tissue. Herein, an attempt has been made to develop an efficient adsorbent for the removal of Cr (VI) from water. For this purpose, a nanosorbent composed of polyvinyl alcohol functionalized graphene oxide (GO/PVA) was prepared. The GO/PVA thus obtained was characterized through FTIR, XRD, SEM, and Raman spectroscopy. The as-prepared GO/PVA nanosorbent was utilized for the removal of Cr (VI) in batch-mode experiments. The process variables, such as contact time, initial Cr (VI) concentration, pH, and temperature, were optimized. A maximum of 99.8% removal of Cr (VI) was achieved at an initial Cr (VI) concentration of 60 mg/L, pH 2, and a temperature of 35 °C, and equilibrium was achieved within 50 min. The two widely used isotherm models, viz. Langmuir and Freundlich, were analyzed using the linear correlation coefficient (R2), and it was found that the Langmuir model gives the best fit, with a high R2 value, for the data of the present adsorption system, which indicates monolayer adsorption of Cr (VI) on the GO/PVA. Kinetic studies were also conducted using pseudo-first-order and pseudo-second-order models, and it was observed that the chemisorptive pseudo-second-order model better described the kinetics of the current adsorption system, with a high correlation coefficient. Thermodynamic studies were also conducted, and the results showed that the adsorption was spontaneous and endothermic in nature.
Keywords: adsorption, GO/PVA, isotherm, kinetics, nanosorbent, thermodynamics
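As a hedged sketch of the isotherm analysis described above, the Langmuir and Freundlich models can be fitted as shown below. The equilibrium data arrays are placeholders to be replaced with measured values, and nonlinear least-squares fitting is shown here instead of the linearized R2 comparison used in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qm, KL):
    """Langmuir isotherm: qe = qm*KL*Ce / (1 + KL*Ce)  (monolayer adsorption)."""
    return qm * KL * Ce / (1.0 + KL * Ce)

def freundlich(Ce, KF, n):
    """Freundlich isotherm: qe = KF * Ce**(1/n)  (heterogeneous surface)."""
    return KF * Ce ** (1.0 / n)

# Placeholder equilibrium data: Ce [mg/L], qe [mg/g]; replace with measured values.
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 60.0])
qe = np.array([15.0, 30.0, 48.0, 65.0, 78.0, 84.0])

pL, _ = curve_fit(langmuir, Ce, qe, p0=[100.0, 0.1])
pF, _ = curve_fit(freundlich, Ce, qe, p0=[10.0, 2.0])
print("Langmuir  qm, KL:", pL)
print("Freundlich KF, n:", pF)
```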
Procedia PDF Downloads 389
19697 Design and Fabrication of an Electrostatically Actuated Parallel-Plate Mirror by 3D-Printer
Authors: J. Mizuno, S. Takahashi
Abstract:
In this paper, the design and fabrication of an electrostatically actuated parallel-plate mirror based on a 3D-printer are described. The mirror and electrode layers are fabricated separately and assembled thereafter. The alignment is performed by dowel pin-hole pairs fabricated on the respective layers. The electrodes are formed on the surface of the electrode layer by Au ion sputtering using a suitable mask, which is also fabricated by a 3D-printer. For grounding the mirror layer, all of the surface, except the contact area with the electrode paths, is Au ion sputtered. 3D-printers are widely used for creating 3D models or mock-ups. The authors have recently proposed that these models can perform electromechanical functions, such as actuators, by suitably masking them followed by a metallization process. Since the smallest possible fabrication size is on the order of sub-millimeters, these electromechanical devices are named by the authors SMEMS (Sub-Milli Electro-Mechanical Systems) devices. The proposed mirror described in this paper, which consists of parallel-plate electrostatic actuators, is also one type of SMEMS device. In addition, SMEMS fabrication is totally environment-clean compared to MEMS (Micro Electro-Mechanical Systems) fabrication processes because no hazardous chemicals or gases are utilized.
Keywords: MEMS, parallel-plate mirror, SMEMS, 3D-printer
Procedia PDF Downloads 436
19696 Review of Numerical Models for Granular Beds in Solar Rotary Kilns for Thermal Applications
Authors: Edgar Willy Rimarachin Valderrama, Eduardo Rojas Parra
Abstract:
Thermal energy from solar radiation is widely used in power plants, food drying, chemical reactors, heating and cooling systems, water treatment processes, hydrogen production, and other applications. In the case of power plants, one of the technologies available to transform solar energy into thermal energy is the solar rotary kiln, in which a bed of granular matter is heated by concentrated radiation obtained from an arrangement of heliostats. Numerical modeling is a useful approach to study the behavior of granular beds in solar rotary kilns. This technique, once validated with small-scale experiments, can be used to simulate large-scale processes for industrial applications. This study gives a comprehensive classification of the numerical models used to simulate the movement of, and heat transfer in, beds of granular media within solar rotary furnaces. In general, there exist three categories of models: 1) continuum, 2) discrete, and 3) multiphysics modeling. Continuum modeling comprises zero-dimensional, one-dimensional, and fluid-like models. On the other hand, discrete element models compute the movement of each particle of the bed individually. In this kind of modeling, heat transfer acts during contacts, which can occur by solid-solid and solid-gas-solid conduction. Finally, the multiphysics approach uses discrete elements to simulate the grains and a continuum model to simulate the fluid around the particles. This classification makes it possible to compare the advantages and disadvantages of each kind of model in terms of accuracy, computational cost, and implementation.
Keywords: granular beds, numerical models, rotary kilns, solar thermal applications
Procedia PDF Downloads 33
19695 Combining the Dynamic Conditional Correlation and Range-GARCH Models to Improve Covariance Forecasts
Authors: Piotr Fiszeder, Marcin Fałdziński, Peter Molnár
Abstract:
The dynamic conditional correlation (DCC) model of Engle (2002) is one of the most popular multivariate volatility models. However, this model is based solely on closing prices. It has been documented in the literature that the high and low prices of the day can be used for efficient volatility estimation. We therefore suggest a model which incorporates high and low prices into the dynamic conditional correlation framework. An empirical evaluation of this model is conducted on three datasets: currencies, stocks, and commodity exchange-traded funds. The utilisation of realized variances and covariances as proxies for the true variances and covariances allows us to reach a strong conclusion that our model outperforms not only the standard dynamic conditional correlation model but also a competing range-based dynamic conditional correlation model.
Keywords: volatility, DCC model, high and low prices, range-based models, covariance forecasting
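For reference, the standard Engle (2002) DCC recursion that both the proposed and competing models build on is

\[
Q_t = (1 - a - b)\,\bar{Q} + a\, u_{t-1}u_{t-1}^{\top} + b\, Q_{t-1}, \qquad
R_t = \operatorname{diag}(Q_t)^{-1/2}\, Q_t\, \operatorname{diag}(Q_t)^{-1/2},
\]

where $u_t$ are the standardized return innovations, $\bar{Q}$ is their unconditional correlation matrix, and the conditional covariance matrix is $H_t = D_t R_t D_t$ with $D_t$ the diagonal matrix of univariate conditional volatilities. Per the abstract, the proposed extension feeds range-based (high-low) volatility estimates into $D_t$ (and hence into the standardization of $u_t$) instead of close-to-close estimates; the exact specification is not given in the abstract.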
Procedia PDF Downloads 183