Search results for: multi-criteria decision process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17578

15658 The Effects of the Waste Plastic Modification of the Asphalt Mixture on the Permanent Deformation

Authors: Soheil Heydari, Ailar Hajimohammadi, Nasser Khalili

Abstract:

The application of plastic waste for asphalt modification is a sustainable strategy to deal with the enormous plastic waste generated each year and to enhance the properties of asphalt. The modification is practiced either by the dry process or the wet process. In the dry process, plastics are added straight into the asphalt mixture, and in the wet process, they are mixed and digested into bitumen. In this article, the effects of plastic inclusion in the asphalt mixture, through the dry process, on the permanent deformation of the asphalt are investigated. The main waste plastics that are usually used in asphalt modification are taken into account, which are linear low-density polyethylene, low-density polyethylene, high-density polyethylene, and polypropylene. Also, to simulate a plastic waste stream, different grades of each virgin plastic are mixed and used. For instance, four different grades of polypropylene are mixed and used as representative of polypropylene. A precisely designed mixing condition is used to dry-mix the plastics into the mixture such that the polymer melts and is modified by the binder introduced later. In this mixing process, plastics are first added to the hot aggregates and mixed three times at different time intervals; then bitumen is introduced, and the whole mixture is mixed three times at fifteen-minute intervals. Marshall specimens were manufactured, and dynamic creep tests were conducted to evaluate the effects of modification on the permanent deformation of the asphalt mixture. Dynamic creep is a common repeated-loading test conducted at different stress levels and temperatures. Loading cycles are applied to the AC specimen until failure occurs; with the deformation continuously recorded, the cumulative permanent strain is determined and reported as a function of the number of cycles. The results of this study showed that the dry inclusion of waste plastics is very effective in enhancing the mixture's resistance to permanent deformation. However, the mixing process must be precisely engineered so that the plastics melt and a homogeneous mixture is achieved.

Keywords: permanent deformation, waste plastics, low-density polyethene, high-density polyethene, polypropylene, linear low-density polyethene, dry process

Procedia PDF Downloads 72
15657 Multiparametric Optimization of Water Treatment Process for Thermal Power Plants

Authors: Balgaisha Mukanova, Natalya Glazyrina, Sergey Glazyrin

Abstract:

The optimization of the water treatment process for thermal power plants is considered in this article. The problem is multiparametric in nature. To optimize the process, namely, to reduce the amount of wastewater, a new technology was developed to reuse such water. A mathematical model of the wastewater reuse technology was developed, and the optimization parameters were determined. The model consists of a material balance equation, an equation describing the kinetics of ion exchange for the non-equilibrium case, and an equation for the ion exchange isotherm. The material balance equation includes a nonlinear term that depends on the kinetics of ion exchange. The direct problem of calculating the impurity concentration at the outlet of the water treatment plant was solved numerically, approximated by an implicit point-to-point difference scheme. The inverse problem was formulated as the determination of the parameters of the mathematical model of the water treatment plant operating under non-equilibrium conditions, and it was solved. Based on the calculation results, the start time of the filter regeneration process was determined, as well as the duration of the regeneration process and the amount of regeneration and wash water. Multiparametric optimization of the water treatment process for thermal power plants allowed the amount of wastewater to be decreased by 15%.
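To make the discretization concrete, the following is a minimal sketch of an implicit upwind scheme for a one-dimensional fixed-bed ion-exchange column with first-order non-equilibrium kinetics and a linear isotherm; the column length, velocity, rate constant, and isotherm slope are hypothetical placeholders rather than the parameters of the plant studied.

import numpy as np

# Minimal sketch: implicit upwind scheme for a fixed-bed ion-exchange column.
# Material balance:  dC/dt + v*dC/dz = -F*dq/dt
# Kinetics (non-equilibrium): dq/dt = k*(q_eq - q), with linear isotherm q_eq = K*C.
L, nz, v = 1.0, 100, 0.01          # column length [m], grid points, velocity [m/s] (hypothetical)
k, K, F = 1e-3, 5.0, 0.5           # rate constant [1/s], isotherm slope, phase ratio (hypothetical)
dz, dt, nt = L / nz, 1.0, 20000    # space step, time step [s], number of time steps
C = np.zeros(nz); q = np.zeros(nz)
C_in = 1.0                          # inlet impurity concentration (normalized)
outlet = []
for n in range(nt):
    # kinetics step (explicit in q, using the current concentration profile)
    q += dt * k * (K * C - q)
    # implicit upwind sweep: (C_new[i]-C[i])/dt + v*(C_new[i]-C_new[i-1])/dz = -F*k*(K*C[i]-q[i])
    C_new = np.empty_like(C)
    upstream = C_in
    for i in range(nz):
        rhs = C[i] - dt * F * k * (K * C[i] - q[i]) + (v * dt / dz) * upstream
        C_new[i] = rhs / (1.0 + v * dt / dz)
        upstream = C_new[i]
    C = C_new
    outlet.append(C[-1])
# 'outlet' approximates the breakthrough curve at the plant outlet; filter regeneration could be
# triggered once it exceeds a permissible impurity threshold.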

Keywords: direct problem, multiparametric optimization, optimization parameters, water treatment

Procedia PDF Downloads 373
15656 Advanced Digital Manufacturing: Case Study

Authors: Abdelrahman Abdelazim

Abstract:

Most industries are looking for technologies that are easy to use, efficient, and fast. To implement these, factories tend to use advanced systems that can turn complexity into simplicity and rudimentary operations into advanced ones. Cloud manufacturing is a new movement that aims to mirror and integrate cloud computing into manufacturing. Among the various advantages of cloud manufacturing are reduced human involvement and increased reliance on automated machines, which in turn decreases human error and increases efficiency. Reliable, high-performance processes with minimal errors are highly desired by today's manufacturers. At first glance, it seems to be the best alternative; however, the implementation of a cloud system can be very challenging. This work investigates cloud manufacturing in detail and outlines its advantages and disadvantages by converting a local factory in Kuwait to a cloud-ready system. Initially, the flow of the factory's manufacturing process was analyzed, identifying the bottlenecks and illustrating how cloud manufacturing can eliminate them. Following this, an automation process was analyzed and implemented. A comparison between the process before and after the adaptation was carried out, showing the effects on the cost, the output, and the efficiency of the process.

Keywords: cloud manufacturing, automation, Kuwait industrial sector, advanced digital manufacturing

Procedia PDF Downloads 759
15655 Big Data Applications for Transportation Planning

Authors: Antonella Falanga, Armando Cartenì

Abstract:

"Big data" refers to extremely vast and complex sets of data, encompassing extraordinarily large and intricate datasets that require specific tools for meaningful analysis and processing. These datasets can stem from diverse origins like sensors, mobile devices, online transactions, social media platforms, and more. The utilization of big data is pivotal, offering the chance to leverage vast information for substantial advantages across diverse fields, thereby enhancing comprehension, decision-making, efficiency, and fostering innovation in various domains. Big data, distinguished by its remarkable attributes of enormous volume, high velocity, diverse variety, and significant value, represent a transformative force reshaping the industry worldwide. Their pervasive impact continues to unlock new possibilities, driving innovation and advancements in technology, decision-making processes, and societal progress in an increasingly data-centric world. The use of these technologies is becoming more widespread, facilitating and accelerating operations that were once much more complicated. In particular, big data impacts across multiple sectors such as business and commerce, healthcare and science, finance, education, geography, agriculture, media and entertainment and also mobility and logistics. Within the transportation sector, which is the focus of this study, big data applications encompass a wide variety, spanning across optimization in vehicle routing, real-time traffic management and monitoring, logistics efficiency, reduction of travel times and congestion, enhancement of the overall transportation systems, but also mitigation of pollutant emissions contributing to environmental sustainability. Meanwhile, in public administration and the development of smart cities, big data aids in improving public services, urban planning, and decision-making processes, leading to more efficient and sustainable urban environments. Access to vast data reservoirs enables deeper insights, revealing hidden patterns and facilitating more precise and timely decision-making. Additionally, advancements in cloud computing and artificial intelligence (AI) have further amplified the potential of big data, enabling more sophisticated and comprehensive analyses. Certainly, utilizing big data presents various advantages but also entails several challenges regarding data privacy and security, ensuring data quality, managing and storing large volumes of data effectively, integrating data from diverse sources, the need for specialized skills to interpret analysis results, ethical considerations in data use, and evaluating costs against benefits. Addressing these difficulties requires well-structured strategies and policies to balance the benefits of big data with privacy, security, and efficient data management concerns. Building upon these premises, the current research investigates the efficacy and influence of big data by conducting an overview of the primary and recent implementations of big data in transportation systems. Overall, this research allows us to conclude that big data better provide to enhance rational decision-making for mobility choices and is imperative for adeptly planning and allocating investments in transportation infrastructures and services.

Keywords: big data, public transport, sustainable mobility, transport demand, transportation planning

Procedia PDF Downloads 46
15654 Improved Pattern Matching Applied to Surface Mounting Devices Components Localization on Automated Optical Inspection

Authors: Pedro M. A. Vitoriano, Tito. G. Amaral

Abstract:

Automated Optical Inspection (AOI) systems are commonly used in Printed Circuit Board (PCB) manufacturing. The use of this technology has proven highly efficient for process improvement and quality achievement. The correct extraction of the component for subsequent analysis is a critical step of the AOI process. Nowadays, the pattern matching algorithm is commonly used, although this algorithm requires extensive calculations and is time-consuming. This paper presents an improved algorithm for the component localization process, with the capability of implementation in a parallel execution system.
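For context, the following is a minimal sketch of the conventional normalized cross-correlation matching that the improved algorithm aims to speed up; the image sizes and the brute-force scan are illustrative assumptions, and a parallel implementation would distribute candidate positions or image tiles across workers.

import numpy as np

def ncc_match(image, template):
    # Baseline pattern matching by normalized cross-correlation (NCC): scan every
    # candidate position and keep the one with the highest correlation score.
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            score = (wz * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos, best_score

# Hypothetical usage: locate an SMD component template inside a PCB image tile.
rng = np.random.default_rng(0)
board = rng.random((120, 160))
component = board[40:60, 70:100].copy()   # pretend this is the learned component pattern
print(ncc_match(board, component))        # expected to recover position (70, 40)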

Keywords: AOI, automated optical inspection, SMD, surface mounting devices, pattern matching, parallel execution

Procedia PDF Downloads 289
15653 An Evaluation of Drivers in Implementing Sustainable Manufacturing in India: Using DEMATEL Approach

Authors: D. Garg, S. Luthra, A. Haleem

Abstract:

Due to growing concern about environmental and social consequences throughout the world, a need has been felt to incorporate sustainability concepts into conventional manufacturing. This paper is an attempt to identify and evaluate drivers for implementing sustainable manufacturing in the Indian context. Nine possible drivers for the successful implementation of sustainable manufacturing have been identified from an extensive literature review. Further, the Decision Making Trial and Evaluation Laboratory (DEMATEL) approach has been utilized to evaluate and categorize these identified drivers into cause and effect groups. Five drivers (Societal Pressure and Public Concerns; Regulations and Government Policies; Top Management Involvement, Commitment and Support; Effective Strategies and Activities towards Socially Responsible Manufacturing; and Market Trends) have been categorized into the cause group, and four drivers (Holistic View in Manufacturing Systems; Supplier Participation; Building Sustainable Culture in Organization; and Corporate Image and Benefits) have been categorized into the effect group. "Societal Pressure and Public Concerns" has been found to be the most critical driver, and "Corporate Image and Benefits" the least critical, or most easily influenced, driver for implementing sustainable manufacturing in the Indian context. This paper may help practitioners better understand these drivers and their priorities for the effective implementation of sustainable manufacturing.
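For readers unfamiliar with the DEMATEL computation, the following is a minimal sketch on a small hypothetical direct-relation matrix (not the study's expert data); drivers are split into cause and effect groups by the sign of D - R.

import numpy as np

# Minimal DEMATEL sketch on a hypothetical 4x4 direct-relation matrix A
# (A[i, j] = expert-rated influence of driver i on driver j; 0 = none ... 4 = very high).
A = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [0, 1, 0, 3],
              [1, 0, 1, 0]], dtype=float)

# Normalize by the largest row sum, then compute the total-relation matrix T = N (I - N)^-1.
N = A / A.sum(axis=1).max()
T = N @ np.linalg.inv(np.eye(len(A)) - N)

D = T.sum(axis=1)          # total influence dispatched by each driver
R = T.sum(axis=0)          # total influence received by each driver
prominence = D + R         # overall importance of the driver
relation = D - R           # > 0: cause group, < 0: effect group
for i, (p, r) in enumerate(zip(prominence, relation)):
    group = "cause" if r > 0 else "effect"
    print(f"driver {i}: prominence={p:.2f}, relation={r:+.2f} -> {group} group")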

Keywords: drivers, decision making trial and evaluation laboratory (DEMATEL), India, sustainable manufacturing

Procedia PDF Downloads 371
15652 e-Learning Security: A Distributed Incident Response Generator

Authors: Bel G Raggad

Abstract:

An e-Learning setting is a distributed computing environment where information resources can be connected to any public network. Public networks are very insecure, which can compromise the reliability of an e-Learning environment. This study is only concerned with the intrusion detection aspect of e-Learning security and how incident responses are planned. The literature reports great advances in intrusion detection systems (IDS) but neglects to study an important IDS weakness: suspected events are detected, but an intrusion is not determined because it is not defined in the IDS databases. We propose a distributed incident response generator (DIRG) that produces incident responses when the working IDS suspects an event that does not correspond to a known intrusion. Data involved in intrusion detection when ample uncertainty is present is often not suitable for formal statistical models, including Bayesian ones. We instead adopt Dempster-Shafer theory to process intrusion data for the unknown event. The DIRG engine transforms data into a belief structure using incident scenarios deduced by the security administrator. Belief values associated with various incident scenarios are then derived and evaluated to choose the most appropriate scenario, for which an automatic incident response is generated. This article provides a numerical example demonstrating the working of the DIRG system.
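As background on the belief-structure step, here is a minimal sketch of Dempster's rule of combination for two hypothetical incident scenarios; the mass assignments and scenario names are illustrative, not values produced by the DIRG engine.

from itertools import product

def dempster_combine(m1, m2):
    # Dempster's rule of combination: m1, m2 map focal elements (frozensets of
    # scenarios) to mass values summing to 1; conflicting mass is renormalized away.
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb            # mass assigned to contradictory evidence
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence over two incident scenarios: port scan (S1) vs. worm propagation (S2).
S1, S2 = frozenset({"S1"}), frozenset({"S2"})
THETA = frozenset({"S1", "S2"})              # ignorance: either scenario is possible
sensor_a = {S1: 0.6, THETA: 0.4}
sensor_b = {S1: 0.3, S2: 0.5, THETA: 0.2}
print(dempster_combine(sensor_a, sensor_b))  # belief concentrates on the scenario to respond to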

Keywords: decision support system, distributed computing, e-Learning security, incident response, intrusion detection, security risk, stateful inspection

Procedia PDF Downloads 420
15651 The Impact of the Interest Rates on Investments in the Context of Financial Crisis

Authors: Joanna Stawska

Abstract:

The main objective of this article is to examine the impact of interest rates on investments in Poland in the context of the financial crisis. The paper also investigates the dependence of bank loans to enterprises on interbank market rates and studies the impact of the interbank market rate on the level of investments in Poland. Besides, this article focuses on the correlation between the level of corporate loans and the amount of investments in Poland, in order to determine the indirect impact of central bank interest rates, through the transmission mechanism of monetary policy, on the real economy. To achieve this objective, we used econometric and statistical research methods, namely an econometric model and the Pearson correlation coefficient. The analysis suggests that the central bank reference rate is inversely related to the level of investments in Poland and that this dependence is moderate. This is also an important issue because it relates to Poland's preparations for accession to the euro area. The research is important from both theoretical and empirical points of view. The formulated conclusions and recommendations determine the practical significance of the paper, which may be used in the decision-making processes of the country's monetary and economic authorities.

Keywords: central bank, financial crisis, interest rate, investments

Procedia PDF Downloads 418
15650 Silicon Carbide (SiC) Crystallization Obtained as a Side Effect of SF6 Etching Process

Authors: N. K. A. M. Galvão, A. Godoy Jr., A. L. J. Pereira, G. V. Martins, R. S. Pessoa, H. S. Maciel, M. A. Fraga

Abstract:

Silicon carbide (SiC) is a wide band-gap semiconductor material with very attractive properties, such as high breakdown voltage, chemical inertness, and high thermal and electrical stability, which makes it a promising candidate for several applications, including microelectromechanical systems (MEMS) and electronic devices. In MEMS manufacturing, the etching process is an important step. It has been proved that wet etching of SiC is not feasible due to its high bond strength and high chemical inertness. In view of this difficulty, the plasma etching technique has been applied with considerable success. However, most of these studies are limited to determining the etching rate and/or to morphological characterization of SiC, together with analysis of the reactive ions present in the plasma. There is a lack of results in the literature on the chemical and structural properties of SiC after the etching process [4]. In this work, we investigated the etching of sputtered amorphous SiC thin films on Si substrates in a reactive ion etching (RIE) system using sulfur hexafluoride (SF6) gas under different RF powers. The results of the chemical and structural analyses of the etched films revealed that, for all conditions, SiC crystallization occurred, in addition to fluoride contamination. In conclusion, we observed that SiC crystallization is a side effect promoted by the structural, morphological, and chemical changes caused by the RIE SF6 etching process.

Keywords: plasma etching, plasma deposition, silicon carbide, microelectromechanical systems

Procedia PDF Downloads 64
15649 Developing a Moodle Course for Translation Theory and Methodology: The Importance of Theory in Translation Studies and Its Application

Authors: Antonia Tsaknaki

Abstract:

There are many divergent views on how the science of translation should be taught in academic institutions or colleges, whether as an independent study area or as part of Linguistics, Literature, or Foreign Languages departments. A much more debated issue is whether translation theory should be included in syllabuses and study programs, or whether the focus should be solely on practicing the profession, that is, translating texts. This dissertation examines prevailing views on the significance of translation theory in translation studies in order to design an open course on Moodle. Taking into account that a remarkable percentage of translation professionals are self-taught, without any specific studies, the course aims to help translation students and professional translators become familiar with concepts, methods, and problem-solving strategies that are considered necessary during the translation process. It is organized in four modules, in which the learner is guided through a series of topics (register, equivalence, decision-making, level of naturalness, Skopos theory, etc.); after completing these topics, they are given assignments (further reading) and texts to work on in order to practice the skills obtained. The course does not focus on a specific language pair and is therefore suitable for any individual who needs a theoretical background to boost their performance, or for institutions seeking to save classroom time but not at the expense of learners' skills.

Keywords: MOOCs, moodle, online learning, open courses, translation, translation theory

Procedia PDF Downloads 127
15648 IoT Continuous Monitoring Biochemical Oxygen Demand Wastewater Effluent Quality: Machine Learning Algorithms

Authors: Sergio Celaschi, Henrique Canavarro de Alencar, Claaudecir Biazoli

Abstract:

Effluent quality is of the highest priority for compliance with the permit limits of environmental protection agencies and ensures the protection of the local water system. Of the pollutants monitored, biochemical oxygen demand (BOD) poses one of the greatest challenges: delayed BOD5 results from the laboratory take 7 to 8 analysis days, hindering a wastewater treatment plant's (WWTP) ability to react to different situations and meet treatment goals. This work presents a solution to that problem; reducing BOD turnaround time from days to hours is our quest. The solution is based on a system of two BOD bioreactors combined with Digital Twin (DT) and Machine Learning (ML) methodologies via an Internet of Things (IoT) platform to monitor and control a WWTP and support decision making. A DT is a virtual and dynamic replica of a production process. A DT requires the ability to collect and store real-time sensor data related to the operating environment. Furthermore, it integrates and organizes the data on a digital platform and applies analytical models, allowing a deeper understanding of the real process so that anomalies are caught sooner. In our system for continuous monitoring of the BOD removed by the effluent treatment process, the DT algorithm analyzes the data using ML on a parameterized chemical kinetic model. The continuous BOD monitoring system, capable of providing results in a fraction of the time required by BOD5 analysis, is composed of two thermally isolated batch bioreactors. Each bioreactor contains input/output access for the wastewater sample (influent and effluent), hydraulic conduction tubes, pumps and valves for the batch sample and dilution water, an air supply for dissolved oxygen (DO) saturation, a cooler/heater for sample thermal stability, an optical DO sensor based on fluorescence quenching, pH, ORP, temperature, and atmospheric pressure sensors, and a local PLC/CPU with a TCP/IP data transmission interface. The dynamic BOD monitoring range covers 2 mg/L < BOD < 2,000 mg/L. In addition to the BOD monitoring system, there are many other operational WWTP sensors. The CPU data is transmitted to and received from the digital platform, which in turn performs analyses at periodic intervals, aiming to feed the learning process. BOD bulletins and their credibility intervals are made available to web users at 12-hour intervals. The chemical kinetics ML algorithm is composed of a coupled system of four first-order ordinary differential equations for the molar masses of DO, the organic material present in the sample, biomass, and the products (CO₂ and H₂O) of the reaction. This system is solved numerically subject to its initial conditions: saturated DO and zero initial products of the kinetic oxidation process (CO₂ = H₂O = 0). The initial values for organic matter and biomass are estimated by minimizing the mean square deviations. A real case of continuous monitoring of BOD wastewater effluent quality is being conducted by deploying an IoT application on a large wastewater purification system located in São Paulo, Brazil.
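To illustrate the structure of such a kinetic model, the following is a minimal sketch of a coupled first-order system for batch BOD exertion, solved with SciPy; the rate constant, yield, and initial guesses are hypothetical placeholders, and the authors' actual model and parameter-fitting procedure belong to their digital twin.

import numpy as np
from scipy.integrate import solve_ivp

# Minimal sketch of a coupled first-order kinetic model for batch BOD exertion.
# State y = [DO, S, X, P]: dissolved oxygen, organic matter, biomass, oxidation products.
# Rate constant and yield below are hypothetical placeholders, not fitted plant values.
k = 0.25      # first-order oxidation rate [1/h]
Y = 0.4       # fraction of oxidized substrate converted to biomass

def rhs(t, y):
    DO, S, X, P = y
    r = k * S if DO > 0 else 0.0          # oxidation stops if oxygen is exhausted
    return [-(1 - Y) * r,                 # DO consumed by oxidation
            -r,                           # organic matter degraded
            Y * r,                        # biomass produced
            (1 - Y) * r]                  # CO2/H2O products formed

DO0, S0, X0 = 8.0, 5.0, 0.5               # saturated DO and initial guesses for S and X [mg/L]
sol = solve_ivp(rhs, (0.0, 48.0), [DO0, S0, X0, 0.0], dense_output=True)
t = np.linspace(0.0, 48.0, 13)
BOD_t = DO0 - sol.sol(t)[0]               # oxygen consumed so far = exerted BOD
print(np.round(BOD_t, 2))                 # BOD estimate available within hours, not days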

Keywords: effluent treatment, biochemical oxygen demand, continuous monitoring, IoT, machine learning

Procedia PDF Downloads 61
15647 Energy Efficient Retrofitting and Optimization of Dual Mixed Refrigerant Natural Gas Liquefaction Process

Authors: Muhammad Abdul Qyyum, Kinza Qadeer, Moonyong Lee

Abstract:

Globally, liquefied natural gas (LNG) has drawn interest as a green energy source in comparison with other fossil fuels, mainly because of its ease of transport and low carbon dioxide emissions. It is expected that demand for LNG will grow steadily over the next few decades. In addition, because the demand for clean energy is increasing, LNG production facilities are expanding into new natural gas reserves across the globe. However, LNG production is an energy- and cost-intensive process because of the huge power requirements for compression and refrigeration. Therefore, one of the major challenges in the LNG industry is to improve the energy efficiency of existing LNG processes through economic and ecological strategies. Advancements in expansion devices such as the two-phase cryogenic expander (TPE) and the cryogenic hydraulic turbine (HT) were exploited for energy and cost benefits in natural gas liquefaction. Retrofitting the conventional Joule-Thomson (JT) valve with a TPE or HT has the potential to improve the energy efficiency of LNG processes. This research investigated the feasibility of retrofitting a dual mixed refrigerant (DMR) process by replacing the isenthalpic expansion with isentropic expansion for energy-efficient LNG production. To take full advantage of the proposed retrofitting, the proposed DMR schemes were optimized using a Coggins optimization approach, implemented in the Microsoft Visual Studio (MVS) environment and linked to a rigorous HYSYS® model. The results showed that the energy requirement of the proposed isentropic-expansion-based DMR process could be reduced by up to 26.5% in comparison with the conventional isenthalpic DMR process using JT valves. Utilizing the recovered energy to boost the natural gas feed pressure could further improve the energy efficiency of the LNG process by up to 34% compared to the base case. This work will help process engineers overcome challenges relating to the energy efficiency and safety concerns of LNG processes. Furthermore, the proposed retrofitting scheme can also be implemented to improve the energy efficiency of other isenthalpic-expansion-based, energy-intensive cryogenic processes.

Keywords: cryogenic liquid turbine, Coggins optimization, dual mixed refrigerant, energy efficient LNG process, two-phase expander

Procedia PDF Downloads 134
15646 Integrating Process Planning, WMS Dispatching, and WPPW Weighted Due Date Assignment Using a Genetic Algorithm

Authors: Halil Ibrahim Demir, Tarık Cakar, Ibrahim Cil, Muharrem Dugenci, Caner Erden

Abstract:

Conventionally, process planning, scheduling, and due-date assignment functions are performed separately and sequentially. The interdependence of these functions requires integration. Although integrated process planning and scheduling, and scheduling with due date assignment problems are popular research topics, only a few works address the integration of these three functions. This work focuses on the integration of process planning, WMS scheduling, and WPPW due date assignment. Another novelty of this work is the use of a weighted due date assignment. In the literature, due dates are generally assigned without considering the importance of customers. However, in this study, more important customers get closer due dates. Typically, only tardiness is punished, but the JIT philosophy punishes both earliness and tardiness. In this study, all weighted earliness, tardiness, and due date related costs are penalized. As no customer desires distant due dates, such distant due dates should be penalized. In this study, various levels of integration of these three functions are tested and genetic search and random search are compared both with each other and with ordinary solutions. Higher integration levels are superior, while search is always useful. Genetic searches outperformed random searches.
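To make the weighted objective concrete, the following is a minimal sketch of a weighted earliness/tardiness/due-date cost evaluated for a candidate job sequence, the kind of value a genetic or random search would minimize; the single-machine setting, penalty weights, and job data are simplifying assumptions, not the study's integrated shop model.

# Minimal sketch: weighted earliness/tardiness/due-date cost used as a GA fitness value.
# Jobs are (processing_time, customer_weight, assigned_due_date); values are hypothetical.
ALPHA, BETA, GAMMA = 1.0, 2.0, 0.5     # penalties for earliness, tardiness, and due-date length

def total_cost(sequence, jobs):
    # Evaluate one chromosome (a job sequence) on a single machine.
    t, cost = 0.0, 0.0
    for j in sequence:
        p, w, d = jobs[j]
        t += p                              # completion time of job j
        earliness = max(0.0, d - t)
        tardiness = max(0.0, t - d)
        cost += w * (ALPHA * earliness + BETA * tardiness + GAMMA * d)
    return cost                             # lower is better; a GA would minimize this

# Hypothetical instance: important customers (large w) get tighter due dates.
jobs = {0: (4.0, 3.0, 5.0), 1: (2.0, 1.0, 12.0), 2: (3.0, 2.0, 8.0)}
print(total_cost([0, 2, 1], jobs), total_cost([1, 2, 0], jobs))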

Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic algorithm, random search

Procedia PDF Downloads 379
15645 Business Intelligent to a Decision Support Tool for Green Entrepreneurship: Meso and Macro Regions

Authors: Anishur Rahman, Maria Areias, Diogo Simões, Ana Figeuiredo, Filipa Figueiredo, João Nunes

Abstract:

The circular economy (CE) has gained increased awareness among academics, businesses, and decision-makers as it stimulates resource circularity in production and consumption systems. A large body of studies has explored the principles of CE, but scant attention has been paid to analysing how CE is evaluated, agreed upon, and enforced using economic metabolism data and a business intelligence framework. Economic metabolism involves the ongoing exchange of materials and energy within and across socio-economic systems and requires the assessment of vast amounts of data to provide quantitative analysis related to effective resource management. To address this, the present work focuses on regional flows in a pilot region of Portugal. By addressing this gap, this study aims to promote eco-innovation and sustainability in the regions of the Intermunicipal Communities Região de Coimbra, Viseu Dão Lafões, and Beiras e Serra da Estrela, using these data to find precise synergies in terms of material flows and give companies a competitive advantage in the form of valuable waste destinations, access to new resources and new markets, cost reduction, and risk-sharing benefits. In our work, emphasis is placed on applying artificial intelligence (AI) and, more specifically, on implementing state-of-the-art deep learning algorithms, contributing to the construction of a business intelligence approach. With the emergence of new approaches generally highlighted under the sub-headings of AI and machine learning (ML), the methods for statistical analysis of complex and uncertain production systems are facing significant changes. Therefore, various definitions of AI and its differences from traditional statistics are presented; ML is then introduced to identify its place in data science and its differences from topics such as big data analytics, and the production problems for which AI and ML are used are identified. A lifecycle-based approach is then taken to analyse the use of different methods in each phase, to identify the most useful technologies and the unifying attributes of AI in manufacturing. Most macroeconomic metabolism models are mainly directed at the context of large metropolises, neglecting rural territories; within this project, a dynamic decision-support model coupled with artificial intelligence tools and information platforms will therefore be developed, focused on the reality of these transition zones between the rural and the urban. Thus, a real decision-support tool is under development, which will surpass the scientific developments carried out to date and will make it possible to overcome limitations related to the availability and reliability of data.

Keywords: circular economy, artificial intelligence, economic metabolisms, machine learning

Procedia PDF Downloads 55
15644 Design of Multi-Loop Controller for Minimization of Energy Consumption in the Distillation Column

Authors: Vinayambika S. Bhat, S. Shanmuga Priya, I. Thirunavukkarasu, Shreeranga Bhat

Abstract:

An attempt has been made to design a decoupling controller for multi-input multi-output systems with dead time. The decoupler is designed for a 3×3 chemical-process plant transfer function with dead time. A Quantitative Feedback Theory (QFT) based controller has also been designed for a 2×2 distillation column transfer function. The developed control techniques were simulated using MATLAB/Simulink. The stability of the process was also analyzed in the presence of various perturbations. Time-domain specifications such as settling time, overshoot, and oscillations were analyzed to demonstrate the efficiency of the decoupler method. Load disturbance rejection was also tested, along with its performance. The QFT control technique was synthesized based on the stability and performance specifications, in the presence of uncertainty in the time constant of the plant transfer function, through a sequential loop-shaping technique. Further, the energy efficiency of the distillation column was improved by proper tuning of the controller. Distillation consumes about 3% of the world's total energy, so a suitable control technique is very important from an economic point of view. Real-time implementation of the process is in progress in our laboratory.
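As a simple illustration of the decoupling idea, here is a minimal sketch of a steady-state (static) decoupler for a hypothetical 2×2 column gain matrix; the gains are illustrative (Wood-Berry-like magnitudes, not the study's plant), and the dynamic, dead-time-aware decoupler designed in the study is more involved.

import numpy as np

# Minimal sketch: steady-state decoupler for a 2x2 process.
# K is a hypothetical steady-state gain matrix, not the plant identified in the study.
K = np.array([[12.8, -18.9],
              [ 6.6, -19.4]])

# Choose D so that K @ D is diagonal: D = K^-1 @ diag(K).
D = np.linalg.inv(K) @ np.diag(np.diag(K))
print(np.round(K @ D, 6))   # off-diagonal terms vanish -> each loop sees only its own channel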

Keywords: distillation, energy, MIMO process, time delay, robust stability

Procedia PDF Downloads 397
15643 Performance Assessment of Multi-Level Ensemble for Multi-Class Problems

Authors: Rodolfo Lorbieski, Silvia Modesto Nassar

Abstract:

Many supervised machine learning tasks require decision making across numerous different classes. Multi-class classification has several applications, such as face recognition, text recognition, and medical diagnostics. The objective of this article is to analyze an adapted Stacking method for multi-class problems, which combines ensembles within the ensemble itself. For this purpose, a training scheme similar to Stacking was used, but with three levels, where the final decision-maker (level 2) is trained by combining the outputs of pairs of meta-classifiers (level 1) from the tree-based and Bayesian families; these are in turn trained by pairs of base classifiers (level 0) of the same family. This strategy seeks to promote diversity among the ensembles forming the level-2 meta-classifier. Three performance measures were used: (1) accuracy, (2) area under the ROC curve, and (3) time, for three factors: (a) datasets, (b) experiments, and (c) levels. To compare the factors, a three-way ANOVA test was executed for each performance measure, considering 5 datasets by 25 experiments by 3 levels. A triple interaction between factors was observed only for time. Accuracy and area under the ROC curve presented similar results, showing a double interaction between level and experiment, as well as with the dataset factor. It was concluded that level 2 had an average performance above the other levels and that the proposed method is especially efficient for multi-class problems when compared to binary problems.
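For orientation, here is a minimal sketch of ordinary two-level stacking on a multi-class dataset with scikit-learn; the paper's method nests ensembles one level deeper and uses different family pairings, so this is a simplified stand-in rather than a reproduction of the proposed architecture.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Minimal sketch of stacking for a multi-class problem (iris, 3 classes).
level0 = [("dt", DecisionTreeClassifier(random_state=0)),
          ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
          ("nb", GaussianNB())]
stack = StackingClassifier(estimators=level0,
                           final_estimator=LogisticRegression(max_iter=1000),
                           cv=5)                      # out-of-fold predictions feed the meta-learner
X, y = load_iris(return_X_y=True)
print(cross_val_score(stack, X, y, cv=5, scoring="accuracy").mean())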

Keywords: stacking, multi-layers, ensemble, multi-class

Procedia PDF Downloads 257
15642 Study of Machinability for Titanium Alloy Ti-6Al-4V through Chip Formation in Milling Process

Authors: Moaz H. Ali, Ahmed H. Al-Saadi

Abstract:

Most of the materials used in the aero-engine component industry are titanium alloys, advanced materials chosen for their excellent combination of high specific strength, light weight, and general corrosion resistance. At the same time, the chemical reactivity and wear characteristics of aero-engine alloys pose a serious challenge for cutting tool materials during the machining process. Reducing the cutting temperature distribution leads to an increase in tool life and a decrease in wear rate. Hence, chip morphology and segmentation play a predominant role in determining machinability and tool wear during the machining process. The low thermal conductivity and diffusivity of this alloy result in the concentration of high temperatures at the tool-workpiece and tool-chip interfaces. Consequently, chip morphology is very important in the study of the machinability of metals as well as in the study of cutting tool wear; otherwise, the result will be accelerated tool wear, increased manufacturing cost, and lost time.

Keywords: machinability, titanium alloy (ti-6al-4v), chip formation, milling process

Procedia PDF Downloads 429
15641 Statistical Quality Control on Assignable Causes of Variation on Cement Production in Ashaka Cement PLC Gombe State

Authors: Hamisu Idi

Abstract:

The present study focuses on assignable causes of variation in the quality of cement production. Exploratory research was done on a monthly basis, with data obtained from a secondary source, i.e., the records kept by an automated recompilation machine. The machine keeps all records of mill downtime, which the process manager checks for validation and refers any fault to the department responsible for maintenance or measurement, so as to prevent future occurrences. The findings indicated that the products of Ashaka Cement Plc. were of acceptable quality, since all the production processes were found to be in control (within preset specifications), with the exception of natural causes of variation, which are normal in the production process and do not affect the outcome of the product. These are reduced to the barest minimum since they cannot be totally eliminated. It is hoped that the findings of this study will be of great assistance to the management of the Ashaka cement factory, and to the process manager in particular, at various levels in the monitoring and implementation of statistical process control. This study therefore contributes to knowledge in this area, and it is hoped that it will open up further research in this direction.

Keywords: cement, quality, variation, assignable cause, common cause

Procedia PDF Downloads 246
15640 The Proposal for a Framework to Face Opacity and Discrimination ‘Sins’ Caused by Consumer Creditworthiness Machines in the EU

Authors: Diogo José Morgado Rebelo, Francisco António Carneiro Pacheco de Andrade, Paulo Jorge Freitas de Oliveira Novais

Abstract:

Not everything in AI-powered consumer credit scoring turns out to be a wonder. When using AI in Creditworthiness Assessment (CWA), opacity and unfairness 'sins' must be addressed for the task to be deemed responsible. AI software is not always 100% accurate, which can lead to misclassification. Discrimination against some groups can be amplified. A hetero-personalized identity can be imposed on the individual(s) affected. Also, autonomous CWA sometimes lacks transparency when using black-box models. However, for this intended purpose, human analysts 'on the loop' might not be the best remedy consumers are looking for in credit. This study seeks to explore the legality of implementing a Multi-Agent System (MAS) framework in consumer CWA to ensure compliance with the regulation outlined in Article 14(4) of the Proposal for an Artificial Intelligence Act (AIA), dated 21 April 2021 (as per the last corrigendum by the European Parliament on 19 April 2024). Especially with the adoption of Art. 18(8)(9) of EU Directive 2023/2225 of 18 October, which will go into effect on 20 November 2026, there should be more emphasis on the need for hybrid oversight in AI-driven scoring to ensure fairness and transparency. In fact, the range of EU regulations on AI-based consumer credit will soon impact the AI lending industry locally and globally, as shown by the broad territorial scope of the AIA's Art. 2. Consequently, engineering the law of consumers' CWA is imperative. Generally, the proposed MAS framework consists of several layers arranged in a specific sequence, as follows: firstly, the Data Layer gathers legitimate predictor sets from traditional sources; then, the Decision Support System Layer, whose Neural Network model is trained using k-fold Cross Validation, provides recommendations based on the feeder data; the eXplainability (XAI) multi-structure comprises Three-Step-Agents; and, lastly, the Oversight Layer has a 'Bottom Stop' for analysts to intervene in a timely manner. From the analysis, one can see that a vital component of this software is the XAI layer. It acts as a transparent curtain covering the AI's decision-making process, enabling comprehension, reflection, and further feasible oversight. Local Interpretable Model-agnostic Explanations (LIME) might act as a pillar by offering counterfactual insights. SHapley Additive exPlanations (SHAP), another agent in the XAI layer, could address potential discrimination issues by identifying the contribution of each feature to the prediction. Alternatively, for thin-file or no-file consumers, the Suggestion Agent can promote financial inclusion. It uses lawful alternative sources, such as share of wallet, among others, to search for more advantageous solutions to incomplete evaluation appraisals based on genetic programming. Overall, this research aspires to bring the concept of Machine-Centered Anthropocentrism to the table of EU policymaking. It acknowledges that, when put into service, credit analysts no longer exert full control over the data-driven entities programmers have given 'birth' to. With similar explanatory agents under supervision, AI itself can become self-accountable, prioritizing human concerns and values. AI decisions should not be vilified inherently. The issue lies in how they are integrated into decision-making and whether they align with non-discrimination principles and transparency rules.
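For concreteness, here is a minimal sketch of how explanation agents like those in the XAI layer can query SHAP and LIME on a credit-scoring model; the synthetic data, feature names, and random-forest model are illustrative assumptions rather than components of the proposed framework, the shap and lime packages are assumed to be installed, and exact return shapes vary slightly between library versions.

import numpy as np
import shap
from lime.lime_tabular import LimeTabularExplainer
from sklearn.ensemble import RandomForestClassifier

# Hypothetical credit features and labels (synthetic, for illustration only).
rng = np.random.default_rng(42)
features = ["income", "debt_ratio", "credit_history_len", "share_of_wallet"]
X = rng.normal(size=(500, 4))
y = (X[:, 0] - X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# SHAP agent: per-feature contributions to one applicant's predicted score.
explainer = shap.TreeExplainer(model)
print("SHAP contributions:", explainer.shap_values(X[:1]))

# LIME agent: local surrogate explanation for the same applicant.
lime_exp = LimeTabularExplainer(X, feature_names=features,
                                class_names=["reject", "grant"], mode="classification")
print(lime_exp.explain_instance(X[0], model.predict_proba, num_features=4).as_list())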

Keywords: creditworthiness assessment, hybrid oversight, machine-centered anthropocentrism, EU policymaking

Procedia PDF Downloads 21
15639 Parkinson’s Disease Detection Analysis through Machine Learning Approaches

Authors: Muhtasim Shafi Kader, Fizar Ahmed, Annesha Acharjee

Abstract:

Machine learning and data mining are crucial in health care, as well as in medical information processing and disease detection. Machine learning approaches are now being utilized to improve awareness of a variety of critical health issues, including diabetes detection, neuron cell tumor diagnosis, COVID-19 identification, and so on. Parkinson's disease primarily affects senior citizens, including in Bangladesh. Parkinson's disease symptoms are progressive and get worse with time: affected people have trouble walking and communicating as the condition advances. Patients can also experience psychological and social changes, sleep problems, depression, memory loss, and fatigue. Parkinson's disease can occur in both men and women, though the proportion of women affected is around half that of men. In this research, we aim to identify the most accurate ML algorithm for detecting the disease from a suitable dataset using models built from a set of machine learning classifiers. Nine ML classifiers are therefore used in this study: Naive Bayes, Adaptive Boosting, Bagging, Decision Tree, Random Forest, XGB (XGBoost), K-Nearest Neighbor, Support Vector Machine, and Gradient Boosting.
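A minimal sketch of this kind of classifier comparison, using a synthetic stand-in for a Parkinson's voice-measurement dataset; the data, cross-validation setup, and resulting scores are illustrative assumptions, not the study's results.

from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              GradientBoostingClassifier, RandomForestClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a Parkinson's detection dataset (e.g., voice measurements).
X, y = make_classification(n_samples=400, n_features=22, n_informative=10,
                           weights=[0.25, 0.75], random_state=0)

models = {
    "Naive Bayes": GaussianNB(),
    "AdaBoost": AdaBoostClassifier(random_state=0),
    "Bagging": BaggingClassifier(random_state=0),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(random_state=0),
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
    "Gradient Boosting": GradientBoostingClassifier(random_state=0),
}
for name, clf in models.items():
    acc = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5).mean()
    print(f"{name:18s} accuracy = {acc:.3f}")
# An XGBoost classifier (xgboost.XGBClassifier) would be benchmarked the same way if installed.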

Keywords: naive bayes, adaptive boosting, bagging classifier, decision tree classifier, random forest classifier, XGB classifier, k nearest neighbor classifier, support vector classifier, gradient boosting classifier

Procedia PDF Downloads 117
15638 Characterization of Waste Thermocol Modified Bitumen by Spectroscopy, Microscopic Technique, and Dynamic Shear Rheometer

Authors: Supriya Mahida, Sangita, Yogesh U. Shah, Shanta Kumar

Abstract:

The global production of thermocol is increasing day by day due to its vast range of applications in many sectors. Thermocol is non-biodegradable and more toxic than plastic, which leads to a number of problems, such as the difficulty of converting it into value-added products, environmental damage, and landfill problems due to its weight-to-volume ratio. Utilization of waste thermocol for the modification of bitumen binders results in waste thermocol modified bitumen (WTMB), which can be used in road construction and maintenance technology. Incorporating thermocol into bituminous mixes through a dry process is one of the new options, besides the recycling process, that consumes large quantities of waste thermocol. This process contributes to waste management and offers a remedy for thermocol waste disposal. The present challenge is to dispose of thermocol waste in different forms in road infrastructure, either through the dry process or through a wet process to be developed in the future. This paper focuses on the use of thermocol waste mixed with VG 10 bitumen in proportions of 0.5%, 1%, 1.5%, and 2% by weight of bitumen. The physical properties of neat bitumen are evaluated and compared with those of thermocol-modified VG 10 bitumen. Empirical characterization of the bitumen, such as penetration, softening point, and viscosity, has been carried out. The thermocol and the waste thermocol modified bitumen (WTMB) were further analyzed by Fourier Transform Infrared Spectroscopy (FT-IR), field emission scanning electron microscopy (FESEM), and Dynamic Shear Rheometer (DSR).

Keywords: DSR, FESEM, FT-IR, thermocol wastes

Procedia PDF Downloads 152
15637 Understanding the Conflict Between Ecological Environment and Human Activities in the Process of Urbanization

Authors: Yazhou Zhou, Yong Huang, Guoqin Ge

Abstract:

In the process of human social development, the coupled and coordinated development of the ecological environment (E), production (P), and living functions (L) is of great significance for sustainable development. This study uses an improved coupling coordination degree model (CCDM) to reveal the coordination conflict between E and the human settlement environment. The main findings of this study are as follows: (1) during the urbanization of Ya'an city from 2014 to 2018, the degree of coupling (DOC) among E, P, and L was high, but the coupling coordination degree (CCD) of the three was low, and in particular the DOC between E and the other two showed the largest decline; (2) a more objective weight value is obtained, which avoids the analysis error caused by subjectively judged weights.
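For reference, here is a minimal sketch of the standard coupling degree and coupling coordination degree formulas for three subsystems; the study's improved model and its neural-network-derived weights are not reproduced, and the index values and equal weights below are hypothetical.

import numpy as np

def coupling_coordination(u, w):
    # Standard three-subsystem coupling coordination degree.
    # u: normalized development indices of (E, P, L) in [0, 1]; w: subsystem weights summing to 1.
    C = 3.0 * np.prod(u) ** (1.0 / 3.0) / u.sum()   # degree of coupling: high when indices are balanced
    T = float(w @ u)                                # comprehensive development index
    D = np.sqrt(C * T)                              # coupling coordination degree
    return C, D

u = np.array([0.45, 0.70, 0.65])    # hypothetical E, P, L indices
w = np.array([1/3, 1/3, 1/3])       # equal weights (the paper derives weights more objectively)
print(coupling_coordination(u, w))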

Keywords: ecological environment, coupling coordination degree, neural network, sustainable development

Procedia PDF Downloads 57
15636 Torrefaction of Biomass Pellets: Modeling of the Process in a Fixed Bed Reactor

Authors: Ekaterina Artiukhina, Panagiotis Grammelis

Abstract:

Torrefaction of biomass pellets is considered a useful pretreatment technology for converting them into a high-quality solid biofuel that is more suitable for pyrolysis, gasification, combustion, and co-firing applications. In the course of torrefaction, the temperature varies across the pellet, and therefore chemical reactions proceed unevenly within the pellet; nevertheless, a uniform thermal distribution along the pellet is generally assumed. The torrefaction process of a single cylindrical pellet is modeled here, accounting for heat transfer coupled with chemical kinetics, and a drying sub-model is also introduced. The non-stationary process of wood pellet decomposition is described by a system of non-linear partial differential equations for temperature and mass. The model captures the main features of the experimental data well.

Keywords: torrefaction, biomass pellets, model, heat, mass transfer

Procedia PDF Downloads 466
15635 Breakthrough Innovation Thinking Technology of a Conglomerate for Next Generation Plan

Authors: Dongkyu Lee, Doan-Quoc Hoan, Soomi Shin

Abstract:

The purpose of this study is to suggest Value Innovation-type Breakthrough Innovation (BI), a big-thinking process that realizes a creative idea for a company's next-generation innovation master plan. BI, based on the PVI methodology, is believed to contribute to the launching of new businesses, the acquisition of new markets, and the development of an innovative management process.

Keywords: value, innovation, breakthrough innovation, Korean firm

Procedia PDF Downloads 574
15634 Multi-Point Dieless Forming Product Defect Reduction Using Reliability-Based Robust Process Optimization

Authors: Misganaw Abebe Baye, Ji-Woo Park, Beom-Soo Kang

Abstract:

The product quality of multi-point dieless forming (MDF) is known to depend on the process parameters. Moreover, variations in friction and material properties may have a substantially adverse influence on the final product quality. This study proposes how to compensate for MDF product defects by minimizing the sensitivity to noise parameter variations. This is attained with a reliability-based robust optimization (RRO) technique that obtains the optimal process settings of the controllable parameters. Initially, two MDF finite element (FE) simulations of an AA3003-H14 saddle shape showed a substantial amount of dimpling, wrinkling, and shape error. FE analyses are subsequently performed in the commercial software ABAQUS to obtain the correlation between the control process settings and noise variations with regard to the product defects. The best prediction models are chosen from a family of metamodels to replace the computationally expensive FE simulation. A genetic algorithm (GA) is applied to determine the optimal settings of the control parameters, and Monte Carlo analysis (MCA) is executed to determine how noise parameter variation affects the final product quality. Finally, the RRO FE simulation and the experimental result show that adjusting the control parameters in the final forming process leads to a considerably better-quality product.

Keywords: dimpling, multi-point dieless forming, reliability-based robust optimization, shape error, variation, wrinkling

Procedia PDF Downloads 235
15633 Envisioning Process in Medium Enterprises: An Exploratory Study of Cambodian Living Arts

Authors: Alexandre Bédard, Caroline Coulombe, Jonathan Harvey

Abstract:

The envisioning process (EP) in medium enterprises is often treated the same as in very small enterprises. Building on the concept of social construction, this study aims to explore how envisioning is constructed in a medium enterprise in which stakeholders are involved, and how it is influenced. We use a single-case method based on qualitative data collected through 11 interviews representing various members of the organization. Through the discussion of the findings, we were able to confirm the social construction of the EP and to identify three main stakeholders responsible for the construction of the vision, namely political and social powers, organizational actors, and financial providers. Moreover, the EP is influenced by external factors, in this case the history of the organization and the value and importance of art and culture for Cambodians.

Keywords: envisioning process, social constructivism, medium enterprise, legitimacy

Procedia PDF Downloads 96
15632 A Comparative Analysis of Solid Waste Treatment Technologies on Cost and Environmental Basis

Authors: Nesli Aydin

Abstract:

Waste management decision-making in developing countries has moved towards being more pragmatic, transparent, sustainable, and comprehensive. Turkey is required to make its waste-related legislation compatible with European legislation, as it is a candidate country for the European Union. Improper practices in Turkey, such as open burning and open dumping, must be abandoned urgently, and robust waste management systems have to be structured. The determination of an optimum waste management system in any region requires a comprehensive analysis in which many criteria are taken into account by stakeholders. In conducting this sort of analysis, there are two main criteria evaluated by waste management analysts: economic viability and environmental friendliness. From an analytical point of view, a central characteristic of sustainable development is economic-ecological integration. It is anticipated that building a robust waste management system will require significant effort and cooperation between stakeholders in developing countries such as Turkey. In this regard, this study aims to provide data regarding the cost and environmental burdens of waste treatment technologies such as an incinerator, an autoclave (at different capacities), a hydroclave, and a microwave, coupled with updated information on calculation methods, and a framework for comparing the performance of any proposed scenario on a cost and environmental basis.

Keywords: decision making, economic viability, environmental friendliness, waste management systems

Procedia PDF Downloads 297
15631 Effect of the Initial Billet Shape Parameters on the Final Product in a Backward Extrusion Process for Pressure Vessels

Authors: Archana Thangavelu, Han-Ik Park, Young-Chul Park, Joon-Hong Park

Abstract:

In this numerical study, we propose a method for evaluating the backward extrusion process of a pressure vessel made of steel. Demand for lighter and stiffer products has been increasing in recent years, especially in automotive engineering. Through detailed finite element analysis, effective stress, strain, and velocity profiles have been obtained within the optimal range. The process design of a forward and backward extruded axisymmetric part has been studied. Forging is mainly carried out because forged products are highly reliable and possess superior mechanical properties compared to conventional products. Computational simulations of 3D hot forging with various billet dimensions are performed, and weight optimization is carried out using the Taguchi orthogonal array (OA) optimization technique. The technique used in this study can be applied to newly developed materials to investigate their forgeability for more complicated shapes in the closed hot-die forging process.
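A minimal sketch of the Taguchi signal-to-noise analysis referred to above, using a hypothetical L9 experiment on three billet shape parameters; the factor levels and responses are illustrative, not the study's simulation results.

import numpy as np

# Minimal Taguchi sketch: L9(3^3) orthogonal array over three billet shape factors (A, B, C),
# each at 3 levels, with a hypothetical response (e.g., final part weight) per run.
L9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
               [1, 0, 1], [1, 1, 2], [1, 2, 0],
               [2, 0, 2], [2, 1, 0], [2, 2, 1]])
weight = np.array([9.8, 9.1, 8.7, 9.4, 8.5, 9.9, 8.9, 10.2, 9.0])  # hypothetical responses

# Smaller-the-better signal-to-noise ratio for each run.
sn = -10.0 * np.log10(weight ** 2)

# Mean S/N per factor level; the level with the highest mean S/N is preferred.
for f, name in enumerate("ABC"):
    means = [sn[L9[:, f] == lvl].mean() for lvl in range(3)]
    print(name, np.round(means, 3), "-> best level:", int(np.argmax(means)))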

Keywords: backward extrusion, hot forging, optimization, finite element analysis, Taguchi method

Procedia PDF Downloads 295
15630 Adversarial Attacks and Defenses on Deep Neural Networks

Authors: Jonathan Sohn

Abstract:

Deep neural networks (DNNs) have shown state-of-the-art performance for many applications, including computer vision, natural language processing, and speech recognition. Recently, adversarial attacks have been studied in the context of deep neural networks; these attacks aim to alter the results of deep neural networks by modifying the inputs slightly. For example, an adversarial attack on a DNN used for object detection can cause the DNN to miss certain objects. As a result, the reliability of DNNs is undermined by their lack of robustness against adversarial attacks, raising concerns about their use in safety-critical applications such as autonomous driving. In this paper, we focus on studying adversarial attacks and defenses on DNNs for image classification. Two types of adversarial attacks are studied: the fast gradient sign method (FGSM) attack and the projected gradient descent (PGD) attack. A DNN forms decision boundaries that separate the input images into different categories. An adversarial attack slightly alters the image to move it over a decision boundary, causing the DNN to misclassify the image. The FGSM attack obtains the gradient with respect to the image and updates the image once based on the gradient to cross the decision boundary. The PGD attack, instead of taking one big step, repeatedly modifies the input image with multiple small steps. There is also another type of attack called the targeted attack, which is designed to make the machine classify an image into a class chosen by the attacker. We can defend against adversarial attacks by incorporating adversarial examples in training. Specifically, instead of training the neural network with clean examples, we can explicitly let the neural network learn from adversarial examples. In our experiments, the digit recognition accuracy on the MNIST dataset drops from 97.81% to 39.50% and 34.01% when the DNN is attacked by FGSM and PGD attacks, respectively. If we utilize FGSM training as a defense method, the classification accuracy greatly improves from 39.50% to 92.31% for FGSM attacks and from 34.01% to 75.63% for PGD attacks. To further improve the classification accuracy under adversarial attacks, we can also use the stronger PGD training method. PGD training improves the accuracy by 2.7% under FGSM attacks and 18.4% under PGD attacks over FGSM training. It is worth mentioning that both FGSM and PGD training do not affect the accuracy on clean images. In summary, we find that PGD attacks can greatly degrade the performance of DNNs, and PGD training is a very effective way to defend against such attacks. PGD attacks and defenses are overall significantly more effective than FGSM methods.
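A minimal sketch of the FGSM and PGD attacks in PyTorch, together with the adversarial-training idea described above; the placeholder model, random batch, and epsilon value are illustrative assumptions, not the exact MNIST setup used in the experiments.

import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon=0.1):
    # One-step FGSM: perturb x in the direction of the sign of the loss gradient.
    x_adv = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    x_adv = x_adv + epsilon * x_adv.grad.sign()     # single step toward the decision boundary
    return x_adv.clamp(0.0, 1.0).detach()           # keep pixels in the valid range

def pgd_attack(model, x, y, epsilon=0.1, alpha=0.02, steps=10):
    # PGD: repeat small FGSM-like steps, projecting back into the epsilon ball around x.
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv = fgsm_attack(model, x_adv, y, alpha)
        x_adv = torch.max(torch.min(x_adv, x + epsilon), x - epsilon).clamp(0.0, 1.0)
    return x_adv

# Hypothetical usage on a toy MNIST-shaped batch and an untrained placeholder model.
model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))
x = torch.rand(8, 1, 28, 28)
y = torch.randint(0, 10, (8,))
print((fgsm_attack(model, x, y) - x).abs().max())   # perturbation bounded by epsilon

# Adversarial (FGSM) training sketch: train on perturbed batches instead of clean ones.
# for x, y in loader:
#     x_adv = fgsm_attack(model, x, y, epsilon=0.1)
#     optimizer.zero_grad(); F.cross_entropy(model(x_adv), y).backward(); optimizer.step()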

Keywords: deep neural network, adversarial attack, adversarial defense, adversarial machine learning

Procedia PDF Downloads 177
15629 Knowledge Diffusion via Automated Organizational Cartography (Autocart)

Authors: Mounir Kehal

Abstract:

The post-globalization epoch has placed businesses everywhere in new and different competitive situations, where knowledgeable, effective, and efficient behavior has come to provide the competitive and comparative edge. Enterprises have turned to explicit - and even to conceptualizing tacit - knowledge management to elaborate a systematic approach to developing and sustaining the intellectual capital needed to succeed. To be able to do that, one has to be able to visualize the organization as consisting of nothing but knowledge and knowledge flows, presented in a graphical and visual framework referred to as automated organizational cartography. This creates the ability to actively classify existing organizational content evolving from and within data feeds, in an algorithmic manner, potentially giving insightful schemes and dynamics by which organizational know-how is visualized. The most recent and applicable definitions and classifications of knowledge management are discussed and elaborated, representing a wide range of views from the mechanistic (systematic, data-driven) to the more socially (psychologically, cognitive/metadata-driven) oriented. More elaborate continuum models, for knowledge acquisition and reasoning purposes, are used to effectively represent the domain of information that an end user may draw on in their decision-making process when utilizing available organizational intellectual resources (i.e., Autocart). In this paper, we present an empirical research study conducted previously to explore knowledge diffusion in a specialist knowledge domain.

Keywords: knowledge management, knowledge maps, knowledge diffusion, organizational cartography

Procedia PDF Downloads 291