Search results for: Markov decision process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17824

16894 The Current Level of Shared Decision-Making in Head-And-Neck Oncology: An Exploratory Study – Preliminary Results

Authors: Anne N. Heirman, Song Duimel, Rob van Son, Lisette van der Molen, Richard Dirven, Gyorgi B. Halmos, Julia van Weert, Michiel W.M. van den Brekel

Abstract:

Objectives: Treatments for head-and-neck cancer are drastic and often significantly impact patients' quality of life and appearance. Shared decision-making (SDM) entails a collaboration between patient and doctor in which the most suitable treatment is chosen by integrating patient preferences, values, and medical information. SDM offers many advantages when difficult treatment choices must be made. The objective of this study was to determine the current level of SDM among patients and head-and-neck surgeons. Methods: Consultations of patients with a non-cutaneous head-and-neck malignancy facing a treatment decision were selected and included. If informed consent was given, the consultation was audio-recorded, and the patient and surgeon filled in a questionnaire immediately afterwards. The SDM level of the consultation was scored objectively by independent observers, who judged the audio recordings using the OPTION5 scale, ranging from 0% (no SDM) to 100% (optimal SDM), as well as subjectively by patients (using the SDM-Q-9 and the Control Preference Scale) and clinicians (using the SDM-Q-Doc and a modified Control Preference Scale). Preliminary results: Each of five head-and-neck surgeons had at least seven recorded conversations with different patients. One of them was trained in SDM; the other four had no experience with it. Most patients were male (74%), and oropharyngeal carcinoma was the most common diagnosis (41%), followed by oral cancer (33%). Five patients received palliative treatment, of whom two were not treated according to guidelines. All recordings are currently being scored by the two independent observers, and analysis of the results will follow shortly. Conclusion: The current study will determine to what extent there is a discrepancy between the objective and subjective level of shared decision-making during doctor-patient consultations in head-and-neck surgery.

Keywords: head-and-neck oncology, patient involvement, physician-patient relations, shared decision making

Procedia PDF Downloads 89
16893 Online Learning for Modern Business Models: Theoretical Considerations and Algorithms

Authors: Marian Sorin Ionescu, Olivia Negoita, Cosmin Dobrin

Abstract:

This scientific communication reports and discusses learning models adaptable to modern business problems and to models specific to digital concepts and paradigms. In the PAC (probably approximately correct) learning model, the learning process begins by receiving a batch of learning examples; this set is used to acquire a hypothesis, and once learning is complete, the hypothesis is used to predict new operational examples. For complex business models, many candidate models should be introduced and evaluated to estimate the induced results, so that the totality of the results can be used to develop a predictive rule that anticipates the choice of new models. In contrast, in online learning there is no separation between the learning (training) phase and the prediction phase. Each time a business model is approached, it is treated as a test example, from its arrival until the prediction of a model considered correct from the point of view of the business decision. After a part of the business model is chosen, the label with the logical value "true" becomes known, and some of the business models are then used as learning (training) examples, which helps to improve the prediction mechanisms for future business models.
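The online-learning setting described above, in which each example is first predicted and only afterwards used for training, can be sketched with a simple perceptron. This is a minimal illustration with hypothetical data, not the authors' algorithm: the hypothesis is updated immediately after every labelled example, with no separate batch-training phase.

```python
# Minimal sketch of online learning (hypothetical data, not the paper's model):
# a perceptron predicts each incoming example first, then updates its
# hypothesis from the revealed label, mirroring the no-train/test-split setting.

def perceptron_online(stream, lr=1.0):
    """Process (features, label) pairs one at a time; label is +1 or -1."""
    w = [0.0, 0.0]
    b = 0.0
    mistakes = 0
    for x, y in stream:
        pred = 1 if (w[0] * x[0] + w[1] * x[1] + b) > 0 else -1
        if pred != y:                      # learn only from mistakes
            mistakes += 1
            w[0] += lr * y * x[0]
            w[1] += lr * y * x[1]
            b += lr * y
    return w, b, mistakes

# Linearly separable toy stream: label is +1 roughly when x0 + x1 > 1.
stream = [((0.0, 0.0), -1), ((2.0, 2.0), 1), ((0.2, 0.3), -1),
          ((1.5, 1.0), 1), ((0.1, 0.4), -1), ((2.5, 0.5), 1)] * 5
w, b, mistakes = perceptron_online(stream)
```

On a separable stream the number of prediction mistakes stays bounded while the hypothesis converges, which is the online analogue of the PAC batch guarantee.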

Keywords: machine learning, business models, convex analysis, online learning

Procedia PDF Downloads 136
16892 Mixed Model Sequencing in Painting Production Line

Authors: Unchalee Inkampa, Tuanjai Somboonwiwat

Abstract:

The painting of automobiles and automobile parts is a continuous process based on electrodeposition paint (EDP). Through EDP, all work pieces are continuously sent to the painting process, which can be divided into two groups based on running time: Painting Room 1 and Painting Room 2. This leads to continuous operation, and the problem that arises is workloads waiting to enter a painting room; the sequencing of work pieces from EDP to the painting rooms is the major problem. Therefore, this paper aims to develop a production sequencing method for the painting process. It applies fixed-rate launching for the painting rooms, the earliest-due-date (EDD) rule for the EDP process, and pairwise swap interchanges to minimize machine waiting time. The results show that the developed method reduced waiting time, improved on-time delivery, met customer requirements, and improved the productivity of the painting unit.
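The EDD rule mentioned above can be sketched in a few lines. The job data below is hypothetical, not from the case study: jobs are sorted by due date, which on a single stage minimizes the maximum lateness.

```python
# A hedged sketch of the earliest-due-date (EDD) rule for ordering work
# pieces leaving the EDP stage (job names, times, and due dates are made up).

def edd_sequence(jobs):
    """jobs: list of (name, processing_time, due_date)."""
    ordered = sorted(jobs, key=lambda j: j[2])   # earliest due date first
    t = 0
    max_lateness = 0
    for name, p, d in ordered:
        t += p                                   # completion time of this job
        max_lateness = max(max_lateness, t - d)  # lateness = completion - due
    return [j[0] for j in ordered], max_lateness

jobs = [("body-A", 4, 10), ("part-B", 2, 5), ("part-C", 3, 12)]
order, lmax = edd_sequence(jobs)
```

A pairwise-swap improvement step, as used in the paper, would then exchange adjacent jobs whenever a swap reduces the waiting-time objective.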

Keywords: sequencing, mixed model lines, painting process, electrodeposition paint

Procedia PDF Downloads 416
16891 Trace Logo: A Notation for Representing Control-Flow of Operational Process

Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa

Abstract:

The process mining research discipline bridges the gap between data mining and business process modeling and analysis; it offers process-centric, end-to-end methods and techniques for analyzing the information on real-world processes recorded in operational event logs. In this paper, we propose a notation called the trace logo for graphically representing the control-flow perspective (the order of execution of activities) of a process. A trace logo consists of a stack of activity names at each position: the size of an activity name indicates its frequency in the traces, and the total height of the stack depicts the information content of the position. A trace logo is created from a set of aligned traces generated using the multiple trace alignment technique.
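The per-position statistics behind such a logo can be sketched as follows. This is an assumption-laden illustration with made-up aligned traces: following the sequence-logo convention, the stack height at a position is taken as the information content log2(|A|) minus the Shannon entropy of the activity distribution at that position, where |A| is the activity alphabet size.

```python
# Minimal sketch of trace-logo statistics (hypothetical aligned traces).
# Each position gets activity frequencies (glyph sizes) and an information
# content (stack height), by analogy with biological sequence logos.

import math
from collections import Counter

def trace_logo_stats(aligned_traces):
    alphabet = {a for t in aligned_traces for a in t}
    n_pos = len(aligned_traces[0])
    stats = []
    for pos in range(n_pos):
        counts = Counter(t[pos] for t in aligned_traces)
        total = sum(counts.values())
        freqs = {a: c / total for a, c in counts.items()}
        entropy = -sum(f * math.log2(f) for f in freqs.values())
        info = math.log2(len(alphabet)) - entropy   # conserved -> tall stack
        stats.append((freqs, info))
    return stats

# Four aligned traces over alphabet {a, b, c, d}; position 0 is fully conserved.
traces = ["abc", "abd", "acd", "abd"]
stats = trace_logo_stats(traces)
```

A fully conserved position reaches the maximum height log2(|A|), while a position where all activities are equally likely has height 0.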

Keywords: consensus trace, process mining, multiple trace alignment, trace logo

Procedia PDF Downloads 343
16890 Network Coding with Buffer Scheme in Multicast for Broadband Wireless Network

Authors: Gunasekaran Raja, Ramkumar Jayaraman, Rajakumar Arul, Kottilingam Kottursamy

Abstract:

Broadband Wireless Network (BWN) is a promising technology nowadays due to the increased number of smartphones. The proposed buffering scheme uses network coding and considers reliability and a proper degree distribution in a Worldwide Interoperability for Microwave Access (WiMAX) multi-hop network. Network coding provides a secure mode of transmission that helps to improve throughput and reduce packet loss in the multicast network. At the outset, improved network coding is proposed for a multicast wireless mesh network. To address the problem of performance overhead, the degree distribution guides decisions on buffering in the encoding/decoding process. Consequently, BuS (Buffer Scheme), based on network coding, is proposed for the multi-hop network. Here the encoding process introduces a buffer for temporary storage so that packets are transmitted with a proper degree distribution. The simulation results report the number of packets received in encoding/decoding with a proper degree distribution under the buffering scheme.
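The core buffering-plus-coding idea can be illustrated with the simplest form of network coding. This sketch is not the BuS protocol itself, only the underlying XOR trick: a relay buffers two packets for a multicast group, transmits their XOR once, and each receiver that already holds one native packet recovers the other.

```python
# A simplified XOR network-coding sketch (illustrative only, not BuS):
# one coded transmission replaces two native transmissions for receivers
# that each already buffer one of the packets.

def xor_encode(p1, p2):
    """Combine two equal-length packets into one coded packet."""
    return bytes(a ^ b for a, b in zip(p1, p2))

def xor_decode(coded, known):
    """Recover the missing packet given the coded packet and a known one."""
    return bytes(a ^ b for a, b in zip(coded, known))

relay_buffer = [b"HELLO", b"WORLD"]     # relay's temporary storage
coded = xor_encode(*relay_buffer)       # one transmission instead of two
recovered = xor_decode(coded, b"HELLO")  # receiver already had b"HELLO"
```

In fountain-code-style schemes, the "degree distribution" mentioned above governs how many buffered packets are XOR-ed into each coded packet.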

Keywords: encoding and decoding, buffer, network coding, degree distribution, broadband wireless networks, multicast

Procedia PDF Downloads 398
16889 Feature Extraction and Impact Analysis for Solid Mechanics Using Supervised Finite Element Analysis

Authors: Edward Schwalb, Matthias Dehmer, Michael Schlenkrich, Farzaneh Taslimi, Ketron Mitchell-Wynne, Horen Kuecuekyan

Abstract:

We present a generalized feature extraction approach for supporting machine learning (ML) algorithms which perform tasks similar to finite-element analysis (FEA). We report results for estimating the Head Injury Criterion (HIC) of vehicle engine compartments across various impact scenarios. Our experiments demonstrate that models learned using features derived with a simple discretization approach provide a reasonable approximation of a full simulation. We observe that decision trees can be as effective as neural networks for the HIC task. The simplicity and performance of the learned decision trees could offer speed and cost improvements of multiple orders of magnitude over full simulation, at the price of a reasonable approximation. When used as a complement to full simulation, the approach enables rapid approximate feedback to engineering teams before submission for full analysis. The approach produces mesh-independent features and is, furthermore, agnostic of the assembly structure.

Keywords: mechanical design validation, FEA, supervised decision tree, convolutional neural network

Procedia PDF Downloads 136
16888 Axiomatic Design of Laser Beam Machining Process

Authors: Nikhil Deshpande, Rahul Mahajan

Abstract:

Laser beam machining (LBM) is a non-traditional machining process with inherent problems, such as dross, striation, and the heat-affected zone (HAZ), that reduce the quality of machining. In the present-day scenario, these problems are controlled only by iteratively adjusting a large number of process parameters. This paper applies axiomatic design principles to the design of the LBM process so as to eliminate the problems of dross and striation and to minimize the effect of the HAZ. Process parameters and their ranges are proposed to set up the LBM process, execute the cut, and finish the workpiece so as to obtain the best quality cut.

Keywords: laser beam machining, dross, striation, heat affected zone, axiomatic design

Procedia PDF Downloads 364
16887 Process Modeling of Electric Discharge Machining of Inconel 825 Using Artificial Neural Network

Authors: Himanshu Payal, Sachin Maheshwari, Pushpendra S. Bharti

Abstract:

Electrical discharge machining (EDM), a non-conventional machining process, finds wide application in shaping difficult-to-cut alloys. Process modeling is required to exploit EDM to the fullest, and it is a challenging task owing to the involvement of so many electrical and non-electrical parameters. This work is an attempt to model the EDM process using an artificial neural network (ANN). Experiments were carried out on a die-sinking EDM machine with Inconel 825 as the work material, and ANN modeling was performed using the experimental data. The prediction ability of the trained network has been verified experimentally. Results indicate that the ANN can predict the values of the performance measures of EDM satisfactorily.
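The paper trains a full ANN on experimental EDM data; as a drastically simplified stand-in, the sketch below fits a single linear neuron by stochastic gradient descent to synthetic data relating discharge current to a performance measure such as metal removal rate. All numbers are hypothetical, and a real ANN would add hidden layers and nonlinear activations.

```python
# A single-neuron gradient-descent regression sketch, labelled explicitly as
# a stand-in for the paper's ANN. Data pairs (current in A, MRR) are synthetic,
# generated roughly from y = 2x + 1 with small noise.

def train_neuron(data, lr=0.01, epochs=2000):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y          # prediction error on this sample
            w -= lr * err * x              # gradient step on the weight
            b -= lr * err                  # gradient step on the bias
    return w, b

data = [(1.0, 3.1), (2.0, 4.9), (3.0, 7.2), (4.0, 8.8)]
w, b = train_neuron(data)
```

After training, the neuron's prediction at an unseen current (e.g. 2.5 A) should land near the trend of the synthetic data, which is the same verification-by-prediction step the abstract describes for the trained network.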

Keywords: artificial neural network, EDM, metal removal rate, modeling, surface roughness

Procedia PDF Downloads 407
16886 Suitable Site Selection of Small Dams Using Geo-Spatial Technique: A Case Study of Dadu Tehsil, Sindh

Authors: Zahid Khalil, Saad Ul Haque, Asif Khan

Abstract:

Identifying suitable sites for a project by weighing different parameters is a difficult decision-making task; GIS combined with multi-criteria analysis (MCA) can make it easier, and this technology has proved efficient and adequate for acquiring the desired information. In this study, GIS and MCA were employed to identify suitable sites for small dams in Dadu Tehsil, Sindh. GIS software was used to create all the spatial parameters for the analysis. The parameters derived are slope, drainage density, rainfall, land use/land cover, soil groups, curve number (CN), and a runoff index, at a spatial resolution of 30 m. The data used for deriving these layers include a 30-meter-resolution SRTM DEM, Landsat 8 imagery, rainfall from the National Centers for Environmental Prediction (NCEP), and soil data from the Harmonized World Soil Database (HWSD). The land use/land cover map is derived from Landsat 8 using supervised classification, while the slope, drainage network, and watersheds are delineated by terrain processing of the DEM. The Soil Conservation Service (SCS) method is implemented to estimate surface runoff from rainfall; prior to this, an SCS-CN grid is developed by integrating the soil and land use/land cover rasters. These layers, together with some technical and ecological constraints, are assigned weights on the basis of suitability criteria. The pairwise comparison method, also known as the Analytic Hierarchy Process (AHP), is adopted as the MCA technique for assigning a weight to each decision element. All the parameters and groups of parameters are integrated using weighted overlay in a GIS environment to produce suitable sites for the dams. The resultant layer is then classified into four classes, namely best suitable, suitable, moderate, and less suitable. This study demonstrates a contribution to decision-making about suitable-site analysis for small dams using geospatial data with a minimal amount of ground data. These suitability maps can be helpful for water resource management organizations in determining feasible rainwater harvesting (RWH) structures.
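The SCS curve-number runoff step mentioned above follows a standard pair of relations: the potential maximum retention S (in mm) is derived from the curve number CN, and runoff Q is computed from rainfall P after subtracting an initial abstraction of 0.2 S. The sketch below uses hypothetical values, not the study's rasters.

```python
# SCS curve-number runoff sketch (metric form; inputs in mm, CN dimensionless).
# S = 25400/CN - 254 is the potential maximum retention, Ia = 0.2*S the
# initial abstraction, and Q = (P - Ia)^2 / (P - Ia + S) when P > Ia.

def scs_runoff(rainfall_mm, cn):
    s = 25400.0 / cn - 254.0              # potential retention (mm)
    ia = 0.2 * s                          # initial abstraction (mm)
    if rainfall_mm <= ia:
        return 0.0                        # all rainfall abstracted, no runoff
    return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

q = scs_runoff(100.0, 75)                 # 100 mm storm on CN-75 land
```

In the study this relation is applied cell-by-cell on the SCS-CN grid; here a single cell suffices to show the arithmetic. Note that CN = 100 (impervious) returns Q = P, and small storms below the initial abstraction return zero runoff.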

Keywords: remote sensing, GIS, AHP, RWH

Procedia PDF Downloads 381
16885 Urban Transport Demand Management Multi-Criteria Decision Using AHP and SERVQUAL Models: Case Study of Nigerian Cities

Authors: Suleiman Hassan Otuoze, Dexter Vernon Lloyd Hunt, Ian Jefferson

Abstract:

Urbanization has continued to widen the gap between demand and the resources available to provide resilient and sustainable transport services in many fast-growing cities of developing countries. Transport demand management is a decision-based optimization concept for both benchmarking and ensuring the efficient use of transport resources. This study assesses the service quality of infrastructure and mobility services in the Nigerian cities of Kano and Lagos through five dimensions of quality (i.e., tangibility, reliability, responsiveness, safety assurance, and empathy). The methodology adopts a hybrid AHP-SERVQUAL model applied to questionnaire surveys to gauge satisfaction with quality and the views of experts in the field. The AHP results prioritize tangibility, which defines the state of transportation infrastructure and services, in terms of both satisfaction qualities and intervention decision weights in the two cities. The results recorded 'unsatisfactory' indices of performance quality, with satisfaction ratings of 48% and 49% for Kano and Lagos, respectively. These satisfaction indices point to the low performance of transportation demand management (TDM) measures and to the need to re-order priorities and take proactive steps towards infrastructure. The findings provide a pilot framework for the comparative assessment of recognizable standards in transport services, best management practices, and the quality infrastructure needed to guarantee both resilient and sustainable urban mobility.
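The AHP weighting step used in the hybrid model can be sketched with the geometric-mean (row) approximation and Saaty's consistency check. The 3x3 pairwise-comparison matrix below is hypothetical (e.g. tangibility vs. reliability vs. safety assurance), not the study's survey data.

```python
# AHP priority-vector sketch: weights from row geometric means, plus the
# consistency ratio CR = CI / RI. The comparison matrix is made up.

import math

def ahp_weights(matrix):
    n = len(matrix)
    gms = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    weights = [g / total for g in gms]
    # approximate the principal eigenvalue for the consistency check
    lam = sum(sum(matrix[i][j] * weights[j] for j in range(n)) / weights[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random-index table
    return weights, ci / ri

# Tangibility judged 3x as important as reliability, 5x safety assurance.
m = [[1, 3, 5],
     [1 / 3, 1, 5 / 3],
     [1 / 5, 3 / 5, 1]]
weights, cr = ahp_weights(m)
```

A consistency ratio below 0.1 is conventionally accepted; the perfectly consistent toy matrix above yields CR near zero and weights proportional to 15:5:3.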

Keywords: transportation demand management, multi-criteria decision support, transport infrastructure, service quality, sustainable transport

Procedia PDF Downloads 221
16884 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Digitalisation in production technology is a driver for the application of machine learning methods. Through predictive quality, the great potential for reducing necessary quality-control effort can be exploited via the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance; competitive leaders claim to have mastered their processes, and as a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which this data situation makes more difficult. The implementation of a machine learning application can itself be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the cost of eliminating errors increases significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether regression or classification is more suitable. In the context of this work, the initial phase of CRISP-DM, business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach for predicting the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for classification of the inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.

Keywords: classification, CRISP-DM, machine learning, predictive quality, regression

Procedia PDF Downloads 137
16883 Programming Systems in Implementation of Process Safety at Chemical Process Industry

Authors: Maryam Shayan

Abstract:

Software systems have been utilized in chemical-industry process safety operation and design to enhance their effectiveness. This paper gives a brief survey and investigation of the state of the art and of the effects of software systems on process safety. A study was completed by interviewing staff responsible for process safety practices in the Iranian chemical process industry and by reviewing the literature on technology for process safety. The article investigates the functional and operational attributes of software systems for safety and attempts to classify the software according to its level of impact in the management hierarchy. The study contributes to a better understanding of the role of information and communication technology in process safety, of future trends, and of possible gaps for research and development.

Keywords: software systems, chemical process industry, process safety, management hierarchy, information and communication technology

Procedia PDF Downloads 367
16882 Solving Fuzzy Multi-Objective Linear Programming Problems with Fuzzy Decision Variables

Authors: Mahnaz Hosseinzadeh, Aliyeh Kazemi

Abstract:

In this paper, a method is proposed for solving fuzzy multi-objective linear programming problems (FMOLPP) with a fuzzy right-hand side and fuzzy decision variables. To illustrate the proposed method, it is applied to the problem of selecting suppliers for an automotive parts producer in Iran, in order to find the optimal order quantities allocated to each supplier under the conflicting objectives. Finally, the obtained results are discussed.
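One common building block for the fuzzy ranking mentioned in the keywords is centroid defuzzification of triangular fuzzy numbers. The sketch below is an illustration with hypothetical suppliers and scores, not the paper's FMOLPP method: each fuzzy score (l, m, u) is reduced to the crisp value (l + m + u) / 3 and suppliers are ordered by it.

```python
# Centroid ranking of triangular fuzzy numbers (supplier data is made up).
# A triangular fuzzy number (l, m, u) has support [l, u] and peak at m.

def centroid(tfn):
    l, m, u = tfn
    return (l + m + u) / 3.0

def rank_suppliers(scores):
    """scores: dict supplier -> triangular fuzzy score (l, m, u)."""
    return sorted(scores, key=lambda s: centroid(scores[s]), reverse=True)

scores = {"supplier-A": (3, 5, 7),
          "supplier-B": (4, 6, 8),
          "supplier-C": (2, 4, 6)}
ranking = rank_suppliers(scores)
```

More elaborate ranking schemes keep the spread (u - l) as a tie-breaker to reflect uncertainty; the centroid alone already gives a usable total order for allocation decisions.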

Keywords: fuzzy multi-objective linear programming problems, triangular fuzzy numbers, fuzzy ranking, supplier selection problem

Procedia PDF Downloads 377
16881 Do Career Expectancy Beliefs Foster Stability as Well as Mobility in One's Career? A Conceptual Model

Authors: Bishakha Majumdar, Ranjeet Nambudiri

Abstract:

A considerable dichotomy exists in research regarding the role of optimism and self-efficacy in work and career outcomes. Optimism and self-efficacy are related to performance, commitment, and engagement, but they are also implicated in seeing opportunities outside the firm and switching jobs. Research capturing these opposing strands of findings in the same model, and providing a holistic understanding of how expectancy beliefs operate for the working professional, is absent. We attempt to bridge this gap by proposing that career-decision self-efficacy and career outcome expectations affect intention to quit through the competing mediation pathways of internal and external marketability. This model provides a holistic picture of the role of career expectancy beliefs in career outcomes by considering perceived career opportunities both inside and outside one's present organization. The understanding extends the application of career expectancy beliefs to the context of career decision-making by the employed individual. Further, it is valuable for reconsidering the effectiveness of the hiring and retention techniques used by a firm, as selection, rewards, and training programs need to be supplemented by interventions that specifically strengthen the stability pathway.

Keywords: career decision self-efficacy, career outcome expectations, marketability, intention to quit, job mobility

Procedia PDF Downloads 630
16880 Intelligent Process Data Mining for Monitoring for Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary in order to operate industrial processes safely and efficiently while maintaining good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on-line for the safe and fault-free operation of industrial processes. To this end, the proposed intelligent process data monitoring framework was evaluated on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. Moreover, this work compares linear and nonlinear techniques for the monitoring purpose; the nonlinear technique produced more reliable monitoring results and outperformed the linear methods. The adoption of the qualitative monitoring model helps to reduce the sensitivity of the fault pattern to noise.
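A much-simplified version of such data-driven monitoring can be sketched as follows. This is not the paper's reduced-space model: normal operating data define per-variable means and standard deviations, and a new sample is flagged as faulty when its combined squared z-score exceeds a control limit. All measurements are synthetic.

```python
# Simplified statistical process-monitoring sketch (synthetic data, and a
# diagonal-covariance simplification of the usual Hotelling T^2 statistic).

def fit_monitor(normal_data):
    """Learn per-variable mean and standard deviation from fault-free data."""
    n = len(normal_data)
    dims = len(normal_data[0])
    means = [sum(row[d] for row in normal_data) / n for d in range(dims)]
    stds = [(sum((row[d] - means[d]) ** 2 for row in normal_data) / n) ** 0.5
            for d in range(dims)]
    return means, stds

def is_fault(sample, means, stds, limit=9.0):
    """Flag a sample whose summed squared z-score exceeds the control limit."""
    stat = sum(((x - m) / s) ** 2 for x, m, s in zip(sample, means, stds))
    return stat > limit

normal = [(10.0, 5.0), (10.2, 5.1), (9.8, 4.9), (10.1, 5.0), (9.9, 5.0)]
means, stds = fit_monitor(normal)
```

The linear/nonlinear comparison in the abstract corresponds to replacing this per-variable statistic with projections from PCA or kernel PCA before computing the monitoring statistic.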

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 633
16879 Media Planning Decisions and Preferences through a Goal Programming Model: An Application to a Media Campaign for a Mature Product in Italy

Authors: Cinzia Colapinto, Davide La Torre

Abstract:

Goal programming (GP) and its variants have been applied over recent decades to marketing and to specific marketing issues such as media-scheduling problems. The concept of satisfaction functions has been widely utilized in GP models to explicitly integrate the decision-maker's preferences; these preferences can be guided by the information available about the decision-making situation. A GP model with satisfaction functions for media planning decisions is proposed and then illustrated through a case study of a marketing/media campaign in the Italian market.
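The satisfaction-function idea can be illustrated with a toy fragment. This is an assumption-heavy sketch, not the paper's model: satisfaction is 1 when a media goal is met exactly and decreases linearly with the deviation, reaching 0 at a tolerance threshold set by the decision-maker; a plan's score is the weighted sum of goal satisfactions.

```python
# Toy linear satisfaction functions for goal programming (all goals, weights,
# and numbers are hypothetical).

def satisfaction(deviation, tolerance):
    """1 at zero deviation, linearly decreasing, 0 beyond the tolerance."""
    return max(0.0, 1.0 - abs(deviation) / tolerance)

def plan_score(plan, goals, weights):
    """plan: achieved values; goals: medium -> (target, tolerance)."""
    return sum(w * satisfaction(plan[g] - goals[g][0], goals[g][1])
               for g, w in weights.items())

goals = {"tv_reach": (100, 40), "budget": (50, 10)}   # (target, tolerance)
weights = {"tv_reach": 0.7, "budget": 0.3}
score = plan_score({"tv_reach": 90, "budget": 52}, goals, weights)
```

A full GP model would minimize weighted deviations (or maximize these satisfactions) subject to the media-mix constraints via linear programming; the fragment only shows how a candidate plan is valued.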

Keywords: goal programming, satisfaction functions, media planning, tourism management

Procedia PDF Downloads 394
16878 A Process to Support Multidisciplinary Teams to Design Serious Games

Authors: Naza Djafarova, Tony Bates, Margaret Verkuyl, Leonora Zefi, Ozgur Turetken, Alex Ferworn, Mastrilli Paula, Daria Romaniuk, Kosha Bramesfeld, Anastasia Dimitriadou, Cheryl To

Abstract:

Designing serious games for education is a challenging and resource-intensive effort. A well-designed process that balances pedagogical principles with game mechanics can simplify the design of serious games and increase efficiency. Multidisciplinary teams involved in designing serious games can benefit tremendously from such a process in their endeavours to develop and implement these games at undergraduate and graduate levels. This paper outlines research results on gaps identified within existing processes and frameworks and presents an adapted process that emerged from the research. The research methodology was based on a survey, semi-structured interviews, and workshops for testing the adapted game-design process. Based on the findings, the authors propose a simple process for the pre-production stage of serious game design that may help guide multidisciplinary teams in their work. This process was used to facilitate team brainstorming and is currently being tested to assess whether multidisciplinary teams find value in using it when designing serious games.

Keywords: serious game-design, multidisciplinary team, game design framework, learning games, multidisciplinary game design process

Procedia PDF Downloads 421
16877 Groupthink: The Dark Side of Team Cohesion

Authors: Farhad Eizakshiri

Abstract:

The potential of groupthink to explain the deterioration of decision-making ability within a unitary team, and thereby poor outcomes, has attracted a great deal of attention from a variety of disciplines, including psychology, social and organizational studies, and political science. Yet it remains unclear how and why team members' striving for unanimity and cohesion overrides their motivation to realistically appraise alternative courses of action. This paper reports the findings of a sequential explanatory mixed-methods study comprising an experiment with thirty groups of three persons each and interviews with all experimental groups. The experiment examined how individuals aggregate their views in order to reach a consensual group decision concerning the completion time of a task. The results indicated that groups made better estimates when there was no interaction between members than when groups collectively agreed on time estimates. To understand the reasons, the qualitative data and the informal observations collected during the task were analyzed through conversation analysis, leading to four reasons why teams neglect divergent viewpoints and reduce the number of ideas being considered: the concurrence-seeking tendency, pressure on dissenters, self-censorship, and the illusion of invulnerability. It is suggested that understanding the dynamics behind these reasons for groupthink will help project teams avoid premature group decisions by enhancing careful evaluation of the available information and analysis of the available decision alternatives and choices.

Keywords: groupthink, group decision, cohesiveness, project teams, mixed-methods research

Procedia PDF Downloads 394
16876 Back to Basics: Redefining Quality Measurement for Hybrid Software Development Organizations

Authors: Satya Pradhan, Venky Nanniyur

Abstract:

As the software industry transitions from a license-based model to a subscription-based software-as-a-service (SaaS) model, many software development groups are using a hybrid development model that incorporates Agile and Waterfall methodologies in different parts of the organization. The traditional metrics used for measuring software quality in the Waterfall or Agile paradigms do not apply to this new hybrid methodology. In addition, to respond to higher quality demands from customers and to gain a competitive advantage in the market, many companies are starting to prioritize quality as a strategic differentiator. As a result, quality metrics are included in decision-making activities all the way up to the executive level, including board-of-directors reviews. This paper presents the key challenges associated with measuring software quality in organizations using the hybrid development model. We introduce a framework called Prevention-Inspection-Evaluation-Removal (PIER) to provide a comprehensive metric definition for hybrid organizations. The framework includes quality measurements, quality enforcement, and quality decision points at different organizational levels and project milestones. The metrics framework defined in this paper is being used for all Cisco Systems products deployed on customer premises. We present several field metrics for one product portfolio (enterprise networking) to show the effectiveness of the proposed measurement system. As the results show, this metrics framework has significantly improved in-process defect management as well as field quality.

Keywords: quality management system, quality metrics framework, quality metrics, agile, waterfall, hybrid development system

Procedia PDF Downloads 169
16875 Evidence Theory Based Emergency Multi-Attribute Group Decision-Making: Application in Facility Location Problem

Authors: Bidzina Matsaberidze

Abstract:

It is known that, in emergency situations, multi-attribute group decision-making (MAGDM) models are characterized by insufficient objective data and a lack of time to respond to the task. Evidence theory is an effective tool for describing such incomplete information in decision-making models when an expert and his or her knowledge are involved in the estimation of the MAGDM parameters. We consider an emergency decision-making model in which expert assessments of humanitarian aid distribution centers (HADCs) are represented as q-rung orthopair fuzzy numbers and the data structure is described within the data body theory. Based on the construction of focal probabilities and the experts' evaluations, an objective function, a distribution-center selection ranking index, is constructed. Our approach to solving the resulting bicriteria partitioning problem consists of two phases. In the first phase, based on the covering matrix, we generate a matrix whose columns allow us to find all possible partitionings of the HADCs among the service centers; some constraints are also taken into consideration while generating the matrix. In the second phase, based on this matrix and using our exact algorithm, we find the partitionings (allocations of the HADCs to the centers) that correspond to the Pareto-optimal solutions. For an illustration of the obtained results, a numerical example is given for the facility location-selection problem.

Keywords: emergency MAGDM, q-rung orthopair fuzzy sets, evidence theory, HADC, facility location problem, multi-objective combinatorial optimization problem, Pareto-optimal solutions

Procedia PDF Downloads 86
16874 Operational Advantages of Tungsten Inert Gas over Metal Inert Gas Welding Process

Authors: Emmanuel Ogundimu, Esther Akinlabi, Mutiu Erinosho

Abstract:

In this research, studies were carried out on the material characterization of type 304 austenitic stainless steel welds produced by the TIG (tungsten inert gas) and MIG (metal inert gas) welding processes. The research aims to establish optimized process parameters that result in a defect-free weld joint. A homogeneous distribution of iron (Fe), chromium (Cr), and nickel (Ni) was observed at the welded joints of all six samples. The welded sample produced by the TIG welding process at a current of 170 A had the highest ultimate tensile strength (UTS) value of 621 MPa in the weld zone, while the welded sample produced by the MIG process at a welding current of 150 A had the lowest UTS value of 568 MPa. It was thus established that the TIG welding process is more appropriate than the MIG welding process for the welding of type 304 austenitic stainless steel.

Keywords: microhardness, microstructure, tensile strength, shear stress, MIG welding, TIG welding, TIG-MIG welding

Procedia PDF Downloads 185
16873 A Machine Learning Decision Support Framework for Industrial Engineering Purposes

Authors: Anli Du Preez, James Bekker

Abstract:

Data is currently one of the most critical and influential emerging resources. However, the true potential of data is yet to be exploited since, currently, only about 1% of generated data is ever actually analyzed for value creation. There is a data gap in which data goes unexplored due to the lack of data analytics infrastructure and of the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen's framework development methodology. The study focused on machine learning algorithms, a subset of data analytics. The developed framework is designed to assist data analysts with little experience in choosing the appropriate machine learning algorithm given the purpose of their application.
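In its simplest form, such a framework reduces to rule-of-thumb guidance from analysis purpose to algorithm family. The sketch below is our illustration of that idea only; the rules and the candidate algorithms are assumptions, not the framework developed in the study:

```python
def suggest_algorithms(labeled, target_type=None, interpretable=False):
    """Map a coarse description of the analysis purpose to candidate ML families.

    labeled:       whether the data has a known target variable
    target_type:   "continuous" or "categorical" (only used when labeled)
    interpretable: whether the analyst needs an explainable model
    """
    if not labeled:                      # unsupervised purposes
        return ["k-means clustering", "hierarchical clustering", "PCA"]
    if target_type == "continuous":      # regression purposes
        return (["linear regression"] if interpretable
                else ["random forest regression", "gradient boosting"])
    # classification purposes
    return (["decision tree", "logistic regression"] if interpretable
            else ["random forest", "support vector machine"])
```

An inexperienced analyst would supply what they know about their data and read off a short list of candidates to compare, rather than picking an algorithm blindly.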

Keywords: data analytics, industrial engineering, machine learning, value creation

Procedia PDF Downloads 165
16872 Modelling Water Usage for Farming

Authors: Ozgu Turgut

Abstract:

Water scarcity is a problem for many regions that requires immediate action; solutions cannot be postponed for long. It is known that farming consumes a significant portion of usable water, although in recent years efforts to transition from surface watering to drip or sprinkler irrigation systems have started to pay off. It is also known that this transition does not necessarily translate into an increase in the capacity dedicated to other water consumption channels such as city water or power generation. In order to control and allocate the water resource more purposefully, the new watering systems have to be used with monitoring abilities that can limit the usage capacity of each farm. In this study, a decision support model is proposed that relies on bi-objective stochastic linear optimization and takes crop yield and price volatility into account. The model generates annual planting plans as well as water usage limits for each farmer in the region while taking into account the total value (i.e., profit) of the overall harvest. The mathematical model is solved optimally using the L-shaped method. The decision support model can be especially useful for regional administrations in planning next year's planting and the associated water incomes and expenses. That is why not only a single optimum but a set of representative solutions from the Pareto set is generated with the proposed approach.
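The bi-objective trade-off (expected harvest value versus water use) can be sketched by evaluating candidate planting plans over yield/price scenarios and keeping only the non-dominated ones. The data layout below is our assumption, and the full model would of course be a stochastic LP solved with the L-shaped method rather than plan enumeration:

```python
def expected_profit(plan, scenarios):
    """plan: {crop: hectares}; scenarios: [(probability, {crop: (yield, price)})]."""
    total = 0.0
    for prob, outcome in scenarios:
        revenue = sum(area * outcome[crop][0] * outcome[crop][1]
                      for crop, area in plan.items())
        total += prob * revenue
    return total

def water_use(plan, water_per_ha):
    """Total water required by a plan, given per-crop water needs per hectare."""
    return sum(area * water_per_ha[crop] for crop, area in plan.items())

def pareto_plans(plans, scenarios, water_per_ha):
    """Keep plans no other plan beats on both objectives (profit up, water down)."""
    scored = [(expected_profit(p, scenarios), water_use(p, water_per_ha), p)
              for p in plans]
    return [(pr, w, p) for pr, w, p in scored
            if not any(pr2 >= pr and w2 <= w and (pr2 > pr or w2 < w)
                       for pr2, w2, _ in scored)]
```

A regional administration would present the surviving (profit, water) pairs to stakeholders and let them pick the preferred trade-off, which matches the abstract's point about generating a representative set rather than a single optimum.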

Keywords: decision support, farming, water, tactical planning, optimization, stochastic, pareto

Procedia PDF Downloads 67
16871 Epileptic Seizure Onset Detection via Energy and Neural Synchronization Decision Fusion

Authors: Marwa Qaraqe, Muhammad Ismail, Erchin Serpedin

Abstract:

This paper presents a novel architecture for a patient-specific epileptic seizure onset detector using scalp electroencephalography (EEG). The proposed architecture is based on the fusion of decisions calculated from energy- and neural-synchronization-related features. Specifically, one level of the detector calculates the condition number (CN) of an EEG matrix to evaluate the amount of neural synchronization present within the EEG channels. On a parallel level, the detector evaluates the energy contained in four EEG frequency subbands. The information is then fed into two independent (parallel) classification units based on support vector machines to determine the onset of a seizure event. The decisions from the two classifiers are then combined according to two fusion techniques to determine a global decision. Experimental results demonstrate that the detector based on the AND fusion technique outperforms existing detectors with a sensitivity of 100% and a detection latency of 3 seconds, while achieving a false alarm rate of 2.76 per hour. The OR fusion technique achieves a sensitivity of 100% and significantly improves detection latency (0.17 seconds), yet it produces 12 false alarms per hour.
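The AND/OR fusion of the two parallel classifiers can be sketched in a few lines. The threshold and score semantics here are illustrative assumptions; in the paper the two scores would be the outputs of the energy-based and synchronization-based SVMs:

```python
def fuse(energy_flag, sync_flag, rule="AND"):
    """Combine two binary detector decisions (1 = seizure onset declared)."""
    if rule == "AND":                        # both must agree: fewer false alarms
        return int(energy_flag and sync_flag)
    return int(energy_flag or sync_flag)     # "OR": faster, but more false alarms

def detect(energy_scores, sync_scores, threshold=0.0, rule="AND"):
    """Per-window global decision from two classifier decision scores."""
    return [fuse(e > threshold, s > threshold, rule)
            for e, s in zip(energy_scores, sync_scores)]
```

The sketch makes the reported trade-off concrete: AND only fires when both feature paths agree (low false alarm rate, longer latency), while OR fires on either one (short latency, more false alarms).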

Keywords: epilepsy, EEG, seizure onset, electroencephalography, neuron, detection

Procedia PDF Downloads 471
16870 Re-Engineering Management Process in IRAN’s Smart Schools

Authors: M. R. Babaei, S. M. Hosseini, S. Rahmani, L. Moradi

Abstract:

Today, the quality and effectiveness of education and training systems are of great concern to the stakeholders and decision-makers of development in every country. In Iran this concern is doubled for numerous reasons; over the past decade, governments have hardly even paid the running costs of education. ICT claims the power to change the structure of training programs, reduce costs and increase quality, make education systems and their products consistent with the needs of the community, and move education toward practice. One of the areas that the introduction of information technology has fundamentally changed is the field of education. The aim of this research is the re-engineering of the management process in smart schools; field studies were used to collect data in the form of interviews and a questionnaire survey. The statistical population of this research comprised the smart schools under Iran's education system, and sampling was targeted. The data collection tool was a questionnaire composed of two parts and consisting of 36 questions, each of which designates one factor affecting the management of smart schools. For each question, the first part designates the factor's position in the management process, that is, the management function it belongs to (planning, organizing, leading, controlling) according to the classification of Dabryn, and the second part examines, on a Likert scale, how the factor affects the management process of smart schools. The validity of the questions was approved by a group of experts and prominent university professors in the fields of information technology, management, and re-engineering, and their reliability was evaluated and approved using Cronbach's alpha.
To analyze the data, descriptive statistics (frequency tables, mean, median, mode) were used to rate the contributing factors on the Likert scale, and inferential statistics, including analysis of variance and the nonparametric Friedman test, were used to evaluate the assumptions. The research conclusions show that the factors influencing the re-engineering of the management process in smart schools affect school performance.
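The reliability check mentioned in the abstract uses Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of totals), which can be computed directly. This is a generic sketch, not the study's actual questionnaire data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for questionnaire reliability.

    items: one list of respondent scores per questionnaire item,
    all of equal length (one entry per respondent).
    """
    k = len(items)                           # number of items
    n = len(items[0])                        # number of respondents

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # total score of each respondent across all items
    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(sample_var(i) for i in items) / sample_var(totals))
```

A value above roughly 0.7 is conventionally taken as acceptable internal consistency; two perfectly correlated items give alpha = 1.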

Keywords: re-engineering, management process, smart schools, Iran's schools

Procedia PDF Downloads 240
16869 Reduction of Energy Consumption of Distillation Process by Recovering the Heat from Exit Streams

Authors: Apichit Svang-Ariyaskul, Thanapat Chaireongsirikul, Pawit Tangviroon

Abstract:

Distillation consumes an enormous quantity of energy. This work proposed a process to recover energy from the exit streams of three consecutive distillation columns. Several novel techniques exist to recover heat within a distillation system; however, they require a complex control system. This work proposed a simpler technique that exchanges heat between streams without interrupting the internal distillation process, which might otherwise cause a serious control problem. The proposed process uses a heat exchanger network designed with pinch analysis to maximize the process heat recovery. The test model is the distillation of butane, pentane, hexane, and heptane, a common mixture in petroleum refineries. The proposed process reduced the energy consumption for hot and cold utilities by 29% and 27%, respectively, which is considered significant. Therefore, recovering heat from the exit streams of a distillation process proved to be effective for energy saving.
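Pinch analysis sets the minimum hot and cold utility targets before any exchanger network is designed. A minimal problem-table sketch follows; the stream data and the default minimum approach temperature are illustrative, not the paper's column streams:

```python
def min_utilities(hot, cold, dtmin=10.0):
    """Problem-table algorithm for minimum utility targets.

    hot/cold: lists of (T_supply, T_target, CP) with CP the heat capacity
    flow rate (e.g. kW/K); returns (min hot utility, min cold utility).
    """
    # shift hot streams down and cold streams up by dtmin/2 so that any
    # feasible match keeps at least dtmin of temperature driving force
    streams = [(ts - dtmin / 2, tt - dtmin / 2, cp, +1) for ts, tt, cp in hot]
    streams += [(ts + dtmin / 2, tt + dtmin / 2, cp, -1) for ts, tt, cp in cold]
    bounds = sorted({t for ts, tt, _, _ in streams for t in (ts, tt)}, reverse=True)
    cascade, heat = 0.0, [0.0]
    for top, bot in zip(bounds, bounds[1:]):
        # net heat surplus (+) or deficit (-) of each temperature interval
        net = sum(sign * cp * (top - bot) for ts, tt, cp, sign in streams
                  if min(ts, tt) <= bot and max(ts, tt) >= top)
        cascade += net
        heat.append(cascade)
    q_hot = max(0.0, -min(heat))   # lift the most negative cascade point to zero
    q_cold = heat[-1] + q_hot      # remaining surplus leaves as cold utility
    return q_hot, q_cold
```

For one hot stream (150 to 50 °C, CP = 2) and one cold stream (40 to 140 °C, CP = 3), the cold demand exceeds the hot supply by 100 units, so the targets come out as 100 of hot utility and 0 of cold utility.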

Keywords: distillation, heat exchanger network, pinch analysis, chemical engineering

Procedia PDF Downloads 360
16868 Rewilding the River: Assessing the Environmental Effects and Regulatory Influences of the Condit Dam Removal Process

Authors: Neda Safari, Jacob Petersen-Perlman

Abstract:

There are more than two million dams in the United States, and a considerable portion of them are either non-operational or approaching the end of their designed lifespan. However, this trend is recent, and the majority of dam sites have not undergone thorough research and assessment after removal to determine the overall effectiveness of restoration initiatives, particularly in the case of large-scale dams that may significantly impact their surrounding areas. A crucial factor to consider is the lack of specific regulations pertaining to dam removal at the federal level. Consequently, other environmental regulations that were not originally designed with dam removal in mind are used to execute these projects, which can result in delays or challenges for dam removal initiatives. Removing a dam is usually the most important first step in restoring the ecological and biological health of a river, but often there is a lack of measurable indicators to assess whether removal has achieved its intended objectives. In addition, the majority of studies on dam removal are short-term and focus on a particular measure of response. Therefore, it is essential to conduct extensive and continuous monitoring to analyze the river's response in every respect. Our study is divided into two sections. The first section will analyze the establishment and utilization of dam removal laws and regulations in the Condit Dam removal process. We will highlight the areas where the frameworks for policy and dam removal projects remain in need of improvement in order to facilitate successful dam removals in the future. In this part, we will review the policies and plans that affected the decision-making process to remove the Condit Dam, while also looking at how they shaped the physical changes to the river after the dam was removed.
In the second section, we will look at the effects of the dam removal over a decade later and attempt to determine how the river's physical response has been affected by this modification. Our study aims to investigate the Condit Dam removal process and its impact on the ecological response of the river. We anticipate identifying areas for improvement in policies pertaining to dam removal projects and exploring ways to enhance them to ensure improved project outcomes in the future.

Keywords: dam removal, ecological change, water-related regulation, water resources

Procedia PDF Downloads 39
16867 Improvement of Process Competitiveness Using Intelligent Reference Models

Authors: Julio Macedo

Abstract:

Several methodologies are now available to conceive improvements to a process so that it becomes competitive, for example total quality management, process re-engineering, Six Sigma, and the define-measure-analyze-improve-control (DMAIC) method. These improvements are of different natures and can be external to the process, which may be represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are several and have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and desired performance indexes of the process. The reference models are intelligent: when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for the problematic process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
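The core of a fuzzy cognitive map is a simple iterated update in which each concept's activation is the squashed weighted sum of the others'. This is a generic sketch; the concepts, the trained weights, and the authors' added objective function are not reproduced here:

```python
import math

def fcm_step(state, W):
    """One synchronous FCM update: state[i] <- sigmoid(sum_j state[j] * W[j][i])."""
    sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))
    n = len(state)
    return [sigmoid(sum(state[j] * W[j][i] for j in range(n))) for i in range(n)]

def fcm_run(state, W, max_steps=100, tol=1e-6):
    """Iterate until the map settles into a (near) fixed point or steps run out."""
    for _ in range(max_steps):
        nxt = fcm_step(state, W)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state
```

In the reference-model setting, the state vector would hold the process's performance indexes and the weight matrix the trained causal influences; running the map from the current state simulates the qualitative effect of a proposed improvement.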

Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics

Procedia PDF Downloads 81
16866 Dimensioning of Circuit Switched Networks by Using Simulation Code Based On Erlang (B) Formula

Authors: Ali Mustafa Elshawesh, Mohamed Abdulali

Abstract:

The paper presents an approach to dimensioning circuit-switched networks and finding the relationship between network parameters under a specified probability of call blocking. Our work creates a simulation code based on the Erlang B formula that draws graphs, each showing two curves: one simulated and one calculated. These curves represent the relationships of the average number of calls and the average call duration with the probability of call blocking. This simulation code facilitates selecting the appropriate parameters for circuit-switched networks.
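The Erlang B blocking probability behind these curves is usually computed with the numerically stable recursion B(A, n) = A*B(A, n-1) / (n + A*B(A, n-1)), B(A, 0) = 1, rather than the factorial closed form. The sketch below shows the calculated side only; the paper's simulation code itself is not reproduced:

```python
def erlang_b(traffic, channels):
    """Blocking probability for `traffic` erlangs offered to `channels` circuits.

    Uses the stable recursion B(A, n) = A*B(A, n-1) / (n + A*B(A, n-1)),
    avoiding the overflow-prone factorials of the closed-form expression.
    """
    b = 1.0                      # B(A, 0): with no circuits, every call blocks
    for n in range(1, channels + 1):
        b = traffic * b / (n + traffic * b)
    return b
```

Dimensioning then amounts to finding the smallest `channels` for which `erlang_b(traffic, channels)` falls below the target blocking probability, e.g. 1%; offered traffic in erlangs is the average call arrival rate times the average call duration.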

Keywords: Erlang B formula, call blocking, telephone system dimension, Markov model, link capacity

Procedia PDF Downloads 605
16865 Cluster Analysis and Benchmarking for Performance Optimization of a Pyrochlore Processing Unit

Authors: Ana C. R. P. Ferreira, Adriano H. P. Pereira

Abstract:

Given the frequent variation of mineral properties throughout the Araxá pyrochlore deposit, even when good homogenization work has been carried out before feeding the processing plants, an operation with high variability in quality and performance is expected. These results could be improved and standardized if the blend composition parameters that most influence the processing route were determined and the types of raw material then grouped by them, finally providing a sound reference with operational settings for each group. Associating the physical and chemical parameters of a unit operation with a benchmark, or even an optimal reference of metallurgical recovery and product quality, translates into reduced production costs, optimized use of the mineral resource, and greater stability in the subsequent processes of the production chain that use the mineral of interest. Conducting a comprehensive exploratory data analysis to identify which characteristics of the ore are most relevant to the process route, combined with machine learning algorithms that group the raw material (ore) and associate the groups with reference variables in the process benchmark, is a reasonable alternative for the standardization and improvement of mineral processing units. Grouping methods based on decision trees and K-means were employed, together with algorithms based on benchmarking theory, with criteria defined by the process team in order to reference the best adjustments for processing the ore piles of each cluster. A clean user interface was created to obtain the outputs of the algorithm. The results were measured through the average time needed to adjust and stabilize the process after a new pile of homogenized ore enters the plant, as well as the average time needed to achieve the best processing result. Direct gains in the metallurgical recovery of the process were also measured.
The results were promising, with a reduction in the adjustment and stabilization time when starting to process a new ore pile, as well as attainment of the benchmark. Also noteworthy are the gains in metallurgical recovery, which reflect a significant saving in ore consumption and a consequent reduction in production costs, hence a more rational use of the tailings dams and an optimized life of the mineral deposit.
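The grouping step can be illustrated with a bare-bones K-means on a single blend parameter. The feature values and the initialization are illustrative only; the actual study clustered multi-dimensional blend composition data and also used decision trees:

```python
def kmeans_1d(values, centers, iters=20):
    """Minimal 1-D K-means for grouping ore piles by one blend parameter."""
    for _ in range(iters):
        # assignment step: each value goes to its nearest current center
        clusters = [[] for _ in centers]
        for v in values:
            clusters[min(range(len(centers)),
                         key=lambda i: abs(v - centers[i]))].append(v)
        # update step: each center moves to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers
```

Each resulting cluster of ore piles would then be assigned the benchmark operational settings found for that group, which is the referencing step the abstract describes.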

Keywords: mineral clustering, machine learning, process optimization, pyrochlore processing

Procedia PDF Downloads 140