Search results for: Gaussian process priors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15221

15041 Laboratory Investigation of Alkali-Surfactant-Alternate Gas (ASAG) Injection – a Novel EOR Process for a Light Oil Sandstone Reservoir

Authors: Vidit Mohan, Ashwin P. Ramesh, Anirudh Toshniwal

Abstract:

Alkali-Surfactant-Alternate-Gas (ASAG) injection, a novel EOR process, has the potential to improve displacement efficiency over Surfactant-Alternate-Gas (SAG) injection by addressing the problem of surfactant adsorption by clay minerals in the rock matrix. A detailed laboratory investigation of the ASAG injection process was carried out with encouraging results. To enhance recovery over the WAG injection process, SAG injection was first investigated at laboratory scale, but it yielded only marginal incremental displacement efficiency over WAG. On investigation, it was found that clay minerals in the rock matrix adsorbed the surfactants and were detrimental to the SAG process. Hence, ASAG injection was conceptualized, using alkali as a clay stabilizer. The ASAG injection experiment with a surfactant concentration of 5000 ppm and an alkali concentration of 0.5 wt% yielded an incremental displacement efficiency of 5.42% over the WAG process. ASAG injection is a new process and has the potential to enhance the efficiency of the WAG/SAG injection processes.

Keywords: alkali surfactant alternate gas (ASAG), surfactant alternate gas (SAG), laboratory investigation, EOR process

Procedia PDF Downloads 459
15040 An Evaluation of the Methodology for Manufacturing High Performance Organophilic Clay by the Most Efficient and Cost-Effective Process

Authors: Siti Nur Izati Azmi, Zatil Afifah Omar, Kathi Swaran, Navin Kumar

Abstract:

Organophilic clays, also known as organoclays, are used as viscosifiers in oil-based drilling fluids. Most often, organophilic clays are produced from modified sodium- and calcium-based bentonite. Many studies and data show that organophilic clays based on hectorite provide the best yield and good fluid-loss properties in an oil-based drilling fluid, but at a higher cost. In terms of the manufacturing process, the two common methods of producing organophilic clays are a wet process and a dry process. The wet process is known to produce a better-performing product at a higher cost, while the dry process shortens the production time. Hence, the purpose of this study is to evaluate various formulations of an organophilic clay and their performance versus cost, and to determine the most efficient and cost-effective method of manufacturing organophilic clays.

Keywords: organophilic clay, viscosifier, wet process, dry process

Procedia PDF Downloads 210
15039 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit

Authors: Davit Mirzoyan, Ararat Khachatryan

Abstract:

A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal whose logical value depends on the process corner only. The signal can be used in both digital and analog circuits for testing and compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increased detection accuracy and decreased power consumption and area. Due to its simplicity, the presented circuit can be easily modified to monitor parametric variations of only n-type or p-type MOS (NMOS and PMOS, respectively) transistors, resistors, as well as their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e., its ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.

Keywords: detection, monitoring, process corner, process variation

Procedia PDF Downloads 511
15038 Comprehensive Assessment of Energy Efficiency within the Production Process

Authors: S. Kreitlein, N. Eder, J. Franke

Abstract:

The importance of energy efficiency within the production process is steadily increasing. Unfortunately, no tools for a comprehensive assessment of energy efficiency within the production process exist so far. The Institute for Factory Automation and Production Systems of the Friedrich-Alexander-University Erlangen-Nuremberg has therefore developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency: the Energy Efficiency Value (EEV) and the Energetic Process Efficiency (EPE). This paper describes the basics and state of the art as well as the developed approaches.

Keywords: energy efficiency, energy efficiency value, energetic process efficiency, production

Procedia PDF Downloads 717
15037 Early Diagnosis of Myocardial Ischemia Based on Support Vector Machine and Gaussian Mixture Model by Using Features of ECG Recordings

Authors: Merve Begum Terzi, Orhan Arikan, Adnan Abaci, Mustafa Candemir

Abstract:

Acute myocardial infarction is a major cause of death worldwide; therefore, its fast and reliable diagnosis is a major clinical need. ECG is the most important diagnostic methodology used to make decisions about the management of cardiovascular diseases. In patients with acute myocardial ischemia, temporary chest pain, together with changes in the ST segment and T wave of the ECG, occurs shortly before the start of myocardial infarction. In this study, a technique which detects changes in the ST/T sections of the ECG is developed for the early diagnosis of acute myocardial ischemia. For this purpose, a database of real ECG recordings was constituted, containing records from 75 patients presenting symptoms of chest pain who underwent elective percutaneous coronary intervention (PCI). 12-lead ECGs of the patients were recorded before and during the PCI procedure. Two ECG epochs are analyzed for each patient: the pre-inflation ECG, acquired before any catheter insertion, and the occlusion ECG, acquired during balloon inflation. By using the pre-inflation and occlusion recordings, ECG features that are critical in the detection of acute myocardial ischemia are identified, and the most discriminative features are extracted. A classification technique based on the support vector machine (SVM) approach, operating with linear and radial basis function (RBF) kernels, is developed to detect ischemic events using ST/T-derived joint features from the non-ischemic and ischemic states of the patients. The dataset is randomly divided into training and testing sets, and the training set is used to optimize the SVM hyperparameters by grid search and 10-fold cross-validation. SVMs are designed specifically for each patient by tuning the kernel parameters in order to obtain the optimal classification performance. Applying the developed classification technique to real ECG recordings shows that the proposed technique provides highly reliable detection of the anomalies in ECG signals. Furthermore, to develop a detection technique that can be used in the absence of an ECG recording obtained during the healthy stage, the detection of acute myocardial ischemia based on ECG recordings obtained during ischemia alone is also investigated. For this purpose, a Gaussian mixture model (GMM) is used to represent the joint pdf of the most discriminating ECG features of myocardial ischemia. Then, a Neyman-Pearson type of approach is developed to provide detection of outliers that correspond to acute myocardial ischemia. The Neyman-Pearson decision strategy is applied by computing the average log-likelihood values of ECG segments and comparing them with a range of threshold values. For different discrimination thresholds and numbers of ECG segments, the probability of detection and probability of false alarm are computed, and the corresponding ROC curves are obtained. The results indicate that an increasing number of ECG segments provides higher performance for the GMM-based classification. Moreover, the comparison between the performances of the SVM- and GMM-based classification showed that the SVM provides higher classification performance over the ECG recordings of a considerable number of patients.
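
As a rough sketch of the two detection routes described above, the snippet below pairs a grid-searched SVM (linear and RBF kernels, 10-fold cross-validation) with a GMM whose average log-likelihood is thresholded in a Neyman-Pearson fashion. The data, feature dimensions and threshold quantile are invented stand-ins, not the paper's; scikit-learn is assumed.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic stand-ins for ST/T-derived features of ECG segments
X_normal = rng.normal(0.0, 1.0, size=(200, 2))    # pre-inflation state
X_ischemic = rng.normal(1.5, 1.2, size=(200, 2))  # balloon-occlusion state

# Supervised route: per-patient SVM with grid-searched hyperparameters
X = np.vstack([X_normal, X_ischemic])
y = np.array([0] * 200 + [1] * 200)
grid = GridSearchCV(
    SVC(),
    {"kernel": ["linear", "rbf"], "C": [0.1, 1, 10], "gamma": ["scale", 0.1, 1]},
    cv=10,                                  # 10-fold cross-validation
)
grid.fit(X, y)

# Unsupervised route: GMM of the ischemic feature pdf, thresholded on the
# log-likelihood of a segment (Neyman-Pearson style: sweep the threshold
# to trade detection rate against false-alarm rate and trace an ROC)
gmm = GaussianMixture(n_components=2, random_state=0).fit(X_ischemic)
threshold = np.quantile(gmm.score_samples(X_ischemic), 0.05)  # illustrative
outliers = gmm.score_samples(X_normal) < threshold
print(grid.best_params_, f"flagged {outliers.mean():.0%} of normal segments")
```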

Keywords: ECG classification, Gaussian mixture model, Neyman–Pearson approach, support vector machine

Procedia PDF Downloads 150
15036 Towards Incorporating Context Awareness into Business Process Management

Authors: Xiaohui Zhao, Shahan Mafuz

Abstract:

Context-aware technologies provide system applications with awareness of environmental conditions, customer behaviour, object movements, etc. With such capability, system applications can adapt their responses intelligently to changing conditions. Concerning business operations, this promises that business processes can run more intelligently, adaptively and flexibly, and thereby improve customer experience, enhance the reliability of service delivery, or lower operational cost, making the business more competitive and sustainable. Aiming at realizing such context-aware business process management, this paper first explores its potential benefits and then identifies some gaps between current business process management support and what is expected. In addition, some preliminary solutions are discussed, covering context definition, rule-based process execution, run-time process evolution, etc. A framework is also presented, giving a conceptual architecture of a context-aware business process management system to guide system implementation.

Keywords: business process adaptation, business process evolution, business process modelling, context awareness

Procedia PDF Downloads 402
15035 Experience Report about the Inclusion of People with Disabilities in the Process of Testing an Accessible System for Learning Management

Authors: Marcos Devaner, Marcela Alves, Cledson Braga, Fabiano Alves, Wilton Bezerra

Abstract:

This article discusses the inclusion of people with disabilities in the process of testing an accessible system for distance education. The accessible system, team profile, and the methodologies and techniques covered in the testing process are presented. The testing process shown in this paper was designed from experience with users: it emerged from lessons learned in past experiences, and the end user is present at all stages of the tests. Lessons learned are also reported, showing how the team and its methods matured, resulting in a simple, productive and effective process.

Keywords: experience report, accessible systems, software testing, testing process, systems, e-learning

Procedia PDF Downloads 379
15034 Bayesian Flexibility Modelling of the Conditional Autoregressive Prior in a Disease Mapping Model

Authors: Davies Obaromi, Qin Yongsong, James Ndege, Azeez Adeboye, Akinwumi Odeyemi

Abstract:

The basic model usually used in disease mapping is the Besag, York and Mollie (BYM) model, which combines spatially structured and spatially unstructured priors as random effects. The Bayesian conditional autoregressive (CAR) model is a disease mapping method commonly used to smooth the relative risk of a disease, as in the BYM model. The CAR model, which is usually assigned as a prior to one of the spatial random effects in the BYM model, successfully uses information from adjacent sites to improve estimates for individual sites. To our knowledge, the CAR prior has some unrealistic or counter-intuitive consequences on the posterior covariance matrix of the spatial random effects. In the conventional BYM model, the spatially structured and the unstructured random components cannot be identified independently, which challenges the prior definitions for the hyperparameters of the two random effects. Therefore, the main objective of this study is to construct and utilize an extended Bayesian spatial CAR model for studying tuberculosis patterns in the Eastern Cape Province of South Africa, and to compare its flexibility with some existing CAR models. The results of the study revealed the flexibility and robustness of this alternative extended CAR model relative to the commonly used CAR models, compared using the deviance information criterion. The extended Bayesian spatial CAR model proved to be a useful and robust tool for disease modeling and as a prior for the structured spatial random effects, because of the inclusion of an extra hyperparameter.
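
For readers unfamiliar with the CAR construction discussed above, the following minimal sketch builds a proper CAR prior precision matrix Q = tau(D - rho*W) for a toy adjacency structure and draws a BYM-style log relative risk as the sum of a structured and an unstructured component. The adjacency, tau, rho and the unstructured standard deviation are illustrative assumptions, not values from the study.

```python
import numpy as np

# Toy adjacency for 5 regions in a row (region i borders i-1 and i+1)
W = np.zeros((5, 5))
for i in range(4):
    W[i, i + 1] = W[i + 1, i] = 1

D = np.diag(W.sum(axis=1))      # number of neighbours of each region
tau, rho = 2.0, 0.9             # precision and spatial-dependence parameters
Q = tau * (D - rho * W)         # proper CAR precision (rho < 1 keeps Q invertible;
                                # the intrinsic CAR with rho = 1 is singular)

# BYM decomposes the log relative risk into a spatially structured CAR
# component u and an unstructured i.i.d. Gaussian component v:
rng = np.random.default_rng(1)
u = rng.multivariate_normal(np.zeros(5), np.linalg.inv(Q))  # structured
v = rng.normal(0.0, 0.5, size=5)                            # unstructured
log_relative_risk = u + v
print(np.round(log_relative_risk, 3))
```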

Keywords: Besag2, CAR models, disease mapping, INLA, spatial models

Procedia PDF Downloads 266
15033 Design of an Automated Deep Learning Recurrent Neural Networks System Integrated with IoT for Anomaly Detection in Residential Electric Vehicle Charging in Smart Cities

Authors: Wanchalerm Patanacharoenwong, Panaya Sudta, Prachya Bumrungkun

Abstract:

The paper focuses on the development of a system that combines Internet of Things (IoT) technologies and deep learning algorithms for anomaly detection in residential electric vehicle (EV) charging in smart cities. With the increasing number of EVs, ensuring efficient and reliable charging systems has become crucial. The aim of this research is to develop an integrated IoT and deep learning system for detecting anomalies in residential EV charging and for enhancing EV load profiling and event detection in smart cities. IoT devices equipped with infrared cameras collect thermal images and household EV charging profiles from the database of the Thailand utility, subsequently transmitting this data to a cloud database for comprehensive analysis. The methodology includes advanced deep learning techniques, namely Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) algorithms, together with feature-based Gaussian mixture models for EV load profiling and event detection; the latter aids in identifying unique power consumption patterns among EV owners. The research findings demonstrate the effectiveness of the developed system in detecting anomalies and critical profiles in EV charging behavior. The system provides timely alarms to users regarding potential issues and categorizes the severity of detected problems based on a health index for each charging device. It also outperforms existing models in event detection accuracy. This research contributes to the field by showcasing the potential of integrating IoT and deep learning techniques in managing residential EV charging in smart cities, ensuring operational safety and efficiency while promoting sustainable energy management.
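
The abstract does not give the network architecture, so the sketch below shows one common way to realize LSTM-based anomaly detection on a charging profile: train a one-step-ahead forecaster and flag readings with a large prediction error. The synthetic series, window length and threshold quantile are assumptions for illustration; PyTorch is assumed.

```python
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    """Predict the next reading of a charging profile; large prediction
    errors are treated as anomalies."""
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, seq_len, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # one-step-ahead prediction

# Synthetic stand-in for a household EV charging profile (kW over time)
t = torch.linspace(0, 25.0, 500)
series = (torch.sin(t) + 1.5).unsqueeze(-1)     # normal periodic load

seq = 24                                        # window length (assumed)
X = torch.stack([series[i:i + seq] for i in range(len(series) - seq)])
y = series[seq:]

model = LSTMForecaster()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(50):                             # short full-batch training loop
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

# Anomaly score = absolute prediction error, thresholded at a high quantile
err = (model(X) - y).abs().detach().squeeze()
threshold = err.quantile(0.99)
print("flagged readings:", int((err > threshold).sum()))
```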

Keywords: cloud computing framework, recurrent neural networks, long short-term memory, Iot, EV charging, smart grids

Procedia PDF Downloads 49
15032 Development of a New Ecological Cleaning Process for Metal Sheets

Authors: L. M. López López, J. V. Montesdeoca Contreras, A. R. Cuji Fajardo, L. E. Garzón Muñoz, J. I. Fajardo Seminario

Abstract:

In this article, a new ecological cleaning process for the metal sheets used in household appliances was developed, using low-pressure cold plasma. In this context, the research analyzes the results of cleaning metal sheets with plasma and compares them with the pickling process, to determine the efficiency of each process and the level of contamination produced. Surface cleanliness was evaluated by measuring the contact angle with deionized water, diiodomethane and ethylene glycol, from which the surface free energy was calculated by means of the Fowkes and Wu theories. The results show that low-pressure cold plasma is very efficient, both in the cleaning process and in its environmental impact.
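
The contact-angle route to surface free energy can be made concrete with the Owens-Wendt (extended Fowkes) two-component fit over the three probe liquids named above. The liquid surface-tension components are typical literature values and the contact angles are hypothetical, so the numbers only illustrate the calculation, not the paper's measurements.

```python
import numpy as np

# Probe liquids: total, dispersive, polar surface tension (mN/m);
# typical literature values, assumed here rather than taken from the paper
liquids = {
    "water":           (72.8, 21.8, 51.0),
    "diiodomethane":   (50.8, 50.8, 0.0),
    "ethylene glycol": (48.0, 29.0, 19.0),
}
theta_deg = {"water": 65.0, "diiodomethane": 38.0,
             "ethylene glycol": 50.0}            # hypothetical measured angles

# Owens-Wendt: gL(1+cos t)/(2 sqrt(gLd)) = sqrt(gSd) + sqrt(gSp) sqrt(gLp/gLd)
x, yv = [], []
for name, (gL, gLd, gLp) in liquids.items():
    ct = np.cos(np.radians(theta_deg[name]))
    x.append(np.sqrt(gLp / gLd))
    yv.append(gL * (1 + ct) / (2 * np.sqrt(gLd)))

slope, intercept = np.polyfit(x, yv, 1)          # linear fit over the liquids
gSd, gSp = intercept**2, slope**2                # dispersive and polar parts
print(f"dispersive {gSd:.1f}, polar {gSp:.1f}, total {gSd + gSp:.1f} mN/m")
```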

Keywords: efficient use of plasma, ecological impact of plasma, metal sheet cleaning, plasma cleaning process

Procedia PDF Downloads 340
15031 Case-Based Reasoning Approach for Process Planning of Internal Thread Cold Extrusion

Authors: D. Zhang, H. Y. Du, G. W. Li, J. Zeng, D. W. Zuo, Y. P. You

Abstract:

To address the difficult issue of process selection, case-based reasoning technology is applied to a computer-aided process planning system for the cold form tapping of internal threads, on the basis of process similarity. A model is established based on an analysis of the process planning task. A case representation and a similarity computing method are given. A confidence degree is used to evaluate the retrieved case, and a rule-based reuse strategy is presented. The scheme is illustrated and verified by practical application; the case study shows that the design results obtained with the proposed method are effective.
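
A minimal sketch of the retrieval step: a weighted nearest-neighbour similarity with a confidence-gated reuse rule. The features, weights and threshold are invented for illustration and are not the paper's case representation.

```python
import numpy as np

def similarity(query, case, weights):
    """Weighted nearest-neighbour similarity between a new process-planning
    problem and a stored case (features scaled to [0, 1])."""
    q, c, w = map(np.asarray, (query, case, weights))
    return float(np.sum(w * (1.0 - np.abs(q - c))) / np.sum(w))

# Hypothetical features: thread diameter, pitch, material hardness (normalised)
case_base = {
    "case_A": ([0.30, 0.20, 0.55], "plan: pre-drill 8.7 mm, form tap M10"),
    "case_B": ([0.80, 0.60, 0.40], "plan: pre-drill 17.4 mm, form tap M20"),
}
weights = [0.5, 0.3, 0.2]
query = [0.32, 0.25, 0.50]

scored = {k: similarity(query, f, weights) for k, (f, _) in case_base.items()}
best = max(scored, key=scored.get)
# Confidence degree gates reuse: adapt the retrieved plan only if the best
# similarity is high enough, otherwise fall back to rule-based planning.
if scored[best] > 0.85:
    print("reuse", case_base[best][1], f"(confidence {scored[best]:.2f})")
```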

Keywords: case-based reasoning, internal thread, cold extrusion, process planning

Procedia PDF Downloads 495
15030 Concept Drifts Detection and Localisation in Process Mining

Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa

Abstract:

Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. While analyzing an event log, the state-of-the-art techniques available in process mining treat the operational process as a static (stationary) entity. This is often not the case, due to the possibility of a phenomenon called concept drift: during execution, the process can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. The work presented in this paper discusses the main aspects to consider when addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of the process, using features extracted by processing the traces in the process log. Our experimental results are promising for efficiently detecting and localizing concept drift in the context of the process mining research discipline.
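
As a minimal sketch of sudden-drift detection in the control-flow perspective (not the authors' exact feature set), the code below compares direct-follows frequencies in two adjacent windows of traces with a chi-square test and localizes the drift at the boundary with the smallest p-value. The window size, smoothing and toy log are assumptions.

```python
from collections import Counter
from scipy.stats import chi2_contingency

def follows(trace):
    """Direct-follows pairs of a trace, a simple control-flow feature."""
    return list(zip(trace, trace[1:]))

def drift_pvalue(window1, window2):
    """Small p-value suggests a sudden control-flow change at the boundary."""
    c1 = Counter(p for t in window1 for p in follows(t))
    c2 = Counter(p for t in window2 for p in follows(t))
    keys = sorted(set(c1) | set(c2))
    table = [[c1[k] + 1 for k in keys],       # +1 smoothing avoids zero cells
             [c2[k] + 1 for k in keys]]
    return chi2_contingency(table)[1]

log = [list("abcd")] * 50 + [list("abed")] * 50   # drift halfway through
w = 20                                            # window size (assumed)
pvals = [drift_pvalue(log[i - w:i], log[i:i + w])
         for i in range(w, len(log) - w)]
# Localise the drift at the boundary with the smallest p-value
print("drift near trace", w + min(range(len(pvals)), key=pvals.__getitem__))
```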

Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining

Procedia PDF Downloads 333
15029 Mixed Model Sequencing in Painting Production Line

Authors: Unchalee Inkampa, Tuanjai Somboonwiwat

Abstract:

The painting of automobiles and automobile parts is a continuous process based on electrodeposition paint (EDP). Through EDP, all workpieces are continuously sent to the painting process, which can be divided into two groups based on running time: Painting Room 1 and Painting Room 2. This leads to continuous operation. The problem that arises is workpieces waiting to enter a painting room; the release of work from EDP to the painting rooms is the major bottleneck. Therefore, this paper aims to develop a production sequencing method for the EDP-fed painting process. It applies fixed-rate launching for the painting rooms, earliest due date (EDD) sequencing for the EDP process, and pairwise interchange (swapping) to minimize machine waiting time. The results show that the developed method improved the painting line by reducing waiting time, achieving on-time delivery, meeting customer wants and improving the productivity of the painting unit.
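
A minimal sketch of the sequencing rules named above: EDD ordering followed by improvement via pairwise interchange against a total-waiting-time objective. The job data are hypothetical, and the single-line waiting model is a simplification of the two painting rooms.

```python
# Hypothetical jobs: (job id, processing time in painting, due date)
jobs = [("J1", 4, 9), ("J2", 2, 5), ("J3", 6, 16), ("J4", 3, 7)]

def total_wait(seq):
    """Total completion (waiting) time of a sequence on a single line."""
    t, total = 0, 0
    for _, p, _ in seq:
        t += p
        total += t
    return total

# Step 1: earliest due date (EDD) ordering for the EDP stage
seq = sorted(jobs, key=lambda j: j[2])

# Step 2: pairwise interchange - keep swapping adjacent jobs while the
# swap reduces total waiting time (fixed-rate launching into the rooms)
improved = True
while improved:
    improved = False
    for i in range(len(seq) - 1):
        cand = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]
        if total_wait(cand) < total_wait(seq):
            seq, improved = cand, True
print([j[0] for j in seq], total_wait(seq))
```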

Keywords: sequencing, mixed model lines, painting process, electrodeposition paint

Procedia PDF Downloads 408
15028 Trace Logo: A Notation for Representing Control-Flow of Operational Process

Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa

Abstract:

The process mining research discipline bridges the gap between data mining and business process modeling and analysis; it offers process-centric and end-to-end methods and techniques for analyzing the information on real-world processes recorded in operational event logs. In this paper, we propose a notation called a trace logo for graphically representing the control-flow perspective (order of execution of activities) of a process. A trace logo consists of a stack of activity names at each position: the size of an activity name indicates its frequency in the traces, and the total height of the position depicts its information content. A trace logo is created from a set of aligned traces generated using the Multiple Trace Alignment technique.
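
The position heights described above can be computed as in sequence logos: information content = log2(alphabet size) minus the positional entropy, with each activity's letter scaled by its frequency. A minimal sketch on a toy set of aligned traces ('-' marking alignment gaps), under the assumption that the paper uses this standard logo convention:

```python
import math
from collections import Counter

aligned = ["ab-cd", "abxcd", "ab-cd", "abycd"]   # toy aligned traces
alphabet = sorted({a for t in aligned for a in t if a != "-"})

for pos in range(len(aligned[0])):
    col = [t[pos] for t in aligned if t[pos] != "-"]
    freq = Counter(col)
    n = len(col)
    entropy = -sum(c / n * math.log2(c / n) for c in freq.values())
    info = math.log2(len(alphabet)) - entropy    # total height of the position
    # letter size ~ activity frequency times the position's information content
    sizes = {a: (c / n) * info for a, c in freq.items()}
    print(pos, round(info, 2), {a: round(s, 2) for a, s in sizes.items()})
```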

Keywords: consensus trace, process mining, multiple trace alignment, trace logo

Procedia PDF Downloads 338
15027 Axiomatic Design of Laser Beam Machining Process

Authors: Nikhil Deshpande, Rahul Mahajan

Abstract:

Laser Beam Machining (LBM) is a non-traditional machining process that has inherent problems like dross, striation, and Heat Affected Zone (HAZ) which reduce the quality of machining. In the present day scenario, these problems are controlled only by iteratively adjusting a large number of process parameters. This paper applies Axiomatic Design principles to design LBM process so as to eliminate the problem of dross and striation and minimize the effect of HAZ. Process parameters and their ranges are proposed to set-up the LBM process, execute the cut and finish the workpiece so as to obtain the best quality cut.

Keywords: laser beam machining, dross, striation, heat affected zone, axiomatic design

Procedia PDF Downloads 359
15026 Process Modeling of Electric Discharge Machining of Inconel 825 Using Artificial Neural Network

Authors: Himanshu Payal, Sachin Maheshwari, Pushpendra S. Bharti

Abstract:

Electrical discharge machining (EDM), a non-conventional machining process, finds wide application in shaping difficult-to-cut alloys. Process modeling of EDM is required to exploit the process to the fullest, yet it is a challenging task owing to the involvement of many electrical and non-electrical parameters. This work is an attempt to model the EDM process using an artificial neural network (ANN). Experiments were carried out on a die-sinking EDM machine with Inconel 825 as the work material. ANN modeling was performed using the experimental data, and the prediction ability of the trained network was verified experimentally. The results indicate that the ANN can predict the values of the performance measures of EDM satisfactorily.
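
The abstract does not specify the network topology, so the sketch below uses a small scikit-learn MLP as a stand-in ANN mapping hypothetical EDM parameters (peak current, pulse-on time, duty factor) to MRR and surface roughness, with a held-out verification split. The synthetic data merely mimic the workflow, not the paper's measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Hypothetical inputs: peak current (A), pulse-on time (us), duty factor
X = rng.uniform([4, 50, 0.3], [16, 200, 0.7], size=(60, 3))
# Synthetic stand-ins for the measured responses: MRR and roughness Ra
mrr = 0.8 * X[:, 0] + 0.02 * X[:, 1] + rng.normal(0, 0.4, 60)
ra = 0.3 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 0.2, 60)
Y = np.column_stack([mrr, ra])

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X[:50], Y[:50])            # train on the experimental runs
print(model.score(X[50:], Y[50:]))   # verify prediction on held-out runs
```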

Keywords: artificial neural network, EDM, metal removal rate, modeling, surface roughness

Procedia PDF Downloads 398
15025 Liver and Liver Lesion Segmentation From Abdominal CT Scans

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and liver lesions is regarded as a major primary step in the computer-aided diagnosis of liver diseases, and precise liver segmentation in abdominal CT images is one of the most important steps for the computer-aided diagnosis of liver pathology. In this paper, a semi-automated method based on mathematical morphology is presented for liver and liver lesion segmentation in medical image data. Our algorithm proceeds in two parts: in the first, we determine the region of interest by applying morphological filters to extract the liver; the second step detects the liver lesions. The proposed method is based on anatomical information and the mathematical morphology tools used in the image processing field. First, we improve the quality of the original image and of the image gradient by applying a spatial filter followed by morphological filters. The second step calculates the internal and external markers of the liver and hepatic lesions. Thereafter, we proceed to the segmentation of the liver and hepatic lesions by the watershed transform controlled by the markers. The validation of the developed algorithm is done using several images, and the obtained results show the good performance of our proposed algorithm.
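
A compact sketch of marker-controlled watershed segmentation in the spirit of the pipeline above (smoothing, gradient, internal/external markers, watershed), using scikit-image on a synthetic stand-in for a CT slice. The thresholds and the phantom are assumptions, not the paper's filters or data.

```python
import numpy as np
from skimage.filters import gaussian, sobel
from skimage.segmentation import watershed

# Synthetic stand-in for a CT slice: a bright "liver" disc with a darker lesion
img = np.zeros((128, 128))
yy, xx = np.mgrid[:128, :128]
img[(yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2] = 0.8     # liver
img[(yy - 70) ** 2 + (xx - 50) ** 2 < 8 ** 2] = 0.3      # lesion
img += np.random.default_rng(0).normal(0, 0.03, img.shape)

smoothed = gaussian(img, sigma=2)      # denoising (the paper uses spatial
gradient = sobel(smoothed)             # plus morphological filtering)

# Internal/external markers: confident foreground and background seeds
markers = np.zeros_like(img, dtype=int)
markers[smoothed < 0.1] = 1            # external marker (background)
markers[smoothed > 0.6] = 2            # internal marker (liver)
labels = watershed(gradient, markers)  # marker-controlled watershed
print("liver pixels:", int((labels == 2).sum()))
```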

Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm

Procedia PDF Downloads 439
15024 Bayesian Locally Approach for Spatial Modeling of Visceral Leishmaniasis Infection in Northern and Central Tunisia

Authors: Kais Ben-Ahmed, Mhamed Ali-El-Aroui

Abstract:

This paper develops a local generalized linear spatial model (LGLSM) to describe the spatial variation of visceral leishmaniasis (VL) infection risk in northern and central Tunisia. The response for each region is the number of affected children under five years of age, recorded from 1996 through 2006 by Tunisian pediatric departments and treated as Poisson county-level data. The model includes climatic factors, namely the average annual rainfall, extreme values of low temperatures in winter and high temperatures in summer to characterize the climate of each region through a continentality index, the pluviometric quotient of Emberger (Q2) to characterize bioclimatic regions, and a component for residual extra-Poisson variation. The statistical results show a progressive increase in the number of affected children in regions with a high continentality index and low mean yearly rainfall. On the other hand, an increase in the pluviometric quotient of Emberger contributed to a significant increase in the VL incidence rate. Compared with the original GLSM, the Bayesian local modeling is an improvement and gives a better approximation of the Tunisian VL risk. Following the Bayesian approach to inference, we use vague priors for all model parameters and the Markov chain Monte Carlo method.
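
As context for the model family, the sketch below fits a plain (non-spatial, non-local) Poisson regression with a log-expected-count offset to synthetic region data, the baseline that the Bayesian local GLSM extends with spatial random effects and vague priors fitted by MCMC. The covariates, coefficients and counts are invented; statsmodels is assumed.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 40                                    # number of regions
# Hypothetical standardised region covariates: continentality index,
# mean annual rainfall, Emberger pluviometric quotient Q2
X = rng.normal(size=(n, 3))
expected = rng.uniform(20, 80, n)         # expected counts per region
eta = 0.5 * X[:, 0] - 0.4 * X[:, 1] + 0.3 * X[:, 2]
y = rng.poisson(expected * np.exp(eta))   # simulated VL counts

model = sm.GLM(y, sm.add_constant(X),
               family=sm.families.Poisson(),
               offset=np.log(expected))   # log-expected counts as offset
print(model.fit().summary().tables[1])    # covariate effects on log risk
```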

Keywords: generalized linear spatial model, local model, extra-Poisson variation, continentality index, visceral leishmaniasis, Tunisia

Procedia PDF Downloads 387
15023 Programming Systems in Implementation of Process Safety at Chemical Process Industry

Authors: Maryam Shayan

Abstract:

Software systems have been used in chemical process industry safety operation and design to enhance their effectiveness. This paper gives a brief survey and analysis of the state of the art and of the impact of software systems on process safety. A study was carried out by interviewing staff responsible for process safety practices in the Iranian chemical process industry and by reviewing the literature on technology for process safety. This article investigates the functional and operational attributes of software systems for safety and attempts to classify the software according to its level of impact in the management hierarchy. The study contributes to a better understanding of the role of information and communication technology in process safety, the future trends, and possible gaps for research and development.

Keywords: software systems, chemical process industry, process safety, management hierarchy, information and communication technology

Procedia PDF Downloads 359
15022 Bias in the Estimation of Covariance Matrices and Optimality Criteria

Authors: Juan M. Rodriguez-Diaz

Abstract:

The precision of parameter estimators in the Gaussian linear model is traditionally accounted for by the variance-covariance matrix of the asymptotic distribution. However, this measure can underestimate the true variance, especially for small samples. Traditionally, optimal design theory pays attention to this variance through its relationship with the model's information matrix. For this reason it seems convenient, at least in some cases, to adapt the optimality criteria in order to obtain the best designs for the actual variance structure; otherwise, the loss in efficiency of the designs obtained with the traditional approach may be very important.
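
A small numerical illustration of the point above, under an assumed AR(1)-type correlation structure: the naive i.i.d. variance-covariance matrix of the OLS estimator can understate the true one, while the information-matrix (GLS) form accounts for the correlation. The design, rho and the D-criterion comparison are illustrative choices, not the paper's.

```python
import numpy as np

n = 10
x = np.linspace(0, 1, n)
X = np.column_stack([np.ones(n), x])          # simple linear regression design

# Exponentially decaying correlation between neighbouring observations
rho = 0.6
Sigma = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))

naive = np.linalg.inv(X.T @ X)                # covariance under i.i.d. errors
true_ols = naive @ X.T @ Sigma @ X @ naive    # actual covariance of OLS
gls = np.linalg.inv(X.T @ np.linalg.inv(Sigma) @ X)  # information-matrix based

# D-criterion (determinant of the covariance) under each assumption
for name, C in [("naive i.i.d.", naive), ("true OLS", true_ols), ("GLS", gls)]:
    print(f"{name:12s} det = {np.linalg.det(C):.2e}")
```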

Keywords: correlated observations, information matrix, optimality criteria, variance-covariance matrix

Procedia PDF Downloads 427
15021 Intelligent Process Data Mining for Monitoring for Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary in order to operate industrial processes safely and efficiently while producing good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on-line for the safe and fault-free operation of industrial processes. To this end, the proposed intelligent process data monitoring framework was evaluated on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. Moreover, this work reports the results of using linear and nonlinear techniques for the monitoring task. It is shown that the nonlinear technique produced more reliable monitoring results and outperforms the linear methods. The adoption of the qualitative monitoring model helps to reduce the sensitivity of the fault pattern to noise.
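
As a sketch of the linear-versus-nonlinear comparison above, the code below monitors the squared prediction error (reconstruction residual) of linear PCA and of RBF kernel PCA, with a control limit set from fault-free data. The simulated data, kernel width and the 99% limit are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
normal = rng.normal(size=(300, 5))                  # fault-free operation
fault = rng.normal(size=(50, 5)) @ np.diag([3, 1, 1, 1, 1]) + 2.0  # abnormal

def spe(model, data):
    """Squared prediction error (reconstruction residual) per sample."""
    recon = model.inverse_transform(model.transform(data))
    return ((data - recon) ** 2).sum(axis=1)

lin = PCA(n_components=2).fit(normal)
ker = KernelPCA(n_components=2, kernel="rbf", gamma=0.1,
                fit_inverse_transform=True).fit(normal)

for name, m in [("linear PCA", lin), ("kernel PCA", ker)]:
    limit = np.quantile(spe(m, normal), 0.99)       # control limit
    rate = (spe(m, fault) > limit).mean()
    print(f"{name}: fault detection rate = {rate:.2f}")
```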

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 624
15020 A Process to Support Multidisciplinary Teams to Design Serious Games

Authors: Naza Djafarova, Tony Bates, Margaret Verkuyl, Leonora Zefi, Ozgur Turetken, Alex Ferworn, Mastrilli Paula, Daria Romaniuk, Kosha Bramesfeld, Anastasia Dimitriadou, Cheryl To

Abstract:

Designing serious games for education is a challenging and resource-intensive effort. If a well-designed process that balances pedagogical principles with game mechanics is in place, it can help to simplify the design process of serious games and increase efficiency. Multidisciplinary teams involved in designing serious games can benefit tremendously from such a process in their endeavours to develop and implement these games at undergraduate and graduate levels. This paper presentation will outline research results on identified gaps within existing processes and frameworks and present an adapted process that emerged from the research. The research methodology was based on a survey, semi-structured interviews and workshops for testing the adapted process for game design. Based on the findings, the authors propose a simple process for the pre-production stage of serious game design that may help guide multidisciplinary teams in their work. This process was used to facilitate team brainstorming, and is currently being tested to assess if multidisciplinary teams find value in using it in their process of designing serious games.

Keywords: serious game-design, multidisciplinary team, game design framework, learning games, multidisciplinary game design process

Procedia PDF Downloads 416
15019 The Bayesian Premium Under Entropy Loss

Authors: Farouk Metiri, Halim Zeghdoudi, Mohamed Riad Remita

Abstract:

Credibility theory is an experience rating technique in actuarial science; it can be seen as one of the quantitative tools that allow insurers to perform experience rating, that is, to adjust future premiums based on past experience. It is usually used in automobile insurance, workers' compensation premiums, and IBNR (incurred but not reported) claims, where credibility theory can be used to estimate the claim size. In this study, we focus on a popular tool in credibility theory, the Bayesian premium estimator, considering the Lindley distribution as the claim distribution. We derive this estimator under the entropy loss, which is asymmetric, and under the squared error loss, which is symmetric, with both informative and non-informative priors. In a purely Bayesian setting, the prior distribution represents the insurer's belief about the insured's risk level, updated after collection of the insured's data at the end of the period. However, the explicit form of the Bayesian premium, in the case when the prior is not a member of the exponential family, can be quite difficult to obtain, as it involves a number of integrations that are not analytically solvable. The paper finds a solution to this problem by deriving the estimator using a numerical approximation (the Lindley approximation), one of the suitable approximation methods for such problems: it approaches the ratio of the integrals as a whole and produces a single numerical result. A simulation study using the Monte Carlo method is then performed to evaluate this estimator, and the mean squared error technique is used to compare the Bayesian premium estimator under the above loss functions.
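
Given posterior samples of the risk parameter (a gamma posterior is assumed below purely for illustration, not the paper's Lindley setup), the Bayes premium under squared error loss is the posterior mean, while under a common form of the entropy (Stein-type) loss, L(theta, d) = d/theta - log(d/theta) - 1, it is the reciprocal of the posterior mean of 1/theta. A minimal sketch of the comparison the simulation study performs:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in posterior sample for the risk parameter theta, e.g. from MCMC;
# a gamma posterior is an assumption made here for illustration only
theta = rng.gamma(shape=8.0, scale=0.25, size=100_000)

# Bayes premium under squared error loss: the posterior mean
premium_se = theta.mean()
# Bayes premium under the entropy loss above: setting the derivative of the
# posterior expected loss to zero gives d = 1 / E[1/theta | data]
premium_ent = 1.0 / np.mean(1.0 / theta)

print(f"squared-error premium: {premium_se:.4f}")
print(f"entropy-loss premium:  {premium_ent:.4f}  (<= the former, by Jensen)")
```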

Keywords: bayesian estimator, credibility theory, entropy loss, monte carlo simulation

Procedia PDF Downloads 319
15018 Operational Advantages of Tungsten Inert Gas over Metal Inert Gas Welding Process

Authors: Emmanuel Ogundimu, Esther Akinlabi, Mutiu Erinosho

Abstract:

In this research, studies were done on the material characterization of type 304 austenitic stainless steel welds produced by the TIG (Tungsten Inert Gas) and MIG (Metal Inert Gas) welding processes. This research aimed to establish optimized process parameters that result in a defect-free weld joint. A homogeneous distribution of iron (Fe), chromium (Cr) and nickel (Ni) was observed at the welded joint of all six samples. The welded sample produced by the TIG welding process at a current of 170 A had the highest ultimate tensile strength (UTS) value of 621 MPa at the weld zone, while the welded sample produced by the MIG process at a welding current of 150 A had the lowest UTS value of 568 MPa. It was thereby established that the TIG welding process is more appropriate than the MIG welding process for the welding of type 304 austenitic stainless steel.

Keywords: microhardness, microstructure, tensile strength, shear stress, MIG welding, TIG welding, TIG-MIG welding

Procedia PDF Downloads 178
15017 Simulation of Laser Structuring by Three Dimensional Heat Transfer Model

Authors: Bassim Shaheen Bachy, Jörg Franke

Abstract:

In this study, a three-dimensional numerical heat transfer model is used to simulate the laser structuring of the polymer substrate material of Three-Dimensional Molded Interconnect Devices (3D MID), which are used in advanced multi-functional applications. A finite element method (FEM) transient thermal analysis is performed using APDL (ANSYS Parametric Design Language) provided by ANSYS. In this model, the surface heat source is modeled with a Gaussian distribution, and mixed boundary conditions consisting of convective and radiative heat transfer are considered. The model provides a full description of the temperature distribution and calculates the depth and width of the groove produced by material removal for different sets of laser parameters, such as laser power and laser speed. This study also includes an experimental procedure to study the effect of the laser parameters on the depth and width of the removed groove, as verification of the modeled results. Good agreement between the experimental and model results is achieved for a wide range of laser powers. It is found that the quality of the laser structuring process is affected by the laser scan speed and laser power; for high structuring quality, it is suggested to use a laser with high speed and moderate to high power.
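
The Gaussian surface heat source named above has a standard closed form for a TEM00 beam. A minimal sketch, with illustrative power and beam radius that are not taken from the paper:

```python
import numpy as np

def gaussian_flux(x, y, power, w0):
    """Surface heat flux (W/m^2) of a Gaussian laser beam centred at the
    origin; w0 is the 1/e^2 beam radius (standard TEM00 irradiance profile,
    peak value 2P / (pi w0^2))."""
    r2 = x**2 + y**2
    return 2.0 * power / (np.pi * w0**2) * np.exp(-2.0 * r2 / w0**2)

# Hypothetical parameters for illustration only
P, w0 = 10.0, 25e-6                    # 10 W beam, 25 um radius
x = np.linspace(-50e-6, 50e-6, 5)
print(gaussian_flux(x, 0.0, P, w0))    # flux profile across the groove width
# In the FEM model, this q(x, y) is applied as a moving boundary heat load,
# with convection and radiation on the remaining surfaces.
```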

Keywords: laser structuring, simulation, finite element analysis, thermal modeling

Procedia PDF Downloads 335
15016 Reduction of Energy Consumption of Distillation Process by Recovering the Heat from Exit Streams

Authors: Apichit Svang-Ariyaskul, Thanapat Chaireongsirikul, Pawit Tangviroon

Abstract:

Distillation consumes an enormous quantity of energy. This work proposes a process to recover the energy from the exit streams of three consecutive distillation columns. There are several novel techniques to recover heat within a distillation system; however, they require a complex control system. This work proposes a simpler technique: exchanging heat between streams without interrupting the internal distillation process, which might otherwise cause a serious control problem. The proposed process is executed by using a heat exchanger network with pinch analysis to maximize the process heat recovery. The test model is the distillation of butane, pentane, hexane and heptane, a common mixture in a petroleum refinery. The proposed process reduced the energy consumption of the hot and cold utilities by 29% and 27%, respectively, which is considered significant. Therefore, the recovery of heat from the exit streams of the distillation process is proved to be effective for energy saving.
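
The abstract names pinch analysis as the targeting tool; the sketch below implements the standard problem-table (heat cascade) algorithm that underlies it, computing minimum hot- and cold-utility targets for a chosen minimum approach temperature. The four streams and dTmin are hypothetical, not the paper's column data.

```python
def pinch_targets(streams, dt_min=10.0):
    """Problem-table algorithm: streams are (kind, T_supply, T_target, CP)."""
    shifted = []
    for kind, ts, tt, cp in streams:           # shift into a common scale
        d = -dt_min / 2 if kind == "hot" else dt_min / 2
        shifted.append((kind, ts + d, tt + d, cp))
    bounds = sorted({t for _, ts, tt, _ in shifted for t in (ts, tt)},
                    reverse=True)
    cascade, heat = [0.0], 0.0
    for hi, lo in zip(bounds, bounds[1:]):     # net heat in each interval
        net = 0.0
        for kind, ts, tt, cp in shifted:
            top, bot = max(ts, tt), min(ts, tt)
            if top >= hi and bot <= lo:        # stream spans the interval
                net += cp * (hi - lo) * (1 if kind == "hot" else -1)
        heat += net
        cascade.append(heat)
    q_hot = -min(cascade)                      # minimum hot utility (kW)
    q_cold = cascade[-1] + q_hot               # minimum cold utility (kW)
    return q_hot, q_cold

streams = [("hot", 180, 60, 3.0), ("hot", 150, 30, 1.5),   # hypothetical
           ("cold", 20, 135, 2.0), ("cold", 80, 140, 4.0)]
print(pinch_targets(streams, dt_min=10.0))
```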

Keywords: distillation, heat exchanger network, pinch analysis, chemical engineering

Procedia PDF Downloads 352
15015 BER Estimate of WCDMA Systems with MATLAB Simulation Model

Authors: Suyeb Ahmed Khan, Mahmood Mian

Abstract:

Simulation plays an important role during all phases of the design and engineering of communication systems, from the early stages of conceptual design through the various stages of implementation, testing, and fielding of the system. In the present paper, a simulation model has been constructed for the WCDMA system in order to evaluate its performance. This model describes multi-user effects and the calculation of the BER (bit error rate) in 3G mobile systems, using Simulink in MATLAB 7.1. The Gaussian approximation defines the multi-user effect on system performance. The BER has been analyzed by comparing the transmitted and received data.
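
The Gaussian approximation mentioned above has a well-known closed form for asynchronous DS-CDMA with random spreading sequences (Pursley's standard Gaussian approximation), which treats the interference from the other K-1 users as extra Gaussian noise. A sketch with an assumed spreading factor and Eb/N0, shown here for orientation rather than as the paper's Simulink model:

```python
import numpy as np
from scipy.stats import norm

def ber_sga(k_users, spread_n, ebn0_db):
    """Standard Gaussian approximation of the DS-CDMA bit error rate:
    BER = Q( [ (K-1)/(3N) + N0/(2Eb) ]^(-1/2) )."""
    ebn0 = 10 ** (ebn0_db / 10)
    var = (k_users - 1) / (3 * spread_n) + 1 / (2 * ebn0)
    return norm.sf(1 / np.sqrt(var))          # Q-function

# For K = 1 this reduces to the single-user BPSK result Q(sqrt(2 Eb/N0))
for k in (1, 10, 20, 30):
    print(f"K={k:2d} users: BER = {ber_sga(k, 64, 8.0):.2e}")
```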

Keywords: WCDMA, simulations, BER, MATLAB

Procedia PDF Downloads 576
15014 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios

Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu

Abstract:

Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is in fact more efficient to calculate the transform of the distribution function in the Fourier domain instead; inverting back to the real domain can be done in just one step and semi-analytically, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills, to the best of our knowledge, a niche in the literature for accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are shown to be significantly superior to Monte Carlo simulation for real-sized portfolios. The computational complexity is, by design, primarily driven by the number of factors instead of the number of obligors, as is the case for Monte Carlo simulation. The limitation of this method lies in the "curse of dimensionality" intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential application of this method has a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even other risk types than credit risk.
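
For orientation, the sketch below evaluates the loss distribution of a homogeneous portfolio under a one-factor Gaussian copula by conditioning on the systematic factor and integrating with Gauss-Hermite quadrature. This is a quadrature baseline in the same factor-copula framework, not the paper's COS method; the portfolio size, default probability and correlation are invented.

```python
import numpy as np
from scipy.stats import norm, binom

# One-factor Gaussian copula: conditional on the systematic factor Z,
# obligor defaults are i.i.d. with probability p(Z)
p, rho, n_obligors = 0.02, 0.2, 100

def cond_pd(z):
    return norm.cdf((norm.ppf(p) - np.sqrt(rho) * z) / np.sqrt(1 - rho))

# Gauss-Hermite nodes/weights for the standard normal factor
nodes, weights = np.polynomial.hermite_e.hermegauss(40)
weights = weights / np.sqrt(2 * np.pi)      # normalise to an N(0,1) average

losses = np.arange(n_obligors + 1)          # number of defaults
pmf = np.zeros(n_obligors + 1)
for z, w in zip(nodes, weights):
    pmf += w * binom.pmf(losses, n_obligors, cond_pd(z))

cdf = np.cumsum(pmf)
var_99 = losses[np.searchsorted(cdf, 0.99)]  # 99% Value-at-Risk (in defaults)
tail = losses >= var_99
es_99 = (pmf[tail] * losses[tail]).sum() / pmf[tail].sum()  # Expected Shortfall
print(f"VaR99 = {var_99} defaults, ES99 = {es_99:.2f}")
```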

Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method

Procedia PDF Downloads 151
15013 Improvement of Process Competitiveness Using Intelligent Reference Models

Authors: Julio Macedo

Abstract:

Several methodologies are now available to conceive the improvements of a process so that it becomes competitive, for example total quality, process reengineering, six sigma, and the define-measure-analyze-improve-control method. These improvements are of different natures and can be external to the process, represented by an optimization model or a discrete simulation model. In addition, the process stakeholders are several and have different desired performances for the process. Hence, the methodologies above lack a tool to aid in the conception of the required improvements. In order to fill this void, we suggest the use of intelligent reference models. A reference model is a set of qualitative differential equations and an objective function that minimizes the gap between the current and the desired performance indexes of the process. The reference models are intelligent in the sense that, when they receive the current state of the problematic process and the desired performance indexes, they generate the required improvements for that process. The reference models are fuzzy cognitive maps augmented with an objective function and trained using the improvements implemented by high-performance firms. Experiments with a group of students show that the reference models allow them to conceive more improvements than students who do not use these models.
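
A fuzzy cognitive map is essentially a signed weight matrix iterated to a fixed point through a squashing function; the sketch below shows such an iteration on four hypothetical process concepts. The weights, concepts and sigmoid steepness are invented for illustration, not taken from the trained reference models.

```python
import numpy as np

def run_fcm(W, x0, steps=30, lam=2.0):
    """Iterate a fuzzy cognitive map: concept activations are repeatedly
    pushed through the weighted causal links and squashed by a sigmoid."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = 1.0 / (1.0 + np.exp(-lam * (W @ x)))   # sigmoid squashing
    return x

# Hypothetical concepts: [training, automation, defect rate, competitiveness]
W = np.array([[ 0.0,  0.0,  0.0,  0.0],
              [ 0.3,  0.0,  0.0,  0.0],    # training encourages automation
              [-0.6, -0.5,  0.0,  0.0],    # both reduce the defect rate
              [ 0.2,  0.4, -0.7,  0.0]])   # low defects raise competitiveness
x0 = [0.8, 0.2, 0.6, 0.3]                  # current state of the process
print(np.round(run_fcm(W, x0), 2))         # steady state after improvements
```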

Keywords: continuous improvement, fuzzy cognitive maps, process competitiveness, qualitative simulation, system dynamics

Procedia PDF Downloads 75
15012 Quantification of Dispersion Effects in Arterial Spin Labelling Perfusion MRI

Authors: Rutej R. Mehta, Michael A. Chappell

Abstract:

Introduction: Arterial spin labelling (ASL) is an increasingly popular perfusion MRI technique, in which arterial blood water is magnetically labelled in the neck before flowing into the brain, providing a non-invasive measure of cerebral blood flow (CBF). The accuracy of ASL CBF measurements, however, is hampered by dispersion effects: the distortion of the ASL labelled bolus during its transit through the vasculature. In spite of this, the current recommended implementation of ASL, the white paper (Alsop et al., MRM, 73.1 (2015): 102-116), does not account for dispersion, which leads to the introduction of errors in CBF. Given that the transport time from the labelling region to the tissue, the arterial transit time (ATT), depends on the region of the brain and the condition of the patient, it is likely that these errors will also vary with the ATT. In this study, various dispersion models are assessed in comparison with the white paper (WP) formula for CBF quantification, enabling the errors introduced by the WP to be quantified. Additionally, this study examines the relationship between the errors associated with the WP and the ATT, and how this is influenced by dispersion. Methods: Data were simulated using the standard model for pseudo-continuous ASL, along with various dispersion models, and then quantified using the formula in the WP. The ATT was varied from 0.5s-1.3s, and the errors associated with noise artefacts were computed in order to define the concept of significant error. The instantaneous slope of the error was also computed as an indicator of the sensitivity of the error to fluctuations in ATT. Finally, a regression analysis was performed to obtain the mean error against ATT. Results: An error of 20.9% was found to be comparable to that introduced by typical measurement noise. The WP formula was shown to introduce errors exceeding 20.9% for ATTs beyond 1.25s even when dispersion effects were ignored. Using a Gaussian dispersion model, a mean error of 16% was introduced by using the WP, and a dispersion threshold of σ=0.6 was determined, beyond which the error was found to increase considerably with ATT. The mean error ranged from 44.5% to 73.5% when other physiologically plausible dispersion models were implemented, and the instantaneous slope varied from 35 to 75 as dispersion levels were varied. Conclusion: It has been shown that the WP quantification formula holds only within an ATT window of 0.5 to 1.25s, and that this window gets narrower as dispersion occurs. Provided that the dispersion levels fall below the threshold evaluated in this study, however, the WP can measure CBF with reasonable accuracy if dispersion is correctly modelled by the Gaussian model. However, substantial errors were observed with other common models for dispersion with dispersion levels similar to those that have been observed in literature.
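
For reference, the pCASL quantification formula recommended by the white paper (Alsop et al., 2015) can be written as a one-line function; note that no dispersion or transit-time term appears in it, which is the source of the ATT-dependent errors studied here. The signal values swept below are hypothetical; the default constants are the white paper's suggested values.

```python
import numpy as np

def wp_cbf(dM, si_pd, pld, tau=1.8, t1b=1.65, alpha=0.85, lam=0.9):
    """White-paper pCASL CBF (ml/100g/min), per Alsop et al. (2015):
    dM = control - label difference signal, si_pd = proton-density signal,
    pld = post-labelling delay (s), tau = label duration (s),
    t1b = blood T1 (s), alpha = labelling efficiency, lam = partition coeff."""
    return (6000 * lam * dM * np.exp(pld / t1b) /
            (2 * alpha * t1b * si_pd * (1 - np.exp(-tau / t1b))))

# Sweep the post-labelling delay at fixed signals to see how strongly the
# quantified CBF depends on timing that the formula cannot adapt to
for pld in (0.5, 0.9, 1.3):
    print(f"PLD {pld:.1f} s -> CBF {wp_cbf(dM=1.0, si_pd=100.0, pld=pld):.1f}")
```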

Keywords: arterial spin labelling, dispersion, MRI, perfusion

Procedia PDF Downloads 361