Search results for: Gaussian process priors
15268 Design of an Automated Deep Learning Recurrent Neural Networks System Integrated with IoT for Anomaly Detection in Residential Electric Vehicle Charging in Smart Cities
Authors: Wanchalerm Patanacharoenwong, Panaya Sudta, Prachya Bumrungkun
Abstract:
The paper focuses on the development of a system that combines Internet of Things (IoT) technologies and deep learning algorithms for anomaly detection in residential Electric Vehicle (EV) charging in smart cities. With the increasing number of EVs, ensuring efficient and reliable charging systems has become crucial. The aim of this research is to develop an integrated IoT and deep learning system for detecting anomalies in residential EV charging and enhancing EV load profiling and event detection in smart cities. IoT devices equipped with infrared cameras collect thermal images and household EV charging profiles from the database of the Thai utility, and this data is transmitted to a cloud database for comprehensive analysis. The methodology includes advanced deep learning techniques such as Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) algorithms, together with feature-based Gaussian mixture models for EV load profiling and event detection. The research findings demonstrate the effectiveness of the developed system in detecting anomalies and critical profiles in EV charging behavior. The system provides timely alarms to users regarding potential issues, categorizes the severity of detected problems based on a health index for each charging device, and outperforms existing models in event detection accuracy. The comprehensive method also aids in identifying unique power consumption patterns among EV owners. This research contributes to the field by showcasing the potential of integrating IoT and deep learning techniques in managing residential EV charging in smart cities: the developed system ensures operational safety and efficiency while promoting sustainable energy management.
Keywords: cloud computing framework, recurrent neural networks, long short-term memory, IoT, EV charging, smart grids
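As an illustration of the reconstruction-error idea behind LSTM-based anomaly detection described in this abstract, the sketch below trains an LSTM autoencoder on synthetic charging windows and flags windows whose reconstruction error exceeds a quantile threshold. The window length, network size, threshold rule and synthetic data are assumptions for illustration, not the authors' pipeline.

```python
# Minimal sketch (not the authors' system): an LSTM autoencoder that flags
# anomalous EV-charging windows by reconstruction error.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
window = 48                                       # e.g. 48 half-hourly samples per day (assumed)
normal = rng.normal(1.0, 0.05, (500, window))     # synthetic "normal" charging profiles
test = normal[:50].copy()
test[0, 20:30] += 2.0                             # inject one anomalous spike

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),                               # encoder
    tf.keras.layers.RepeatVector(window),
    tf.keras.layers.LSTM(32, return_sequences=True),        # decoder
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(1)),
])
model.compile(optimizer="adam", loss="mse")
model.fit(normal[..., None], normal[..., None], epochs=10, batch_size=32, verbose=0)

# threshold from the 99th percentile of training reconstruction error
train_recon = model.predict(normal[..., None], verbose=0)[..., 0]
threshold = np.quantile(np.mean((train_recon - normal) ** 2, axis=1), 0.99)

recon = model.predict(test[..., None], verbose=0)[..., 0]
err = np.mean((recon - test) ** 2, axis=1)        # per-window reconstruction error
print("anomalous windows:", np.where(err > threshold)[0])
```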
Procedia PDF Downloads 64
15267 Block Mining: Block Chain Enabled Process Mining Database
Authors: James Newman
Abstract:
Process mining is an emerging technology that serializes enterprise data into time-series event data. It has been used by many companies and has been the subject of a variety of research papers. However, the majority of current efforts have looked at how best to perform process mining from standard relational databases. This paper is a first pass at outlining a database custom-built for a minimum viable product of process mining. We present Block Miner, a blockchain protocol to store process mining data across a distributed network, and demonstrate the feasibility of storing process mining data on the blockchain. We present a proof of concept and show how the intersection of these two technologies helps to solve a variety of issues, including but not limited to ransomware attacks, tax documentation, and conflict resolution.
Keywords: blockchain, process mining, memory optimization, protocol
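The core storage idea, chaining blocks of event-log records by cryptographic hashes so that later tampering is detectable, can be sketched as follows. This is a generic hash-chain illustration, not the Block Miner protocol itself; the field names are assumed.

```python
# Illustrative sketch only (not the Block Miner protocol): hash-chaining blocks of
# process-mining events so that tampering with a recorded event invalidates the chain.
import hashlib, json, time

def block_hash(body: dict) -> str:
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, events: list) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "timestamp": time.time(),
             "events": events, "prev_hash": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
append_block(chain, [{"case": "order-1", "activity": "create", "ts": "2023-01-01T10:00"}])
append_block(chain, [{"case": "order-1", "activity": "approve", "ts": "2023-01-01T11:30"}])
print(verify(chain))                               # True
chain[0]["events"][0]["activity"] = "delete"
print(verify(chain))                               # False: tampering detected
```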
Procedia PDF Downloads 102
15266 Data-Mining Approach to Analyzing Industrial Process Information for Real-Time Monitoring
Authors: Seung-Lock Seo
Abstract:
This work presents a data-mining empirical monitoring scheme for industrial processes with partially unbalanced data. Measurement data from good operations are relatively easy to gather, but during unusual events or faults it is generally difficult to collect process information, and some noisy industrial process data are almost impossible to analyze. In such cases, noise filtering techniques can be used to enhance process monitoring performance on a real-time basis. In addition, pre-processing of raw process data helps to eliminate unwanted variation in industrial process data. In this work, the performance of various monitoring schemes was tested and demonstrated on discrete batch process data. The results showed that monitoring performance improved significantly in terms of the monitoring success rate for the given process faults.
Keywords: data mining, process data, monitoring, safety, industrial processes
Procedia PDF Downloads 400
15265 Liver and Liver Lesion Segmentation From Abdominal CT Scans
Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid
Abstract:
The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and liver lesions is regarded as a major primary step in computer-aided diagnosis of liver diseases, and precise liver segmentation in abdominal CT images is one of the most important steps for the computer-aided diagnosis of liver pathology. In this paper, a semi-automated method for the segmentation of the liver and liver lesions in medical image data is presented, based on mathematical morphology. Our algorithm proceeds in two parts: in the first, we determine the region of interest by applying morphological filters to extract the liver; the second step detects the liver lesions. For this task, we propose a new method developed for the semi-automatic segmentation of the liver and hepatic lesions, based on anatomical information and mathematical morphology tools from the image processing field. First, we improve the quality of the original image and the image gradient by applying a spatial filter followed by morphological filters. We then calculate the internal and external markers of the liver and hepatic lesions, and proceed to segment the liver and hepatic lesions with the watershed transform controlled by markers. The developed algorithm is validated on several images, and the obtained results show the good performance of the proposed algorithm.
Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm
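The marker-controlled watershed step described above can be illustrated with a small sketch on a synthetic 2-D image (not CT data); the way the internal and external markers are derived here from simple thresholds is an assumption, whereas the paper derives them from anatomical information and morphological filtering.

```python
# Hedged sketch of the general marker-controlled watershed idea on a synthetic image.
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

# synthetic "organ" image: a bright disc with a darker lesion inside, plus noise
y, x = np.mgrid[0:200, 0:200]
image = np.where((x - 100) ** 2 + (y - 100) ** 2 < 70 ** 2, 0.8, 0.1)
image += np.where((x - 120) ** 2 + (y - 90) ** 2 < 15 ** 2, -0.4, 0.0)
image += np.random.default_rng(0).normal(0, 0.02, image.shape)

gradient = sobel(image)                     # gradient magnitude used as the relief

# internal/external markers (here from simple thresholds; illustrative only)
markers = np.zeros_like(image, dtype=int)
markers[image > 0.6] = 1                    # inside the organ
markers[image < 0.15] = 2                   # background
labels = watershed(gradient, markers)       # flood the relief from the markers
print("segmented foreground pixels:", int((labels == 1).sum()))
```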
Procedia PDF Downloads 451
15264 Bayesian Locally Approach for Spatial Modeling of Visceral Leishmaniasis Infection in Northern and Central Tunisia
Authors: Kais Ben-Ahmed, Mhamed Ali-El-Aroui
Abstract:
This paper develops a Local Generalized Linear Spatial Model (LGLSM) to describe the spatial variation of Visceral Leishmaniasis (VL) infection risk in northern and central Tunisia. The response from each region is the number of affected children under five years of age recorded from 1996 through 2006 by Tunisian pediatric departments, treated as Poisson county-level data. The model includes climatic factors, namely averages of annual rainfall, extreme values of low winter temperatures and high summer temperatures to characterize the climate of each region through a continentality index, the pluviometric quotient of Emberger (Q2) to characterize bioclimatic regions, and a component for residual extra-Poisson variation. The statistical results show a progressive increase in the number of affected children in regions with a high continentality index and low mean yearly rainfall. On the other hand, an increase in the pluviometric quotient of Emberger contributed to a significant increase in the VL incidence rate. When compared with the original GLSM, Bayesian local modeling is an improvement and gives a better approximation of the Tunisian VL risk estimate. For the Bayesian inference, vague priors are used for all model parameters together with the Markov chain Monte Carlo method.
Keywords: generalized linear spatial model, local model, extra-Poisson variation, continentality index, visceral leishmaniasis, Tunisia
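A minimal sketch of the Bayesian core of such a model, Poisson regression on climatic covariates with vague priors fitted by a random-walk Metropolis sampler, is given below. The data are simulated and the spatial random effects of the LGLSM are omitted.

```python
# Sketch: Bayesian Poisson regression with vague priors via random-walk Metropolis.
# Covariates and counts are simulated; this is not the authors' full LGLSM.
import numpy as np

rng = np.random.default_rng(1)
n = 40
X = np.column_stack([np.ones(n),
                     rng.normal(0, 1, n),       # e.g. standardized continentality index
                     rng.normal(0, 1, n)])      # e.g. standardized annual rainfall
beta_true = np.array([1.0, 0.6, -0.4])
y = rng.poisson(np.exp(X @ beta_true))

def log_post(beta):
    eta = X @ beta
    loglik = np.sum(y * eta - np.exp(eta))       # Poisson log-likelihood (up to a constant)
    logprior = -0.5 * np.sum(beta ** 2) / 100.0  # vague N(0, 10^2) priors
    return loglik + logprior

beta, samples = np.zeros(3), []
for it in range(20000):
    prop = beta + rng.normal(0, 0.05, 3)         # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(beta):
        beta = prop
    if it >= 5000:                               # discard burn-in
        samples.append(beta.copy())

print("posterior means:", np.mean(samples, axis=0), "true:", beta_true)
```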
Procedia PDF Downloads 397
15263 Multivariate Statistical Process Monitoring of Base Metal Flotation Plant Using Dissimilarity Scale-Based Singular Spectrum Analysis
Authors: Syamala Krishnannair
Abstract:
A multivariate statistical process monitoring methodology using dissimilarity scale-based singular spectrum analysis (SSA) is proposed for the detection and diagnosis of process faults in a base metal flotation plant. Process faults are detected based on the multi-level decomposition of process signals by SSA using the dissimilarity structure of the process data and the subsequent monitoring of the multiscale signals using a unified monitoring index which combines T² with SPE. Contribution plots are used to identify the root causes of the process faults. The overall results indicated that the proposed technique outperformed conventional multivariate techniques in the detection and diagnosis of process faults in the flotation plant.
Keywords: fault detection, fault diagnosis, process monitoring, dissimilarity scale
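For reference, the classical PCA-based T² and SPE statistics that the combined monitoring index builds on can be computed as in the sketch below; this uses plain PCA rather than the dissimilarity scale-based SSA decomposition proposed in the paper, and the data and control limits are illustrative.

```python
# Sketch of PCA-based T2 and SPE (Q) monitoring statistics with empirical limits.
import numpy as np

rng = np.random.default_rng(2)
X_train = rng.normal(0, 1, (500, 6)) @ rng.normal(0, 1, (6, 6))   # correlated "normal" data
mu, sigma = X_train.mean(0), X_train.std(0)
Z = (X_train - mu) / sigma

U, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 3                                         # retained principal components
P = Vt[:k].T                                  # loadings
lam = (S[:k] ** 2) / (len(Z) - 1)             # PC variances

def monitor(x):
    z = (x - mu) / sigma
    t = z @ P                                 # scores
    T2 = np.sum(t ** 2 / lam)                 # Hotelling's T2 in the PC subspace
    SPE = np.sum((z - t @ P.T) ** 2)          # squared prediction error (residual space)
    return T2, SPE

# 99% control limits estimated empirically from the training data
stats = np.array([monitor(x) for x in X_train])
T2_lim, SPE_lim = np.quantile(stats, 0.99, axis=0)

fault = X_train[0] + np.array([0, 0, 4, 0, 0, 0]) * sigma          # simulated sensor fault
T2, SPE = monitor(fault)
print(f"T2={T2:.1f} (limit {T2_lim:.1f}), SPE={SPE:.1f} (limit {SPE_lim:.1f})")
```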
Procedia PDF Downloads 209
15262 Bridging the Gap between Different Interfaces for Business Process Modeling
Authors: Katalina Grigorova, Kaloyan Mironov
Abstract:
The paper focuses on the benefits of business process modeling. Although this discipline has been developing for many years, there is still a need to create new opportunities to meet ever-increasing user needs. Because one of these needs relates to the conversion of business process models from one standard to another, the authors have developed a converter between the BPMN and EPC standards that uses workflow patterns as an intermediate tool. Nowadays there are many systems for business process modeling, and the variety of output formats is almost as large as the number of systems themselves. This diversity further hampers the conversion of models. The presented study is aimed at discussing problems due to differences in the output formats of various modeling environments.
Keywords: business process modeling, business process modeling standards, workflow patterns, converting models
Procedia PDF Downloads 584
15261 Bias in the Estimation of Covariance Matrices and Optimality Criteria
Authors: Juan M. Rodriguez-Diaz
Abstract:
The precision of parameter estimators in the Gaussian linear model is traditionally accounted for by the variance-covariance matrix of the asymptotic distribution. However, this measure can underestimate the true variance, especially for small samples. Traditionally, optimal design theory pays attention to this variance through its relationship with the model's information matrix. For this reason it seems convenient, at least in some cases, to adapt the optimality criteria in order to obtain the best designs for the actual variance structure; otherwise the loss in efficiency of the designs obtained with the traditional approach may be considerable.
Keywords: correlated observations, information matrix, optimality criteria, variance-covariance matrix
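The point can be illustrated with a short sketch: under correlated observations the information matrix becomes X'V⁻¹X, and candidate designs can be compared by the D-criterion under the actual covariance structure. The design points and the correlation structure below are assumptions for illustration.

```python
# Small sketch of the point made above: with correlated observations the
# information matrix is X' V^{-1} X, so candidate designs can be compared by the
# D-criterion under the actual covariance structure.
import numpy as np

def info_matrix(x, V):
    X = np.column_stack([np.ones_like(x), x])        # simple linear model E[y] = b0 + b1 x
    return X.T @ np.linalg.solve(V, X)

def D_crit(M):
    return np.linalg.det(M)

x_design = np.array([-1.0, -1.0, 1.0, 1.0])          # classical D-optimal points (iid case)
x_alt = np.array([-1.0, -0.3, 0.3, 1.0])             # more spread-out alternative

rho = 0.6                                            # assumed exponential correlation between runs
V = rho ** np.abs(np.subtract.outer(np.arange(4), np.arange(4)))

for name, x in [("iid-optimal design", x_design), ("alternative design", x_alt)]:
    print(name, "D =", round(D_crit(info_matrix(x, V)), 3))
```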
Procedia PDF Downloads 443
15260 The Bayesian Premium Under Entropy Loss
Authors: Farouk Metiri, Halim Zeghdoudi, Mohamed Riad Remita
Abstract:
Credibility theory is an experience rating technique in actuarial science, one of the quantitative tools that allow insurers to perform experience rating, that is, to adjust future premiums based on past experience. It is commonly used in automobile insurance, workers' compensation premiums, and IBNR (claims incurred but not yet reported to the insurer), where credibility theory can be used to estimate the claim size. In this study, we focus on a popular tool in credibility theory, the Bayesian premium estimator, considering the Lindley distribution as the claim distribution. We derive this estimator under the entropy loss, which is asymmetric, and the squared error loss, which is symmetric, with both informative and non-informative priors. In a purely Bayesian setting, the prior distribution represents the insurer's belief about the insured's risk level, updated after collection of the insured's data at the end of the period. However, the explicit form of the Bayesian premium when the prior is not a member of the exponential family can be quite difficult to obtain, as it involves a number of integrations that are not analytically solvable. The paper addresses this problem by deriving the estimator using a numerical approximation (the Lindley approximation), one of the suitable approximation methods for such problems: it approaches the ratio of the integrals as a whole and produces a single numerical result. A simulation study using the Monte Carlo method is then performed to evaluate the estimator, and the mean squared error is used to compare the Bayesian premium estimator under the above loss functions.
Keywords: Bayesian estimator, credibility theory, entropy loss, Monte Carlo simulation
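A hedged numerical illustration of the Bayes premium under the two losses (not the paper's Lindley-approximation formulas) is sketched below: the posterior over the Lindley parameter is evaluated on a grid, and the premium is the posterior mean of the risk premium under squared-error loss, or the reciprocal of the posterior mean of its inverse under the entropy loss. The gamma prior parameters and the simulated claims are assumptions.

```python
# Sketch: Bayes premium for Lindley-distributed claims by brute-force grid integration,
# under squared-error loss and under the entropy loss L(d, m) = d/m - ln(d/m) - 1.
import numpy as np

rng = np.random.default_rng(3)
theta_true, n = 1.5, 30
# Lindley(theta) claims simulated as a mixture of Exp(theta) and Gamma(2, theta)
mix = rng.uniform(size=n) < theta_true / (theta_true + 1)
claims = np.where(mix, rng.exponential(1 / theta_true, n), rng.gamma(2, 1 / theta_true, n))

a, b = 2.0, 1.0                                       # Gamma(a, b) prior on theta (assumed)
theta = np.linspace(1e-3, 10, 4000)
dtheta = theta[1] - theta[0]
loglik = (2 * n * np.log(theta) - n * np.log(1 + theta)
          + np.sum(np.log1p(claims)) - theta * claims.sum())   # Lindley log-likelihood
logpost = (a - 1) * np.log(theta) - b * theta + loglik
post = np.exp(logpost - logpost.max())
post /= post.sum() * dtheta                           # normalized posterior density

m = (theta + 2) / (theta * (theta + 1))               # Lindley risk premium E[X | theta]
premium_se = np.sum(m * post) * dtheta                # Bayes premium, squared-error loss
premium_entropy = 1.0 / (np.sum(post / m) * dtheta)   # Bayes premium, entropy loss
print(round(premium_se, 4), round(premium_entropy, 4))
```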
Procedia PDF Downloads 333
15259 Laboratory Investigation of Alkali-Surfactant-Alternate Gas (ASAG) Injection – a Novel EOR Process for a Light Oil Sandstone Reservoir
Authors: Vidit Mohan, Ashwin P. Ramesh, Anirudh Toshniwal
Abstract:
Alkali-Surfactant-Alternate-Gas (ASAG) injection, a novel EOR process, has the potential to improve displacement efficiency over Surfactant-Alternate-Gas (SAG) injection by addressing the problem of surfactant adsorption by clay minerals in the rock matrix. A detailed laboratory investigation of the ASAG injection process was carried out with encouraging results. To enhance recovery over the WAG injection process, SAG injection was first investigated at laboratory scale, but it yielded only marginal incremental displacement efficiency over the WAG process. On investigation, it was found that clay minerals in the rock matrix adsorbed the surfactants and were detrimental to the SAG process. Hence, ASAG injection was conceptualized using alkali as a clay stabilizer. The ASAG injection experiment with a surfactant concentration of 5000 ppm and an alkali concentration of 0.5 weight% yielded an incremental displacement efficiency of 5.42% over the WAG process. ASAG injection is a new process and has the potential to enhance the efficiency of the WAG/SAG injection process.
Keywords: alkali surfactant alternate gas (ASAG), surfactant alternate gas (SAG), laboratory investigation, EOR process
Procedia PDF Downloads 479
15258 An Evaluation on the Methodology of Manufacturing High Performance Organophilic Clay at the Most Efficient and Cost Effective Process
Authors: Siti Nur Izati Azmi, Zatil Afifah Omar, Kathi Swaran, Navin Kumar
Abstract:
Organophilic clays, also known as organoclays, are used as viscosifiers in oil-based drilling fluids. Most often, organophilic clays are produced from modified sodium- and calcium-based bentonite. Many studies and data show that hectorite-based organophilic clays provide the best yield and good fluid-loss properties in an oil-based drilling fluid, at a higher cost. In terms of the manufacturing process, the two common methods of manufacturing organophilic clays are a wet process and a dry process. The wet process is known to produce a better-performing product at a higher cost, while the dry process shortens the production time. Hence, the purpose of this study is to evaluate various formulations of an organophilic clay and their performance versus cost, as well as to determine the most efficient and cost-effective method of manufacturing organophilic clays.
Keywords: organophilic clay, viscosifier, wet process, dry process
Procedia PDF Downloads 226
15257 Metal-Oxide-Semiconductor-Only Process Corner Monitoring Circuit
Authors: Davit Mirzoyan, Ararat Khachatryan
Abstract:
A process corner monitoring circuit (PCMC) is presented in this work. The circuit generates a signal whose logical value depends on the process corner only. The signal can be used in both digital and analog circuits for testing and compensation of process variations (PV). The presented circuit uses only metal-oxide-semiconductor (MOS) transistors, which allows increasing its detection accuracy while decreasing power consumption and area. Due to its simplicity, the presented circuit can be easily modified to monitor parametric variations of only n-type or p-type MOS (NMOS and PMOS, respectively) transistors, resistors, as well as their combinations. Post-layout simulation results prove the correct functionality of the proposed circuit, i.e. its ability to monitor the process corner (equivalently, die-to-die variations) even in the presence of within-die variations.
Keywords: detection, monitoring, process corner, process variation
Procedia PDF Downloads 524
15256 Comprehensive Assessment of Energy Efficiency within the Production Process
Authors: S. Kreitlein, N. Eder, J. Franke
Abstract:
The importance of energy efficiency within the production process increases steadily. Unfortunately, so far no tools for a comprehensive assessment of energy efficiency within the production process exist. Therefore, the Institute for Factory Automation and Production Systems of the Friedrich-Alexander-University Erlangen-Nuremberg has developed two methods with the goal of achieving transparency and a quantitative assessment of energy efficiency: the Energy Efficiency Value (EEV) and the Energetic Process Efficiency (EPE). This paper describes the basics and state of the art as well as the developed approaches.
Keywords: energy efficiency, energy efficiency value, energetic process efficiency, production
Procedia PDF Downloads 733
15255 Towards Incorporating Context Awareness into Business Process Management
Authors: Xiaohui Zhao, Shahan Mafuz
Abstract:
Context-aware technologies provide system applications with awareness of environmental conditions, customer behaviour, object movements, etc. With such a capability, system applications can intelligently adapt their responses to changing conditions. For business operations, this promises that business processes can run more intelligently, adaptively and flexibly, and thereby improve customer experience, enhance the reliability of service delivery, or lower operational cost, making the business more competitive and sustainable. Aiming at realizing such context-aware business process management, this paper first explores its potential benefits and then identifies some gaps between current business process management support and what is expected. In addition, some preliminary solutions are discussed, covering context definition, rule-based process execution, run-time process evolution, etc. A framework is also presented to give a conceptual architecture of a context-aware business process management system to guide system implementation.
Keywords: business process adaptation, business process evolution, business process modelling, context awareness
Procedia PDF Downloads 412
15254 Experience Report about the Inclusion of People with Disabilities in the Process of Testing an Accessible System for Learning Management
Authors: Marcos Devaner, Marcela Alves, Cledson Braga, Fabiano Alves, Wilton Bezerra
Abstract:
This article discusses the inclusion of people with disabilities in the process of testing an accessible system for distance education. The accessible system, the team profile, and the methodologies and techniques covered in the testing process are presented. The testing process described in this paper was designed from experience with users; it emerged from lessons learned from past experiences, and the end user is present at all stages of the tests. Lessons learned are also reported, showing how the team and its methods matured, resulting in a simple, productive and effective process.
Keywords: experience report, accessible systems, software testing, testing process, systems, e-learning
Procedia PDF Downloads 396
15253 BER Estimate of WCDMA Systems with MATLAB Simulation Model
Authors: Suyeb Ahmed Khan, Mahmood Mian
Abstract:
Simulation plays an important role during all phases of the design and engineering of communication systems, from the early stages of conceptual design through the various stages of implementation, testing, and fielding of the system. In the present paper, a simulation model has been constructed for the WCDMA system in order to evaluate its performance. The model describes multi-user effects and the calculation of the BER (Bit Error Rate) in 3G mobile systems using Simulink in MATLAB 7.1. A Gaussian approximation defines the multi-user effect on system performance. The BER has been analyzed by comparing the transmitted and received data.
Keywords: WCDMA, simulations, BER, MATLAB
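For context, the textbook standard Gaussian approximation for the BER of an asynchronous DS-CDMA uplink treats the multiple-access interference of K−1 users as additional Gaussian noise; a sketch is given below. This illustrates the multi-user effect mentioned in the abstract, is not the authors' Simulink model, and the formula and parameter values should be checked against a standard reference.

```python
# Sketch of the textbook standard Gaussian approximation (SGA) for asynchronous
# DS-CDMA with K users, processing gain N and random spreading (assumed formula).
from math import erfc, sqrt

def q_func(x):                       # Gaussian tail probability Q(x)
    return 0.5 * erfc(x / sqrt(2.0))

def ber_sga(K, N, ebn0_db):
    ebn0 = 10 ** (ebn0_db / 10.0)
    # multiple-access interference treated as additional Gaussian noise
    sinr = 1.0 / ((K - 1) / (3.0 * N) + 1.0 / (2.0 * ebn0))
    return q_func(sqrt(sinr))

for K in (1, 10, 30):                # K = 1 reduces to the single-user BPSK BER
    print(K, "users:", [round(ber_sga(K, 64, snr), 6) for snr in (4, 8, 12)])
```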
Procedia PDF Downloads 592
15252 A Fourier Method for Risk Quantification and Allocation of Credit Portfolios
Authors: Xiaoyu Shen, Fang Fang, Chujun Qiu
Abstract:
Herewith we present a Fourier method for credit risk quantification and allocation in the factor-copula model framework. The key insight is that, compared to directly computing the cumulative distribution function of the portfolio loss via Monte Carlo simulation, it is in fact more efficient to calculate the transform of the distribution function in the Fourier domain; inverting back to the real domain can then be done in just one step and semi-analytically, thanks to the popular COS method (with some adjustments). We also show that the Euler risk allocation problem can be solved in the same way, since it can be transformed into the problem of evaluating a conditional cumulative distribution function. Once the conditional or unconditional cumulative distribution function is known, one can easily calculate various risk metrics. The proposed method not only fills a niche in the literature, to the best of our knowledge, of accurate numerical methods for risk allocation, but may also serve as a much faster alternative to Monte Carlo simulation for risk quantification in general. It can cope with various factor-copula model choices, which we demonstrate via examples of a two-factor Gaussian copula and a two-factor Gaussian-t hybrid copula. The fast error convergence is proved mathematically and then verified by numerical experiments, in which Value-at-Risk, Expected Shortfall, and conditional Expected Shortfall are taken as examples of commonly used risk metrics. The calculation speed and accuracy are tested to be significantly superior to MC simulation for real-sized portfolios. The computational complexity is, by design, primarily driven by the number of factors instead of the number of obligors, as is the case in Monte Carlo simulation. The limitation of this method lies in the "curse of dimension" that is intrinsic to multi-dimensional numerical integration, which, however, can be relaxed with the help of dimension reduction techniques and/or parallel computing, as we will demonstrate in a separate paper. The potential application of this method has a wide range: from credit derivatives pricing to economic capital calculation of the banking book, default risk charge and incremental risk charge computation of the trading book, and even to other risk types than credit risk.
Keywords: credit portfolio, risk allocation, factor copula model, the COS method, Fourier method
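The COS inversion step can be illustrated on a toy example in which the loss distribution is Gaussian, so the recovered CDF and the resulting VaR can be checked against the known answer; the factor-copula characteristic function of the paper is not reproduced here.

```python
# Toy sketch of the COS method: recover a CDF from a characteristic function by a
# Fourier-cosine expansion, then read off VaR. Gaussian loss used as a known check.
import numpy as np
from scipy.stats import norm

mu_L, sig_L = 0.0, 1.0
cf = lambda u: np.exp(1j * u * mu_L - 0.5 * (sig_L * u) ** 2)   # characteristic function

a, b, N = -10.0, 10.0, 256                    # truncation range and number of cosine terms
k = np.arange(N)
u = k * np.pi / (b - a)
Fk = 2.0 / (b - a) * np.real(cf(u) * np.exp(-1j * u * a))
Fk[0] *= 0.5                                  # first cosine coefficient weighted by one half

def cos_cdf(x):
    # integrate the cosine series analytically from a to x
    terms = Fk[1:] * (b - a) / (k[1:] * np.pi) * np.sin(k[1:] * np.pi * (x - a) / (b - a))
    return Fk[0] * (x - a) + np.sum(terms)

xs = np.linspace(-4, 4, 2001)
cdf = np.array([cos_cdf(x) for x in xs])
alpha = 0.99
var_cos = xs[np.searchsorted(cdf, alpha)]     # 99% VaR from the recovered CDF
print(round(var_cos, 3), round(norm.ppf(alpha), 3))   # ~2.33 for both
```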
Procedia PDF Downloads 166
15251 Quantification of Dispersion Effects in Arterial Spin Labelling Perfusion MRI
Authors: Rutej R. Mehta, Michael A. Chappell
Abstract:
Introduction: Arterial spin labelling (ASL) is an increasingly popular perfusion MRI technique, in which arterial blood water is magnetically labelled in the neck before flowing into the brain, providing a non-invasive measure of cerebral blood flow (CBF). The accuracy of ASL CBF measurements, however, is hampered by dispersion effects; the distortion of the ASL labelled bolus during its transit through the vasculature. In spite of this, the current recommended implementation of ASL – the white paper (Alsop et al., MRM, 73.1 (2015): 102-116) – does not account for dispersion, which leads to the introduction of errors in CBF. Given that the transport time from the labelling region to the tissue – the arterial transit time (ATT) – depends on the region of the brain and the condition of the patient, it is likely that these errors will also vary with the ATT. In this study, various dispersion models are assessed in comparison with the white paper (WP) formula for CBF quantification, enabling the errors introduced by the WP to be quantified. Additionally, this study examines the relationship between the errors associated with the WP and the ATT – and how this is influenced by dispersion.
Methods: Data were simulated using the standard model for pseudo-continuous ASL, along with various dispersion models, and then quantified using the formula in the WP. The ATT was varied from 0.5s-1.3s, and the errors associated with noise artefacts were computed in order to define the concept of significant error. The instantaneous slope of the error was also computed as an indicator of the sensitivity of the error with fluctuations in ATT. Finally, a regression analysis was performed to obtain the mean error against ATT.
Results: An error of 20.9% was found to be comparable to that introduced by typical measurement noise. The WP formula was shown to introduce errors exceeding 20.9% for ATTs beyond 1.25s even when dispersion effects were ignored. Using a Gaussian dispersion model, a mean error of 16% was introduced by using the WP, and a dispersion threshold of σ=0.6 was determined, beyond which the error was found to increase considerably with ATT. The mean error ranged from 44.5% to 73.5% when other physiologically plausible dispersion models were implemented, and the instantaneous slope varied from 35 to 75 as dispersion levels were varied.
Conclusion: It has been shown that the WP quantification formula holds only within an ATT window of 0.5 to 1.25s, and that this window gets narrower as dispersion occurs. Provided that the dispersion levels fall below the threshold evaluated in this study, however, the WP can measure CBF with reasonable accuracy if dispersion is correctly modelled by the Gaussian model. However, substantial errors were observed with other common models for dispersion with dispersion levels similar to those that have been observed in literature.
Keywords: arterial spin labelling, dispersion, MRI, perfusion
Procedia PDF Downloads 369
15250 Development of New Ecological Cleaning Process of Metal Sheets
Authors: L. M. López López, J. V. Montesdeoca Contreras, A. R. Cuji Fajardo, L. E. Garzón Muñoz, J. I. Fajardo Seminario
Abstract:
In this article, a new ecological cleaning process for the metal sheets used in household appliances was developed, using low-pressure cold plasma. In this context, the research consists of analyzing the results of cleaning metal sheets with plasma and comparing them with the pickling process, in order to determine the efficiency of each process and the level of contamination produced. Surface cleaning was evaluated by measuring the contact angle with deionized water, diiodomethane and ethylene glycol, and calculating the surface free energy by means of the Fowkes and Wu theories. The results show that low-pressure cold plasma is very efficient both in the cleaning process and in terms of environmental impact.
Keywords: efficient use of plasma, ecological impact of plasma, metal sheets cleaning means, plasma cleaning process
Procedia PDF Downloads 354
15249 Case-Based Reasoning Approach for Process Planning of Internal Thread Cold Extrusion
Authors: D. Zhang, H. Y. Du, G. W. Li, J. Zeng, D. W. Zuo, Y. P. You
Abstract:
To address the difficult issue of process selection, case-based reasoning technology is applied to a computer-aided process planning system for cold form tapping of internal threads, on the basis of process similarity. A model is established based on an analysis of the process planning task. The case representation and the similarity computation method are given, a confidence degree is used to evaluate the retrieved cases, and a rule-based reuse strategy is presented. The scheme is illustrated and verified by a practical application, and the case study shows that the design results obtained with the proposed method are effective.
Keywords: case-based reasoning, internal thread, cold extrusion, process planning
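The retrieve step, a weighted similarity over process features with a confidence threshold deciding whether the best case is reused, can be sketched as follows; the feature names, weights and threshold are assumptions for illustration and are not taken from the paper.

```python
# Minimal sketch of the retrieve step in case-based reasoning with a confidence threshold.
from dataclasses import dataclass

@dataclass
class Case:
    name: str
    features: dict          # numeric process attributes, already scaled to [0, 1]
    plan: str               # stored process plan

WEIGHTS = {"thread_diameter": 0.4, "pitch": 0.3, "material_hardness": 0.3}   # assumed

def similarity(query: dict, case: Case) -> float:
    return sum(w * (1.0 - abs(query[f] - case.features[f])) for f, w in WEIGHTS.items())

def retrieve(query: dict, base: list, confidence: float = 0.85):
    scored = sorted(((similarity(query, c), c) for c in base),
                    key=lambda t: t[0], reverse=True)
    best_score, best = scored[0]
    return (best, best_score) if best_score >= confidence else (None, best_score)

base = [
    Case("M8 tap, steel", {"thread_diameter": 0.4, "pitch": 0.3, "material_hardness": 0.6}, "plan A"),
    Case("M12 tap, alloy", {"thread_diameter": 0.6, "pitch": 0.5, "material_hardness": 0.8}, "plan B"),
]
case, score = retrieve({"thread_diameter": 0.42, "pitch": 0.32, "material_hardness": 0.62}, base)
print(case.name if case else "no confident match", round(score, 3))
```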
Procedia PDF Downloads 510
15248 Nonlinear Modelling of Sloshing Waves and Solitary Waves in Shallow Basins
Authors: Mohammad R. Jalali, Mohammad M. Jalali
Abstract:
The earliest theories of sloshing waves and solitary waves, based on potential theory idealisations and irrotational flow, have been extended to be applicable to more realistic domains. To this end, computational fluid dynamics (CFD) methods are widely used. Three-dimensional CFD methods, such as Navier-Stokes solvers with volume-of-fluid treatment of the free surface and Navier-Stokes solvers with mappings of the free surface, inherently impose high computational expense; therefore, considerable effort has gone into developing depth-averaged approaches. Examples of such approaches include the Green–Naghdi (GN) equations. In the Cartesian system, the GN velocity profile depends on the horizontal directions, the x-direction and y-direction. The effect of the vertical direction (z-direction) is also taken into consideration by applying a weighting function in the approximation. GN theory considers the effect of vertical acceleration and the consequent non-hydrostatic pressure. Moreover, in GN theory, the flow is rotational. The present study illustrates the application of the GN equations to the propagation of sloshing waves and solitary waves. For this purpose, the GN equations solver is verified for the benchmark tests of Gaussian hump sloshing and solitary wave propagation in shallow basins. Analysis of the free-surface sloshing of even harmonic components of an initial Gaussian hump demonstrates that the GN model gives predictions in satisfactory agreement with the linear analytical solutions. Discrepancies between the GN predictions and the linear analytical solutions arise from the effect of wave nonlinearities arising from the wave amplitude itself and wave-wave interactions. Numerically predicted solitary wave propagation indicates that the GN model produces simulations in good agreement with the analytical solution of the linearised wave theory. Comparison between the GN model numerical prediction and the result from perturbation analysis confirms that nonlinear interaction between a solitary wave and a solid wall is satisfactorily modelled. Moreover, solitary wave propagation at an angle to the x-axis and the interaction of solitary waves with each other are simulated to validate the developed model.
Keywords: Green–Naghdi equations, nonlinearity, numerical prediction, sloshing waves, solitary waves
Procedia PDF Downloads 284
15247 Concept Drifts Detection and Localisation in Process Mining
Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa
Abstract:
Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. When analyzing an event log, state-of-the-art process mining techniques treat the operational process as a static (stationary) entity. This is often not the case, due to the possibility of a phenomenon called concept drift: during execution, the process can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. The work presented in this paper discusses the main aspects to consider while addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of the process, using features extracted by processing the traces in the process log. Our experimental results are promising in the direction of efficiently detecting and localizing concept drift in the context of the process mining research discipline.
Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining
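One simple way to detect a sudden control-flow drift, in the spirit of the method described above though not with the authors' exact feature set, is to compare directly-follows-relation counts in adjacent windows of traces with a chi-square test, as sketched below; the window size and significance level are assumptions.

```python
# Illustrative sketch: localize sudden control-flow drift by testing whether the
# directly-follows-relation counts differ between two adjacent trace windows.
from collections import Counter
from scipy.stats import chi2_contingency

def df_counts(traces):
    c = Counter()
    for t in traces:
        c.update(zip(t, t[1:]))          # directly-follows pairs
    return c

def drift_points(log, window=50, alpha=0.01):
    points = []
    for i in range(window, len(log) - window, window):
        ref, cur = df_counts(log[i - window:i]), df_counts(log[i:i + window])
        keys = sorted(set(ref) | set(cur))
        table = [[ref.get(k, 0) + 1 for k in keys],      # +1 smoothing avoids zero columns
                 [cur.get(k, 0) + 1 for k in keys]]
        _, p, _, _ = chi2_contingency(table)
        if p < alpha:
            points.append(i)
    return points

# synthetic log: behaviour changes abruptly after trace 200
log = [["a", "b", "c", "d"]] * 200 + [["a", "c", "b", "d"]] * 200
print(drift_points(log))                 # e.g. [200]
```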
Procedia PDF Downloads 345
15246 Simulation of Laser Structuring by Three Dimensional Heat Transfer Model
Authors: Bassim Shaheen Bachy, Jörg Franke
Abstract:
In this study, a three-dimensional numerical heat transfer model has been used to simulate the laser structuring of the polymer substrate material in Three-Dimensional Molded Interconnect Devices (3D MID), which are used in advanced multi-functional applications. A finite element method (FEM) transient thermal analysis is performed using APDL (ANSYS Parametric Design Language) provided by ANSYS. In this model, the surface heat source is modeled with a Gaussian distribution, and mixed boundary conditions consisting of convection and radiation heat transfer are considered. The model provides a full description of the temperature distribution and calculates the depth and width of the groove produced by material removal at different sets of laser parameters such as laser power and laser speed. The study also includes an experimental procedure to study the effect of the laser parameters on the depth and width of the machined groove as verification of the modeled results. Good agreement between the experimental and model results is achieved for a wide range of laser powers. It is found that the quality of the laser structuring process is affected by the laser scan speed and laser power; for high structuring quality, it is suggested to use a high scan speed and moderate to high laser power.
Keywords: laser structuring, simulation, finite element analysis, thermal modeling
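The Gaussian surface heat source referred to above is commonly written as q(x, y) = (2P/πr₀²)·exp(−2r²/r₀²) for total power P and beam radius r₀; the sketch below evaluates this distribution and checks that it integrates back to the total power. The parameter values are illustrative and the APDL load definition is not reproduced.

```python
# Sketch of a Gaussian surface heat-source distribution (total power P, 1/e^2 radius r0).
import numpy as np

P = 10.0            # laser power, W (assumed)
r0 = 25e-6          # beam radius, m (assumed)

def heat_flux(x, y, x0=0.0, y0=0.0):
    """Surface heat flux q(x, y) in W/m^2 for a Gaussian beam centred at (x0, y0)."""
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return (2.0 * P / (np.pi * r0 ** 2)) * np.exp(-2.0 * r2 / r0 ** 2)

# check that the flux integrates back to (approximately) the total power
x = y = np.linspace(-4 * r0, 4 * r0, 801)
X, Y = np.meshgrid(x, y)
dx = x[1] - x[0]
print(round(float(np.sum(heat_flux(X, Y)) * dx * dx), 3))   # ~10.0 W
```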
Procedia PDF Downloads 348
15245 The Evaluation of the Performance of Different Filtering Approaches in Tracking Problem and the Effect of Noise Variance
Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri
Abstract:
The performance of different filtering approaches depends on the modeling of the dynamical system and on the algorithm structure. For modeling and smoothing the data, the evaluation of the posterior distribution in each filtering approach should be chosen carefully. In this paper, different filtering approaches such as the Kalman filter, EKF, UKF, EKS and the RTS smoother are simulated on trajectory tracking problems, and the accuracy and limitations of these approaches are explained. The model probability under the different filters is then compared, and finally the effect of the noise variance on the estimation is described with simulation results.
Keywords: Gaussian approximation, Kalman smoother, parameter estimation, noise variance
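A minimal linear Kalman filter on a constant-velocity tracking problem, showing how the assumed measurement-noise variance R affects the estimate, is sketched below; the nonlinear variants (EKF/UKF) and the RTS smoother compared in the paper are not reproduced.

```python
# Sketch: 1-D constant-velocity Kalman filter; vary the assumed measurement noise R.
import numpy as np

rng = np.random.default_rng(4)
dt, steps = 0.1, 200
F = np.array([[1, dt], [0, 1]])            # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                 # only position is measured
Q = 1e-3 * np.array([[dt ** 3 / 3, dt ** 2 / 2], [dt ** 2 / 2, dt]])

# simulate a true trajectory and noisy position measurements
true_R = 0.5
x_true = np.zeros((steps, 2)); x_true[0] = [0.0, 1.0]
for k in range(1, steps):
    x_true[k] = F @ x_true[k - 1]
z = x_true[:, 0] + rng.normal(0, np.sqrt(true_R), steps)

def kalman(z, R):
    x, P, out = np.zeros(2), np.eye(2), []
    for zk in z:
        x, P = F @ x, F @ P @ F.T + Q                    # predict
        S = H @ P @ H.T + R                              # innovation covariance
        K = P @ H.T / S                                  # Kalman gain (scalar innovation)
        x = x + (K * (zk - H @ x)).ravel()               # update state
        P = (np.eye(2) - K @ H) @ P                      # update covariance
        out.append(x.copy())
    return np.array(out)

for R in (0.05, 0.5, 5.0):                               # under-, correctly, over-stated noise
    est = kalman(z, R)
    rmse = np.sqrt(np.mean((est[:, 0] - x_true[:, 0]) ** 2))
    print(f"assumed R={R:>4}: position RMSE = {rmse:.3f}")
```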
Procedia PDF Downloads 439
15244 Mixed Model Sequencing in Painting Production Line
Authors: Unchalee Inkampa, Tuanjai Somboonwiwat
Abstract:
The painting of automobiles and automobile parts is a continuous process based on electrodeposition paint (EDP). Through EDP, all work pieces are continuously sent to the painting process, which can be divided into two groups based on running time: Painting Room 1 and Painting Room 2. This leads to continuous operation, and the problem that arises is work waiting to enter the painting rooms; the sequencing of work pieces from EDP to the painting rooms is a major problem. Therefore, this paper aims to develop a production sequencing method from EDP to the painting process. It applies fixed-rate launching for the painting rooms, earliest due date (EDD) sequencing for the EDP process, and pairwise interchange to minimize machine waiting time. The results show that the developed method reduced waiting time, improved on-time delivery and the meeting of customer requirements, and improved the productivity of the painting unit.
Keywords: sequencing, mixed model lines, painting process, electrodeposition paint
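The two sequencing ingredients named above, EDD ordering followed by pairwise interchange, can be sketched as follows; the jobs, processing times and due dates are made up for illustration, and the objective shown is total waiting time.

```python
# Sketch: earliest-due-date (EDD) sequencing followed by adjacent pairwise interchange,
# accepting any swap that reduces total waiting time.
jobs = [  # (job id, processing time, due date) - illustrative values
    ("body-1", 4, 10), ("part-7", 2, 6), ("body-2", 5, 18), ("part-3", 3, 7),
]

def total_waiting(seq):
    t, wait = 0, 0
    for _, p, _ in seq:
        wait += t          # time the job waited before starting
        t += p
    return wait

seq = sorted(jobs, key=lambda j: j[2])         # EDD: order by due date
improved = True
while improved:                                # pairwise interchange improvement
    improved = False
    for i in range(len(seq) - 1):
        swapped = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]
        if total_waiting(swapped) < total_waiting(seq):
            seq, improved = swapped, True

print([j[0] for j in seq], "total waiting =", total_waiting(seq))
```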
Procedia PDF Downloads 420
15243 Trace Logo: A Notation for Representing Control-Flow of Operational Process
Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa
Abstract:
The process mining research discipline bridges the gap between data mining and business process modeling and analysis; it offers process-centric and end-to-end methods and techniques for analyzing the information on real-world processes recorded in operational event logs. In this paper, we propose a notation called the trace logo for graphically representing the control-flow perspective (order of execution of activities) of a process. A trace logo consists of a stack of activity names at each position: the size of each activity name indicates its frequency in the traces, and the total height of the stack depicts the information content of the position. A trace logo is created from a set of aligned traces generated using the multiple trace alignment technique.
Keywords: consensus trace, process mining, multiple trace alignment, trace logo
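The quantities a trace logo is built from can be sketched directly: per-position activity frequencies over a set of already-aligned traces, letter heights proportional to frequency, and a column height given by the information content of the position (computed here, by assumption, as log₂ of the alphabet size minus the positional entropy, as in sequence logos).

```python
# Sketch: per-position activity frequencies, positional information content and
# letter heights for a trace logo, given already-aligned traces ('-' marks a gap).
import math
from collections import Counter

aligned = ["abcd", "abcd", "abed", "a-cd", "abcd"]     # toy aligned traces
alphabet = sorted({a for t in aligned for a in t if a != "-"})

for pos in range(len(aligned[0])):
    col = [t[pos] for t in aligned if t[pos] != "-"]
    freq = Counter(col)
    n = len(col)
    entropy = -sum((c / n) * math.log2(c / n) for c in freq.values())
    info = math.log2(len(alphabet)) - entropy          # information content of the position
    heights = {a: round(info * c / n, 2) for a, c in freq.items()}
    print(pos, heights)
```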
Procedia PDF Downloads 348
15242 Axiomatic Design of Laser Beam Machining Process
Authors: Nikhil Deshpande, Rahul Mahajan
Abstract:
Laser Beam Machining (LBM) is a non-traditional machining process that has inherent problems like dross, striation, and the Heat Affected Zone (HAZ), which reduce the quality of machining. In the present-day scenario, these problems are controlled only by iteratively adjusting a large number of process parameters. This paper applies Axiomatic Design principles to the design of the LBM process so as to eliminate the problems of dross and striation and minimize the effect of the HAZ. Process parameters and their ranges are proposed to set up the LBM process, execute the cut and finish the workpiece so as to obtain the best quality cut.
Keywords: laser beam machining, dross, striation, heat affected zone, axiomatic design
Procedia PDF Downloads 370
15241 Process Modeling of Electric Discharge Machining of Inconel 825 Using Artificial Neural Network
Authors: Himanshu Payal, Sachin Maheshwari, Pushpendra S. Bharti
Abstract:
Electrical discharge machining (EDM), a non-conventional machining process, finds wide application in shaping difficult-to-cut alloys. Process modeling of EDM is required to exploit the process to the fullest, but it is a challenging task owing to the involvement of many electrical and non-electrical parameters. This work is an attempt to model the EDM process using an artificial neural network (ANN). Experiments were carried out on a die-sinking EDM machine with Inconel 825 as the work material. ANN modeling has been performed using the experimental data, and the prediction ability of the trained network has been verified experimentally. The results indicate that the ANN can predict the values of the performance measures of EDM satisfactorily.
Keywords: artificial neural network, EDM, metal removal rate, modeling, surface roughness
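A small feed-forward network of the kind described, mapping EDM parameters to a performance measure, can be sketched with scikit-learn; the data are synthetic and the authors' architecture, inputs and experimental values are not reproduced.

```python
# Hedged sketch: an MLP regressor mapping EDM parameters to a performance measure.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 200
current = rng.uniform(2, 20, n)             # discharge current, A (assumed range)
t_on = rng.uniform(50, 400, n)              # pulse-on time, us (assumed range)
t_off = rng.uniform(20, 200, n)             # pulse-off time, us (assumed range)
X = np.column_stack([current, t_on, t_off])
# synthetic "material removal rate" with noise, only to exercise the model
mrr = 0.02 * current * np.log(t_on) / np.sqrt(t_off) + rng.normal(0, 0.01, n)

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), mrr)

test = np.array([[10.0, 200.0, 100.0]])
print("predicted MRR:", round(float(model.predict(scaler.transform(test))[0]), 4))
```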
Procedia PDF Downloads 412
15240 Programming Systems in Implementation of Process Safety at Chemical Process Industry
Authors: Maryam Shayan
Abstract:
Programming frameworks have been utilized in chemical process industry safety operation and configuration to enhance their effectiveness. This paper gives a brief survey and investigation of the state of the art and of the effects of programming frameworks on process safety. A study was carried out by interviewing staff responsible for process safety practices in the Iranian chemical process industry and by reviewing the literature on technology for process safety. The article investigates the functional and operational attributes of programming frameworks for safety and attempts to classify the software according to its level of impact in the management hierarchy. The study contributes to a better understanding of the role of information and communication technology in process safety, future trends and possible gaps for research and development.
Keywords: programming frameworks, chemical industry process, process safety, management hierarchy, information communication technology
Procedia PDF Downloads 372
15239 Overcoming Obstacles in UHT High-Protein Whey Beverages by Microparticulation Process: Scientific and Technological Aspects
Authors: Shahram Naghizadeh Raeisi, Ali Alghooneh, Seyed Jalal Razavi Zahedkolaei
Abstract:
Herein, a shelf-stable (no refrigeration required), UHT-processed, aseptically packaged whey protein drink was formulated using a new strategy in the microparticulation process. By applying thermal and two-dimensional mechanical treatments simultaneously, a modified protein (MWPC-80) was produced. The physical, thermal and thermodynamic properties of MWPC-80 were then assessed using particle size analysis, dynamic temperature sweep (DTS) and differential scanning calorimetry (DSC) tests. Finally, using MWPC-80, a new RTD beverage was formulated, and its shelf stability was assessed for three months at ambient temperature (25 °C). A non-isothermal dynamic temperature sweep was performed, and the results were analyzed by a combination of the classic rate equation, the Arrhenius equation and the time-temperature relationship. Overall, the results showed that the temperature dependency of the modified sample was significantly (P-value < 0.05) lower than that of the control sample containing WPC-80. The elastic modulus of the MWPC did not show any critical point at any of the processing stages, whereas the control sample showed two critical points, during the heating (82.5 °C) and cooling (71.10 °C) stages. The thermal properties of the samples (WPC-80 and MWPC-80) were assessed using DSC with a 4 °C/min heating rate over a 20-90 °C range. The results did not show any thermal peak in the MWPC DSC curve, which suggests high thermal resistance. On the other hand, the WPC-80 sample showed a significant thermal peak with thermodynamic properties of ΔG = 942.52 kJ/mol, ΔH = 857.04 kJ/mol and ΔS = -1.22 kJ/(mol·K). Dynamic light scattering showed average particle sizes of 0.7 µm and 15 nm for the MWPC-80 and WPC-80 samples, respectively; moreover, the particle size distributions of MWPC-80 and WPC-80 were Gaussian-Lorentzian and normal, respectively. After verification of the microparticulation process by the DTS, PSD and DSC analyses, a 10% whey protein beverage (10% w/w MWPC-80, 0.6% w/w vanilla flavoring agent, 0.1% masking flavor, 0.05% stevia natural sweetener and 0.25% citrate buffer) was formulated, and UHT treatment was performed at 137 °C for 4 s. The shelf-life study did not show any gelation or precipitation of the MWPC-80-containing beverage during three months of storage at ambient temperature, whereas the WPC-80-containing beverage showed significant precipitation and gelation after thermal processing, even at a 3% w/w concentration. Consumer awareness of the nutritional advantages of whey protein has increased the demand for this protein in different food systems, especially RTD beverages, and these results could make a significant difference in this industry.
Keywords: high protein whey beverage, microparticulation, two-dimensional mechanical treatments, thermodynamic properties
Procedia PDF Downloads 73