Search results for: Poisson process
5481 Zero Inflated Models for Overdispersed Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Zero inflated models are usually used in modeling count data with excess zeros, where the excess zeros may be structural zeros or zeros which occur by chance. These types of data are commonly found in various disciplines such as finance, insurance, biomedicine, econometrics, ecology, and the health sciences, including sexual health and dental epidemiology. The most popular zero inflated models used by many researchers are the zero inflated Poisson and zero inflated negative binomial models. In addition, zero inflated generalized Poisson and zero inflated double Poisson models are also discussed in some literature. Recently, the zero inflated inverse trinomial and zero inflated strict arcsine models have been advocated and shown to serve as alternative models for overdispersed count data caused by excessive zeros and unobserved heterogeneity. The purpose of this paper is to review some related literature and provide a variety of examples from different disciplines in the application of zero inflated models. Different model selection methods used in model comparison are discussed.
Keywords: Overdispersed count data, model selection methods, likelihood ratio, AIC, BIC.
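To make the mixture concrete, here is a minimal Python sketch (not from the paper; the values of λ and π are illustrative) of the zero inflated Poisson probability mass function, which places extra mass at zero and is overdispersed relative to a plain Poisson:

```python
import numpy as np
from scipy.stats import poisson

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson: with probability pi the count is a structural
    zero, otherwise it is drawn from Poisson(lam)."""
    k = np.asarray(k)
    base = (1.0 - pi) * poisson.pmf(k, lam)
    return np.where(k == 0, pi + base, base)

lam, pi = 3.0, 0.4           # illustrative parameter values
ks = np.arange(15)
p = zip_pmf(ks, lam, pi)
mean = (1 - pi) * lam
var = mean * (1 + pi * lam)  # variance exceeds the mean whenever pi > 0
print(p.sum(), mean, var)    # pmf sums to ~1; var > mean shows overdispersion
```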
5480 Dengue Disease Mapping with Standardized Morbidity Ratio and Poisson-gamma Model: An Analysis of Dengue Disease in Perak, Malaysia
Authors: N. A. Samat, S. H. Mohd Imam Ma’arof
Abstract:
Dengue disease is an infectious vector-borne viral disease that is commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including Malaysia. There is currently no available vaccine or chemotherapy for the prevention or treatment of dengue disease, so prevention and treatment of the disease depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies for diseases. The choice of statistical model used for relative risk estimation is important, as a good model will subsequently produce a good disease risk map. Therefore, the aim of this study is to estimate the relative risk for dengue disease based initially on the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and on one of the earliest applications of Bayesian methodology, the Poisson-gamma model. This paper begins by providing a review of the SMR method, which we then apply to dengue data of Perak, Malaysia. We then fit an extension of the SMR method, the Poisson-gamma model. Both results are displayed and compared using graphs, tables and maps. The results of the analysis show that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model has been demonstrated to overcome the problem of the SMR when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult and there is no possibility of allowing for spatial correlation between risks in adjacent areas. The drawbacks of this model have motivated many researchers to propose other alternative methods for estimating the risk.
Keywords: Dengue disease, Disease mapping, Standardized Morbidity Ratio, Poisson-gamma model, Relative risk.
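As a hedged illustration of the two estimators reviewed above (with made-up counts, not the Perak data), the SMR is the ratio of observed to expected cases, while the Poisson-gamma model shrinks the estimate toward the prior and stays well defined when a region records zero cases:

```python
import numpy as np

O = np.array([12, 0, 5, 30])          # hypothetical observed counts per region
E = np.array([8.0, 2.5, 6.0, 20.0])   # expected counts from reference rates

smr = O / E  # SMR is zero and uninformative when O = 0, unstable for small E

# Poisson-gamma: O_i ~ Poisson(theta_i * E_i) with theta_i ~ Gamma(a, b).
# Conjugacy gives the posterior mean relative risk (a + O_i) / (b + E_i),
# which is nonzero even when O_i = 0. Hyperparameters are illustrative.
a, b = 2.0, 2.0
rr_pg = (a + O) / (b + E)
print(smr, rr_pg)
```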
5479 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in the carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution.
5478 Analyzing the Factors Influencing Exclusive Breastfeeding Using the Generalized Poisson Regression Model
Authors: Cheika Jahangeer, Naushad Mamode Khan, Maleika Heenaye-Mamode Khan
Abstract:
Exclusive breastfeeding is the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding during the first six months of life is of fundamental importance because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, in developed countries, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. In this paper, we study the factors that influence exclusive breastfeeding and use the generalized Poisson regression model to analyze the practices of exclusive breastfeeding in Mauritius. We develop two sets of quasi-likelihood equations (QLE) to estimate the parameters.
Keywords: Exclusive breastfeeding, regression model, quasi-likelihood.
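For reference, a small sketch of Consul's generalized Poisson probability mass function, the count distribution underlying the regression model above (parameter values are illustrative; the paper's quasi-likelihood estimation is not reproduced here):

```python
from math import exp, lgamma, log

def gp_pmf(y, theta, delta):
    """Consul's generalized Poisson pmf; delta = 0 recovers the ordinary
    Poisson. Mean = theta/(1-delta), variance = theta/(1-delta)**3, so
    delta > 0 yields overdispersion."""
    return exp(log(theta) + (y - 1) * log(theta + delta * y)
               - theta - delta * y - lgamma(y + 1))

# Sanity check: probabilities sum to ~1 for 0 <= delta < 1.
print(sum(gp_pmf(y, 2.0, 0.3) for y in range(200)))
```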
5477 Comparison of Stochastic Point Process Models of Rainfall in Singapore
Abstract:
Extensive rainfall disaggregation approaches have been developed and applied in climate change impact studies such as flood risk assessment and urban storm water management. In this study, five rainfall models capable of disaggregating daily rainfall data into hourly data were investigated for the rainfall record of Changi Airport, Singapore. The objectives of this study were (i) to study the temporal characteristics of hourly rainfall in Singapore, and (ii) to evaluate the performance of various disaggregation models. The models investigated were: (i) the Rectangular Pulse Poisson Model (RPPM), (ii) the Bartlett-Lewis Rectangular Pulse Model (BLRPM), (iii) the Bartlett-Lewis model with 2 cell types (BL2C), (iv) the Bartlett-Lewis Rectangular model with cell depth distribution dependent on duration (BLRD), and (v) the Neyman-Scott Rectangular Pulse Model (NSRPM). All of these models were fitted using hourly rainfall data from 1980 to 2005 obtained from the Changi meteorological station. The study results indicated that a weighting scheme with weights inversely proportional to variance delivers more accurate outputs for fitting rainfall patterns in tropical areas, and that the BLRPM performed relatively better than the other disaggregation models.
Keywords: Rainfall disaggregation, statistical properties, Poisson process, Bartlett-Lewis model, Neyman-Scott model.
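The rectangular pulse Poisson model, the simplest of the five listed, can be sketched in a few lines: storm origins arrive as a Poisson process, and each storm contributes a rectangular pulse of exponential duration and intensity, aggregated to hourly depths (all parameter values below are illustrative assumptions, not the fitted Changi values):

```python
import numpy as np

rng = np.random.default_rng(0)

def rectangular_pulse_poisson(lam, mean_dur, mean_int, horizon_h):
    """Simulate hourly rainfall depths from a rectangular-pulse Poisson model:
    storm origins follow a Poisson process with rate lam per hour; each storm
    is a rectangular pulse with exponential duration (h) and intensity (mm/h)."""
    n = rng.poisson(lam * horizon_h)
    starts = rng.uniform(0, horizon_h, n)
    durs = rng.exponential(mean_dur, n)
    ints = rng.exponential(mean_int, n)
    hourly = np.zeros(horizon_h)
    for s, d, i in zip(starts, durs, ints):
        a, b = int(s), min(int(np.ceil(s + d)), horizon_h)
        for h in range(a, b):
            overlap = min(h + 1, s + d) - max(h, s)  # pulse coverage of hour h
            hourly[h] += i * max(overlap, 0.0)
    return hourly

# One month of synthetic hourly rainfall; total depth in mm.
print(rectangular_pulse_poisson(0.02, 3.0, 5.0, 24 * 30).sum())
```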
5476 Modelling Dengue Fever (DF) and Dengue Haemorrhagic Fever (DHF) Outbreak Using Poisson and Negative Binomial Model
Authors: W. Y. Wan Fairos, W. H. Wan Azaki, L. Mohamad Alias, Y. Bee Wah
Abstract:
Dengue fever has become a major concern for health authorities all over the world, particularly in tropical countries, which are experiencing the most worrying outbreaks of dengue fever (DF) and dengue haemorrhagic fever (DHF). The DF and DHF epidemics have thus become the main causes of hospital admissions and deaths in Malaysia. This paper, therefore, attempts to examine the environmental factors that may influence the recent dengue outbreak. The aim of this study is twofold: firstly, to establish a statistical model describing the relationship between the number of dengue cases and a range of explanatory variables, and secondly, to identify the lag operator for the explanatory variables that affect dengue incidence the most. The explanatory variables include the level of cloud cover, percentage of relative humidity, amount of rainfall, maximum temperature, minimum temperature and wind speed. Poisson and negative binomial regression analyses were used in this study. The results of the analyses of 915 observations (daily data from July 2006 to Dec 2008) reveal that the climatic factors of daily temperature and wind speed significantly influence the incidence of dengue fever 2 and 3 weeks after their occurrence. The effect of humidity, on the other hand, appears to be significant only after 2 weeks.
Keywords: Dengue Fever, Dengue Hemorrhagic Fever, Negative Binomial Regression model, Poisson Regression model.
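A minimal sketch of the modeling step described above, using synthetic data in place of the Malaysian series and the statsmodels GLM interface (the 2- and 3-week lags mirror the study design; coefficients and AICs are purely illustrative):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical daily series; the paper's actual data are not reproduced here.
rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({"temp": rng.normal(30, 2, n),
                   "wind": rng.normal(8, 2, n)})
mu = np.exp(0.1 + 0.05 * df["temp"].shift(14).fillna(30))
df["cases"] = rng.poisson(mu)

# Lag the covariates by 14 and 21 days (2 and 3 weeks) before fitting.
X = sm.add_constant(pd.DataFrame({
    "temp_lag14": df["temp"].shift(14),
    "wind_lag21": df["wind"].shift(21),
}).dropna())
y = df.loc[X.index, "cases"]

poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(y, X, family=sm.families.NegativeBinomial()).fit()
print(poisson_fit.aic, negbin_fit.aic)  # compare the two count models
```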
5475 A Study on Exclusive Breastfeeding Using Over-dispersed Statistical Models
Authors: Naushad Mamode Khan, Cheika Jahangeer, Maleika Heenaye-Mamode Khan
Abstract:
Breastfeeding is an important concept in the maternal life of a woman. In this paper, we focus on exclusive breastfeeding, the feeding of a baby on no milk other than breast milk. This type of breastfeeding is very important during the first six months because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, in Mauritius, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. We give an overview of exclusive breastfeeding in Mauritius and the factors influencing it, and further analyze the local practices of exclusive breastfeeding using the generalized Poisson regression model and the negative binomial model, since the data are over-dispersed.
Keywords: Exclusive breastfeeding, regression model, generalized Poisson, negative binomial.
5474 Structural Investigation of Na2O–B2O3–SiO2 Glasses Doped with NdF3
Authors: M. S. Gaafar, S. Y. Marzouk
Abstract:
Sodium borosilicate glasses doped with different NdF3 mol % contents have been prepared by the rapid quenching method. Ultrasonic velocity measurements (both longitudinal and shear) have been carried out at room temperature and at an ultrasonic frequency of 4 MHz. The elastic moduli, Debye temperature, softening temperature and Poisson's ratio have been obtained as functions of the NdF3 modifier content. Results showed that the elastic moduli, Debye temperature, softening temperature and Poisson's ratio change only very slightly with the NdF3 mol % content. Based on FTIR spectroscopy and the theoretical bond compression model, quantitative analysis has been carried out to obtain more information about the structure of these glasses. The study indicated that the structure of these glasses is mainly composed of SiO4 units with four bridging oxygens (Q4), and with three bridging and one non-bridging oxygen (Q3).
Keywords: Borosilicate glasses, ultrasonic velocity, elastic moduli, FTIR spectroscopy, bond compression model.
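For readers unfamiliar with the ultrasonic route to elastic constants, the standard isotropic relations convert measured longitudinal and shear velocities plus density into the moduli and Poisson's ratio; a sketch with generic values, not the paper's glass data:

```python
def elastic_from_velocities(vl, vs, rho):
    """Isotropic elastic constants from longitudinal (vl) and shear (vs)
    ultrasonic velocities (m/s) and density rho (kg/m^3), via the standard
    relations G = rho*vs^2 and L = rho*vl^2."""
    G = rho * vs**2                   # shear modulus
    L = rho * vl**2                   # longitudinal modulus
    nu = (L - 2 * G) / (2 * (L - G))  # Poisson's ratio
    E = 2 * G * (1 + nu)              # Young's modulus
    K = L - 4 * G / 3                 # bulk modulus
    return E, G, K, nu

# Illustrative glass-like values.
E, G, K, nu = elastic_from_velocities(vl=5000.0, vs=2800.0, rho=2500.0)
print(E / 1e9, nu)  # Young's modulus in GPa, dimensionless Poisson's ratio
```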
5473 Optimal Production and Maintenance Policy for a Partially Observable Production System with Stochastic Demand
Authors: Leila Jafari, Viliam Makis
Abstract:
In this paper, the joint optimization of the economic manufacturing quantity (EMQ), safety stock level, and condition-based maintenance (CBM) is presented for a partially observable, deteriorating system subject to random failure. The demand is stochastic and is described by a Poisson process. The stochastic model is developed and the optimization problem is formulated in the semi-Markov decision process framework. A modification of the policy iteration algorithm is developed to find the optimal policy. A numerical example is presented to compare the optimal policy with the policy considering zero safety stock.
Keywords: Condition-based maintenance, economic manufacturing quantity, safety stock, stochastic demand.
5472 Reliability Modeling and Data Analysis of Vacuum Circuit Breaker Subject to Random Shocks
Authors: Rafik Medjoudj, Rabah Medjoudj, D. Aissani
Abstract:
Electrical substation components are often subject to degradation due to over-voltage or over-current caused by a short circuit or a lightning strike. Particular interest is given to the circuit breaker, given the importance of its function and the danger of its failure. This component degrades gradually with use, and it is also subject to a shock process resulting from the stress of isolating the fault when a short circuit occurs in the system. In this paper, based on failure mechanism developments, the wear-out of the circuit breaker contacts is modeled. The aim of this work is to evaluate the breaker's reliability and consequently its residual lifetime. The shock process is based on two random variables: the arrival of shocks and their magnitudes. The arrival of shocks was modeled using a homogeneous Poisson process (HPP). By simulation, the dates of short-circuit arrivals were generated together with their magnitudes. The same principle of simulation is applied to the amount of cumulative contact wear-out. The objective is to find a formulation of the wear function depending on the number of solicitations of the circuit breaker.
Keywords: Reliability, short-circuit, shock models.
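A compact sketch of the simulation principle described above: shock dates generated from a homogeneous Poisson process via exponential inter-arrival times, with random magnitudes whose running sum stands in for cumulative contact wear (the rate and magnitude values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_shocks(rate_per_year, horizon_years, mean_magnitude):
    """Homogeneous Poisson process: exponential inter-arrival times give the
    short-circuit dates; each shock carries an exponential magnitude whose
    cumulative sum models contact wear-out."""
    t, times = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_per_year)
        if t > horizon_years:
            break
        times.append(t)
    mags = rng.exponential(mean_magnitude, len(times))
    return np.array(times), mags, np.cumsum(mags)

times, mags, wear = simulate_shocks(rate_per_year=4.0, horizon_years=10.0,
                                    mean_magnitude=0.5)
print(len(times), wear[-1] if len(wear) else 0.0)  # shock count, total wear
```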
5471 Optimal Parameters of Double Moving Average Control Chart
Authors: Y. Areepong
Abstract:
The objective of this paper is to present explicit analytical formulas for evaluating important characteristics of the Double Moving Average (DMA) control chart for the Poisson distribution. The most popular characteristics of a control chart are the Average Run Length (ARL0), the mean number of observations taken before the system is signaled to be out-of-control when it is actually still in-control, and the Average Delay time (ARL1), the mean delay of true alarm times. An important property required of ARL0 is that it should be sufficiently large when the process is in-control, to reduce the number of false alarms. On the other hand, if the process is actually out-of-control, then ARL1 should be as small as possible. In particular, the explicit analytical formulas for evaluating ARL0 and ARL1 make it possible to obtain a set of optimal parameters, depending on the width of the moving average (w) and the width of the control limit (H), for designing a DMA chart with minimum ARL1.
Keywords: Optimal parameters, Average Run Length, Average Delay time, Double Moving Average chart.
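The paper's contribution is closed-form ARL expressions; as a hedged stand-in, ARL0 and ARL1 for a DMA chart on Poisson counts can also be estimated by Monte Carlo, which makes the roles of w and H concrete (simplified steady-state control limits applied once both windows are full; all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)

def dma_run_length(lam, lam0, w, H, max_t=50_000):
    """One run length of a Double Moving Average chart for Poisson counts:
    the statistic is the moving average (window w) of the moving average of
    the observations, compared against a simplified limit lam0 +/- H*sqrt(lam0/w)."""
    x, ma = [], []
    for t in range(1, max_t + 1):
        x.append(rng.poisson(lam))
        ma.append(np.mean(x[-w:]))
        if t >= 2 * w:  # start monitoring once both windows are full
            dma = np.mean(ma[-w:])
            if abs(dma - lam0) > H * np.sqrt(lam0 / w):
                return t
    return max_t

arl0 = np.mean([dma_run_length(4.0, 4.0, w=5, H=3.0) for _ in range(100)])
arl1 = np.mean([dma_run_length(6.0, 4.0, w=5, H=3.0) for _ in range(100)])
print(arl0, arl1)  # in-control runs should be long, true alarms quick
```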
5470 Mathematical Modeling of the Influence of Hydrothermal Processes in the Water Reservoir
Authors: Alibek Issakhov
Abstract:
This paper presents a mathematical model of hydrothermal processes in a thermal power plant water reservoir under different wind direction scenarios, solved using the Navier-Stokes and temperature equations for an incompressible fluid in a stratified medium. The numerical algorithm is based on the method of splitting by physical parameters. The three-dimensional Poisson equation is solved with a Fourier method combined with the tridiagonal matrix method (Thomas algorithm).
Keywords: Thermal power plant, hydrothermal process, large eddy simulation, water reservoir.
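The tridiagonal (Thomas) solver named above is the O(n) building block of such Fourier-based Poisson solvers; a self-contained sketch, verified here on a 1D Poisson problem with a known solution:

```python
import numpy as np

def thomas(a, b, c, d):
    """Thomas algorithm: O(n) solve of a tridiagonal system with sub-diagonal
    a, diagonal b, super-diagonal c and right-hand side d (a[0], c[-1] unused)."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]       # forward elimination
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):        # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1D test: -u'' = f on (0,1), u(0)=u(1)=0, f = pi^2 sin(pi x), u = sin(pi x).
n, h = 100, 1.0 / 101
xg = np.linspace(h, 1 - h, n)
a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0)
u = thomas(a, b, c, (np.pi**2) * np.sin(np.pi * xg) * h**2)
print(np.max(np.abs(u - np.sin(np.pi * xg))))  # small O(h^2) error
```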
5469 Identifying Factors Contributing to the Spread of Lyme Disease: A Regression Analysis of Virginia's Data
Authors: Fatemeh Valizadeh Gamchi, Edward L. Boone
Abstract:
This research focuses on Lyme disease, a widespread infectious condition in the United States caused by the bacterium Borrelia burgdorferi sensu stricto. It is critical to identify the environmental and economic factors that contribute to the spread of the disease. This study examined data from Virginia to identify a subset of explanatory variables significant for Lyme disease case numbers. To identify relevant variables and avoid overfitting, linear Poisson regression and regularized regression methods with ridge, lasso, and elastic net penalties were employed, with cross-validation used to acquire the tuning parameters. The proposed methods can automatically identify relevant disease count covariates. The efficacy of the techniques was assessed using four criteria on three simulated datasets. Finally, using the Virginia Department of Health's Lyme disease dataset, the study successfully identified key factors, and the results were consistent with previous studies.
Keywords: Lyme disease, Poisson generalized linear model, ridge regression, lasso regression, elastic net regression.
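A small sketch of the regularized Poisson regression workflow on synthetic data (not the Virginia dataset), using statsmodels' elastic net penalty, where L1_wt interpolates between ridge (0) and lasso (1):

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in: 10 covariates, only the first 3 truly relevant.
rng = np.random.default_rng(3)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = np.zeros(p); beta[:3] = [0.6, -0.4, 0.3]
y = rng.poisson(np.exp(X @ beta))

model = sm.GLM(y, sm.add_constant(X), family=sm.families.Poisson())
# alpha is the penalty strength (chosen by cross-validation in practice);
# L1_wt = 1 is the lasso, 0 is ridge, in between is the elastic net.
fit = model.fit_regularized(method="elastic_net", alpha=0.05, L1_wt=0.8)
print(np.round(fit.params, 3))  # irrelevant coefficients shrink toward zero
```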
5468 ABURAS Index: A Statistically Developed Index for Dengue-Transmitting Vector Population Prediction
Authors: Hani M. Aburas
Abstract:
"Dengue" is an African word meaning "bone breaking", because the disease causes severe joint and muscle pain that feels like bones are breaking. It is an infectious disease transmitted mainly by the female Aedes aegypti mosquito and caused by any of four serotypes of dengue virus. In recent years, a dramatic increase in confirmed dengue fever cases around the equatorial belt has been reported. Several conventional indices have been designed so far to monitor the transmitting vector populations, such as the House Index (HI), Container Index (CI) and Breteau Index (BI). However, none of them describes the adult mosquito population size, which is important for directing and guiding comprehensive control strategy operations, since the number of infected people has a direct relationship with the vector density. It is therefore crucial to know the population size of the transmitting vector in order to design a suitable and effective control program. In this context, a study was carried out to report a new statistical index, the ABURAS Index, using the Poisson distribution, based on the collection of vector populations in Jeddah Governorate, Saudi Arabia.
Keywords: Poisson distribution, statistical index, prediction, Aedes aegypti.
5467 Classification of Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach
Authors: Henry J. Wattimanela, Udjianna S. Pasaribu, Nanang T. Puspito, Sapto W. Indratno
Abstract:
The Banda Sea Collision Zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian and Pacific plates. It is located in eastern Indonesia and has very high seismic activity. In this research, we calculate the rate (λ) and the Mean Square Error (MSE), and with these results we classify the earthquake distribution in the BSCZ using the point process approach. The chi-square test is used to determine the type of earthquake distribution in each sub-region of the BSCZ. The data used in this research are earthquakes of magnitude ≥ 6 SR for the period 1964-2013, sourced from BMKG Jakarta. This research is expected to help the Moluccas Province and surrounding local governments in preparing spatial planning documents related to disaster management.
Keywords: Banda Sea Collision Zone, earthquakes, mean square error, Poisson distribution, chi-square test.
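A minimal sketch of the distributional check described above: fit a Poisson rate to hypothetical yearly counts for one sub-region and apply a chi-square goodness-of-fit test (the counts below are invented, not the BMKG catalogue):

```python
import numpy as np
from scipy.stats import poisson, chisquare

# Hypothetical yearly counts of M >= 6 events in one sub-region.
counts = np.array([0, 1, 0, 2, 1, 0, 3, 1, 0, 1,
                   2, 0, 1, 1, 0, 2, 1, 0, 0, 1])
lam = counts.mean()  # maximum-likelihood rate estimate

# Observed vs expected frequencies for the bins {0, 1, 2, >=3}.
vals = [0, 1, 2]
obs = np.array([np.sum(counts == k) for k in vals] + [np.sum(counts >= 3)])
probs = np.array([poisson.pmf(k, lam) for k in vals] + [1 - poisson.cdf(2, lam)])
stat, p = chisquare(obs, probs * len(counts), ddof=1)  # ddof=1: lam estimated
print(lam, stat, p)  # a large p-value: no evidence against the Poisson model
```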
5466 Vickers Indentation Simulation of Buffer Layer Thickness Effect for DLC Coated Materials
Authors: Abdul Wasy, Balakrishnan G., Yi Qi Wang, Atta Ur Rehman, Jung Il Song
Abstract:
Vickers indentation is used to measure the hardness of materials. In this study, numerical simulation of the Vickers indentation experiment was performed for diamond-like carbon (DLC) coated materials. DLC coatings were deposited on stainless steel 304 substrates with a chromium buffer layer using an RF magnetron and T-shape filtered cathodic vacuum arc dual system. The objective of this research is to understand the elastic-plastic properties, stress-strain distribution, ring and lateral crack growth and propagation, penetration depth of the indenter, and delamination of the coating from the substrate as a function of buffer layer thickness. The effect of the Poisson's ratio of the DLC coating was also analyzed. Under the same conditions, indenter penetration is greater in coated materials with a thin buffer layer than in those with a thicker one. Similarly, specimens with a thinner buffer layer failed quickly due to high residual stress, compared with coated materials with a reasonable buffer layer thickness of 200 nm. The simulation results suggested the optimized thickness of 200 nm among the prepared specimens for durable and long service.
Keywords: Thin film, buffer layer, diamond-like carbon, Vickers indentation, Poisson's ratio, finite element.
5465 Using Artificial Neural Network to Predict Collisions on Horizontal Tangents of 3D Two-Lane Highways
Authors: Omer F. Cansiz, Said M. Easa
Abstract:
The purpose of this study is to predict collision frequency on horizontal tangents combined with vertical curves using artificial neural network (ANN) methods, and to compare the proposed ANN models with existing regression models. First, the variables that affect collision frequency were investigated. It was found that only the annual average daily traffic, section length, access density, rate of vertical curvature, and the smaller curve radius before and after the tangent were statistically significant for the related combinations. Second, three statistical models (negative binomial, zero-inflated Poisson and zero-inflated negative binomial) were developed using the significant variables for three alignment combinations. Third, ANN models were developed using the same variables for each combination. The results clearly show that the ANN models have lower mean square error values than the statistical models. Similarly, the AIC values of the ANN models are smaller than those of the regression models for all the combinations. Consequently, the ANN models have better statistical performance than the statistical models for estimating collision frequency. The ANN models presented in this paper are recommended for evaluating the safety impacts of 3D alignment elements on horizontal tangents.
Keywords: Collision frequency, horizontal tangent, 3D two-lane highway, negative binomial, zero-inflated Poisson, artificial neural network.
5464 Mathematical Study for Traffic Flow and Traffic Density in Kigali Roads
Authors: Kayijuka Idrissa
Abstract:
This work presents a mathematical study of traffic flow and traffic density on Kigali city roads, using data collected from the national police of Rwanda in 2012. Several mathematical models were used to analyze and compare traffic variables. The work was carried out on Kigali roads, specifically at the roundabouts from the Kigali Business Center (KBC) to Prince House, as the study sites. We applied the Poisson distribution to analyze the number of accidents that occurred on this section of road from KBC to Prince House. The results show that accidents occurred at very high rates in 2012, because this section has a very narrow single lane on each side, which leads to high congestion of vehicles and, consequently, frequent accidents. Using the speed and density data collected from this section of road, we found that an increase in density results in a decrease in vehicle speed; at the point where the density equals the jam density, the speed becomes zero. The approach is promising in capturing sudden changes in flow patterns and is open to use in a series of intelligent management strategies, especially in non-recurrent congestion detection and control.
Keywords: Statistical methods, Poisson distribution, car moving techniques, traffic flow.
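The speed-density observation above (a linear decrease, with zero speed at jam density) matches Greenshields' classical model; a short sketch with illustrative parameters, not fitted Kigali values:

```python
import numpy as np

def greenshields_speed(k, v_free, k_jam):
    """Greenshields' linear speed-density relation: speed falls linearly with
    density k and reaches zero at the jam density k_jam. The flow q = k * v
    then peaks at k_jam / 2."""
    return v_free * (1.0 - k / k_jam)

v_free, k_jam = 60.0, 120.0  # illustrative km/h and veh/km values
k = np.linspace(0, k_jam, 7)
v = greenshields_speed(k, v_free, k_jam)
q = k * v
print(v[-1], k[np.argmax(q)])  # speed 0 at jam density; max flow near k_jam/2
```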
5463 Sub-Impact Phenomenon of Elasto-Plastic Free-Free Beam during a Strike
Authors: H. Rong, X. C. Yin, J. Yang, Y. N. Shen
Abstract:
Based on Rayleigh beam theory, the sub-impacts of a free-free beam struck horizontally by a round-nosed rigid mass are simulated by the finite difference method together with impact-separation conditions. In order to obtain the sub-impact force, a uniaxial compression elastic-plastic contact model is employed to analyze the local deformation field in the contact zone. It is found that the horizontal impact is a complicated process that includes a sequence of elastic-plastic sub-impacts, and that there are two sub-zones of sub-impact. In addition, it is found that the elastic energy of the free-free beam makes the Poisson collision hypothesis more suitable for explaining the compression and recovery processes.
Keywords: Beam, sub-impact, elastic-plastic deformation, finite difference method.
5462 Very-High-Precision Normalized Eigenfunctions for a Class of Schrödinger Type Equations
Authors: Amna Noreen , Kare Olaussen
Abstract:
We demonstrate that it is possible to compute wave function normalization constants for a class of Schrödinger type equations by an algorithm which scales linearly (in the number of eigenfunction evaluations) with the desired precision P in decimals.
Keywords: Eigenvalue problems, bound states, trapezoidal rule, Poisson resummation.
5461 Statistical Modeling of Constituents in Ash Evolved from Pulverized Coal Combustion
Authors: Esam Jassim
Abstract:
Industries using conventional fossil fuels have an interest in better understanding the mechanism of particulate formation during combustion, since such particulates are responsible for the emission of undesired inorganic elements that directly impact the atmospheric pollution level. Fine and ultrafine particulates have a tendency to escape flue gas cleaning devices into the atmosphere. They also preferentially collect on surfaces in power systems, resulting in increased corrosion, reduced heat transfer in the thermal unit, and severe impacts on human health. This adverseness manifests particularly in the regions of the world where coal is the dominant source of energy for consumption. This study highlights the behavior of calcium transformation as mineral grains versus organically associated inorganic components during pulverized coal combustion. The influence of the existing type of calcium on the coarse, fine and ultrafine mode formation mechanisms is also presented. The impact of two sub-bituminous coals on particle size and calcium composition evolution during combustion is assessed. Three mixed blends, named Blends 1, 2, and 3, are selected according to the ratio of coal A to coal B by weight; the calcium percentage in the original coal increases from Blend 1 to Blend 3. A mathematical model and a new approach to describing constituent distribution are proposed. The experimental calcium distribution in ash is also modeled using the Poisson distribution, and a novel parameter, called the elemental index λ, is introduced as a measure of element distribution. Results show that calcium present in the original coal as mineral grains has an index of 17, whereas organically associated calcium transformed to fly ash is best described by an elemental index λ of 7. As an alkaline-earth element, calcium is considered the fundamental element responsible for boiler deficiency, since it is the major player in the mechanism of the ash slagging process. The mechanisms of particle size distribution and the mineral species of ash particles are presented using CCSEM and size-segregated ash characteristics. Conclusions are drawn from the analysis of pulverized coal ash generated from a utility-scale boiler.
Keywords: Calcium transformation, coal combustion, inorganic elements, Poisson distribution.
5460 Object-Centric Process Mining Using Process Cubes
Authors: Anahita Farhang Ghahfarokhi, Alessandro Berti, Wil M.P. van der Aalst
Abstract:
Process mining provides ways to analyze business processes. Common process mining techniques consider the process as a whole. However, in real-life business processes, different behaviors exist that make the overall process too complex to interpret. Process comparison is a branch of process mining that isolates different behaviors of the process from each other by using process cubes. Process cubes organize event data along different dimensions, and each cell contains a set of events that can be used as input to process mining techniques. Existing work on process cubes assumes a single case notion. However, in real processes, several case notions (e.g., order, item, package, etc.) are intertwined. Object-centric process mining is a new branch of process mining addressing multiple case notions in a process. To build a bridge between object-centric process mining and process comparison, we propose a process cube framework which supports process cube operations such as slice and dice on object-centric event logs. To facilitate the comparison, the framework is integrated with several object-centric process discovery approaches.
Keywords: Process mining, multidimensional process mining, multi-perspective business processes, OLAP, process cubes, process discovery.
5459 Application of an Analytical Model to Obtain Daily Flow Duration Curves for Different Hydrological Regimes in Switzerland
Authors: Ana Clara Santos, Maria Manuela Portela, Bettina Schaefli
Abstract:
This work assesses the performance of an analytical model framework for generating daily flow duration curves (FDCs) based on climatic characteristics of the catchments and on their streamflow recession coefficients. In the analytical model framework, precipitation is considered to be a stochastic process, modeled as a marked Poisson process, while recession is considered to be deterministic, with parameters that can be computed using different models. The framework was tested on three case studies with different hydrological regimes located in Switzerland: pluvial, snow-dominated and glacier. For that purpose, five time intervals were analyzed (the four meteorological seasons and the civil year) and two developments of the model were tested: one considering a linear recession model and the other adopting a nonlinear recession model. These developments were combined with recession coefficients obtained from two different approaches: forward and inverse estimation. The performance of the analytical framework with forward parameter estimation is poor in comparison with inverse estimation for both the linear and nonlinear models. For the pluvial catchment, the inverse estimation shows exceptionally good results, especially for the nonlinear model, clearly suggesting that the model has the ability to describe FDCs. For the snow-dominated and glacier catchments, the seasonal results are better than the annual ones, suggesting that the model can describe streamflows in those conditions and that future efforts should focus on improving and combining seasonal curves instead of considering single annual ones.
Keywords: Analytical streamflow distribution, stochastic process, linear and non-linear recession, hydrological modelling, daily discharges.
5458 On Four Models of a Three Server Queue with Optional Server Vacations
Authors: Kailash C. Madan
Abstract:
We study four models of a three-server queueing system with Bernoulli-schedule optional server vacations. Customers arriving at the system one by one in a Poisson process are provided identical exponential service by three parallel servers according to a first-come, first-served queue discipline. In Model A, all three servers may be allowed a vacation at one time; in Model B, at most two of the three servers may be allowed a vacation at one time; in Model C, at most one server is allowed a vacation; and in Model D, no server is allowed a vacation. We study the steady-state behavior of the four models and obtain steady-state probability generating functions for the queue size at a random point of time for all states of the system. In Model D, a known result for a three-server queueing system without server vacations is derived.
Keywords: A three-server queue, Bernoulli-schedule server vacations, queue size distribution at a random epoch, steady state.
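For orientation, Model D's vacation-free baseline is the classical M/M/3 queue, whose steady-state quantities have closed forms; a sketch of the Erlang-C waiting probability (arrival and service rates are illustrative, and stability requires lam < c*mu):

```python
import math

def mmc_p0(lam, mu, c):
    """Stationary probability of an empty M/M/c queue (requires lam < c*mu)."""
    rho = lam / mu
    s = sum(rho**k / math.factorial(k) for k in range(c))
    s += rho**c / (math.factorial(c) * (1 - rho / c))
    return 1.0 / s

def mmc_wait_prob(lam, mu, c):
    """Erlang-C probability that an arriving customer must wait for service."""
    rho = lam / mu
    return (rho**c / (math.factorial(c) * (1 - rho / c))) * mmc_p0(lam, mu, c)

print(mmc_wait_prob(lam=2.4, mu=1.0, c=3))  # Poisson arrivals, 3 servers
```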
5457 Methods for Material and Process Monitoring by Characterization of (Second and Third Order) Elastic Properties with Lamb Waves
Abstract:
In accordance with the industry 4.0 concept, manufacturing process steps, as well as the materials themselves, are going to be more and more digitalized within the next years. The "digital twin", representing the simulated and measured dataset of the (semi-finished) product, can be used to control and optimize the individual processing steps and helps to reduce costs and expenditure of time in product development, manufacturing, and recycling. In the present work, two material characterization methods based on Lamb waves were evaluated and compared. For demonstration purposes, both methods were applied to a standard industrial product: copper ribbons, often used in photovoltaic modules as well as in high-current microelectronic devices. By numerical approximation of the Rayleigh-Lamb dispersion model to measured phase velocities, second order elastic constants (Young's modulus, Poisson's ratio) were determined. Furthermore, the effective third order elastic constants were evaluated by applying elastic, "non-destructive" mechanical stress to the samples. In this way, small microstructural variations due to mechanical preconditioning could be detected for the first time. Both methods were compared with respect to precision and inline application capabilities. The microstructure of the samples was systematically varied by mechanical loading and annealing, and changes in the elastic ultrasound transport properties were correlated with results from microstructural analysis and mechanical testing. In summary, monitoring the elastic material properties of plate-like structures using Lamb waves is valuable for inline, non-destructive material characterization and manufacturing process control. Second order elastic constants analysis is robust over wide environmental and sample conditions, whereas the effective third order elastic constants greatly increase the sensitivity to small microstructural changes. Both Lamb wave based characterization methods fit perfectly into the industry 4.0 concept.
Keywords: Lamb waves, industry 4.0, process control, elasticity, acoustoelasticity.
5456 Critical Assessment of Scoring Schemes for Protein-Protein Docking Predictions
Authors: Dhananjay C. Joshi, Jung-Hsin Lin
Abstract:
Protein-protein interactions (PPI) play a crucial role in many biological processes such as cell signalling, transcription, translation, replication, signal transduction, and drug targeting. Structural information about protein-protein interactions is essential for understanding the molecular mechanisms of these processes. Structures of protein-protein complexes are still difficult to obtain by biophysical methods such as NMR and X-ray crystallography, and therefore protein-protein docking computation is considered an important approach for understanding protein-protein interactions. However, reliable prediction of protein-protein complexes is still under way. In the past decades, several grid-based docking algorithms based on the Katchalski-Katzir scoring scheme were developed, e.g., FTDock, ZDOCK, HADDOCK, RosettaDock, HEX, etc. However, the success rate of protein-protein docking prediction is still far from ideal. In this work, we first propose a more practical measure for evaluating the success of protein-protein docking predictions, the rate of first success (RFS), which is similar to the concept of mean first passage time (MFPT). Accordingly, we have assessed the ZDOCK bound and unbound benchmarks 2.0 and 3.0. We also created a new benchmark set for protein-protein docking predictions in which the complexes have experimentally determined binding affinity data. We performed free energy calculations based on the solution of the non-linear Poisson-Boltzmann equation (nlPBE) to improve the binding mode prediction, using the well-studied barnase-barstar system to validate the parameters for the free energy calculations. In addition, nlPBE-based free energy calculations were conducted for the cases badly predicted by ZDOCK and ZRANK. We found that direct molecular mechanics energetics cannot be used to discriminate the native binding pose from the decoys. Our results indicate that nlPBE-based calculations appear to be a promising approach for improving the success rate of binding pose predictions.
Keywords: Protein-protein docking, protein-protein interaction, molecular mechanics energetics, Poisson-Boltzmann calculations.
5455 Simulation of CO2 Capture Process
Authors: K. Movagharnejad, M. Akbari
Abstract:
The carbon dioxide capture process has been simulated and studied under different process conditions. It has been shown that several process parameters, such as the lean amine temperature, number of absorber stages, number of stripper stages and stripper pressure, affect process conditions and outputs such as carbon dioxide removal and reboiler duty. It may be concluded that simulation of the carbon dioxide capture process can help to estimate the best process conditions.
Keywords: Absorption, carbon dioxide capture, desorption, process simulation.
5454 Performance Analysis of Software Reliability Models Using Matrix Method
Authors: RajPal Garg, Kapil Sharma, Rajive Kumar, R. K. Garg
Abstract:
This paper presents a computational methodology based on matrix operations for a computer-based solution to the problem of performance analysis of software reliability models (SRMs). A set of seven comparison criteria has been formulated to rank the various non-homogeneous Poisson process software reliability models proposed during the past 30 years for estimating software reliability measures such as the number of remaining faults, software failure rate, and software reliability. Selection of the optimal SRM for use in a particular case has been an area of interest for researchers in the field of software reliability, and the tools and techniques for software reliability model selection found in the literature cannot be used with a high level of confidence, as they rely on a limited number of model selection criteria. A real data set of a middle-size software project from published papers has been used to demonstrate the matrix method. The result of this study is a ranking of SRMs based on the permanent value of the criteria matrix formed for each model from the comparison criteria. The software reliability model with the highest value of the permanent is ranked number 1, and so on.
Keywords: Matrix method, model ranking, model selection, model selection criteria, software reliability models.
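The ranking statistic named above is the matrix permanent; a brute-force sketch for a small hypothetical criteria matrix (fine at the scale of seven criteria; the entries below are invented, not the paper's data):

```python
from itertools import permutations
import numpy as np

def permanent(A):
    """Permanent of a square matrix by direct expansion: like the determinant
    but without sign alternation. O(n * n!) cost, acceptable for small n;
    Ryser's formula is preferable for larger matrices."""
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)])
               for p in permutations(range(n)))

# Hypothetical 3x3 criteria matrix for one SRM (rows/columns illustrative).
A = np.array([[0.9, 0.2, 0.4],
              [0.3, 0.8, 0.5],
              [0.6, 0.1, 0.7]])
print(permanent(A))  # models would be ranked by this value
```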
5453 Fuzzy Estimation of Parameters in Statistical Models
Authors: A. Falsafain, S. M. Taheri, M. Mashinchi
Abstract:
Using a set of confidence intervals, we develop a common approach to constructing a fuzzy set as an estimator for unknown parameters in statistical models. We investigate a method to derive the explicit and unique membership function of such fuzzy estimators. The proposed method has been used to derive the fuzzy estimators of the parameters of a Normal distribution and some functions of the parameters of two Normal distributions, as well as the parameters of the Exponential and Poisson distributions.
Keywords: Confidence interval, fuzzy number, fuzzy estimation.
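One common version of this construction can be sketched as follows (under the assumption of Buckley-style stacking of confidence intervals, shown for a Normal mean with known variance; the paper's exact derivation may differ):

```python
import numpy as np
from scipy.stats import norm

def fuzzy_mean_membership(theta, xbar, sigma, n):
    """Membership function of a fuzzy estimator of a Normal mean built by
    stacking confidence intervals: the alpha-cut of the fuzzy set is taken to
    be the 100(1-alpha)% CI, so membership peaks at 1 at the point estimate
    and decays as theta moves outside narrower intervals."""
    t = np.abs(theta - xbar) * np.sqrt(n) / sigma
    return np.clip(2.0 * (1.0 - norm.cdf(t)), 0.0, 1.0)

theta = np.linspace(9, 11, 5)
print(fuzzy_mean_membership(theta, xbar=10.0, sigma=1.0, n=25))
```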
5452 Methods for Business Process Simulation Based on Petri Nets
Authors: K. Shoylekova, K. Grigorova
Abstract:
Petri nets are the first standard for business process modeling, which is most probably one of the core reasons why every new standard created afterwards has had to be transformable onto Petri nets. The paper presents a business process repository based on a universal database. The repository provides the possibility for the data about a given process to be stored in three different ways. The business process repository is developed with a view to transforming a given model into a Petri net so that it can easily be simulated. Two different techniques for business process simulation based on Petri nets, Yasper and Woflan, are discussed, and their advantages and drawbacks are outlined. The way of simulating business process models stored in the business process repository is shown.
Keywords: Business process repository, Petri nets, Simulation, Woflan, Yasper.