Search results for: zero inflated Poisson
68 Determination of Poisson’s Ratio and Elastic Modulus of Compression Textile Materials
Authors: Chongyang Ye, Rong Liu
Abstract:
Compression textiles such as compression stockings (CSs) have been extensively applied for the prevention and treatment of chronic venous insufficiency of the lower extremities. The involvement of multiple mechanical factors such as interface pressure, frictional force, and elastic materials makes the interaction between the lower limb and CSs complex. Determination of the Poisson's ratio and elastic moduli of CS materials is critical for constructing finite element (FE) models that numerically simulate the interactive system of CS and lower limb. In this study, a mixed approach, combining an analytic model based on orthotropic Hooke's law with an experimental study (uniaxial tension testing and pure shear testing), is proposed to determine the Young's modulus, Poisson's ratio, and shear modulus of CS fabrics. The results indicated a linear relationship between the stress and strain of the studied CS samples under controlled stretch ratios (< 100%). The proposed method and the determined key mechanical properties of the elastic orthotropic CS fabrics facilitate FE modeling for an in-depth analysis of the effects of compression material design on its resultant biomechanical function in compression therapy.
Keywords: Elastic compression stockings, Young’s modulus, Poisson’s ratio, shear modulus, mechanical analysis.
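For reference, the plane-stress orthotropic Hooke's law underlying such an analytic model relates the in-plane strains to the stresses through two Young's moduli, two Poisson's ratios and the in-plane shear modulus; the symbols below are generic and not necessarily the authors' notation:

```latex
\varepsilon_1 = \frac{\sigma_1}{E_1} - \nu_{21}\frac{\sigma_2}{E_2}, \qquad
\varepsilon_2 = \frac{\sigma_2}{E_2} - \nu_{12}\frac{\sigma_1}{E_1}, \qquad
\gamma_{12} = \frac{\tau_{12}}{G_{12}}, \qquad
\text{with } \frac{\nu_{12}}{E_1} = \frac{\nu_{21}}{E_2}.
```

The symmetry relation on the right is what makes only four of the five in-plane constants independent, which is why uniaxial tension plus pure shear tests suffice to characterize the fabric.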
67 A Comparison of Marginal and Joint Generalized Quasi-likelihood Estimating Equations Based On the Com-Poisson GLM: Application to Car Breakdowns Data
Authors: N. Mamode Khan, V. Jowaheer
Abstract:
In this paper, we apply and compare two generalized estimating equation approaches to the analysis of car breakdowns data in Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim to estimate the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over one year. We note that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and the joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
Keywords: Breakdowns, under-dispersion, Com-Poisson, generalized linear model, marginal quasi-likelihood estimation, joint quasi-likelihood estimation.
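The abstract does not reproduce the Com-Poisson probability mass function; a minimal Python sketch (generic parameters lam and nu, with the normalizing series truncated) illustrates how nu > 1 captures under-dispersion, nu = 1 recovers the ordinary Poisson, and nu < 1 gives over-dispersion:

```python
import math

def com_poisson_pmf(y, lam, nu, tol=1e-12, max_terms=1000):
    """P(Y = y) = lam**y / (y!)**nu / Z(lam, nu), with Z truncated numerically."""
    # Normalizing constant Z(lam, nu) = sum_j lam**j / (j!)**nu
    z, j = 0.0, 0
    while j < max_terms:
        term = lam**j / math.factorial(j)**nu
        z += term
        if term < tol and j > y:
            break
        j += 1
    return lam**y / math.factorial(y)**nu / z

# Example: under-dispersed counts with nu around 2, as reported in the abstract
print(sum(com_poisson_pmf(k, lam=2.0, nu=2.14) for k in range(20)))  # close to 1.0
```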
66 Dengue Disease Mapping with Standardized Morbidity Ratio and Poisson-gamma Model: An Analysis of Dengue Disease in Perak, Malaysia
Authors: N. A. Samat, S. H. Mohd Imam Ma’arof
Abstract:
Dengue disease is an infectious vector-borne viral disease commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including Malaysia. There is currently no vaccine or chemotherapy available for the prevention or treatment of dengue disease, so prevention and treatment depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies for diseases. The choice of statistical model used for relative risk estimation is important, as a good model will subsequently produce a good disease risk map. The aim of this study is therefore to estimate the relative risk of dengue disease based initially on the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and on one of the earliest applications of Bayesian methodology, the Poisson-gamma model. This paper begins with a review of the SMR method, which we then apply to dengue data from Perak, Malaysia. We then fit an extension of the SMR method, the Poisson-gamma model. Both sets of results are displayed and compared using graphs, tables and maps. The results of the analysis show that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model is demonstrated to overcome the problem of the SMR when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult, and there is no possibility of allowing for spatial correlation between risks in adjacent areas. The drawbacks of this model have motivated many researchers to propose alternative methods for estimating the risk.
Keywords: Dengue disease, Disease mapping, Standardized Morbidity Ratio, Poisson-gamma model, Relative risk.
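A compact sketch of the two estimators discussed above, using made-up observed and expected counts and illustrative prior hyperparameters rather than the paper's data or fitted values:

```python
import numpy as np

# Hypothetical observed and expected dengue counts for five districts
observed = np.array([12, 0, 7, 30, 3], dtype=float)
expected = np.array([9.5, 2.1, 6.8, 22.4, 4.0])

# Standardized Morbidity Ratio: unstable for small counts and exactly zero
# wherever no cases are observed.
smr = observed / expected

# Poisson-gamma model: O_i ~ Poisson(E_i * theta_i), theta_i ~ Gamma(a, b).
# The posterior mean shrinks the SMR towards the prior mean a/b, which is how
# the model avoids zero risk estimates in zero-count regions.
a, b = 2.0, 2.0  # illustrative hyperparameters, not the paper's values
relative_risk = (observed + a) / (expected + b)

print(np.round(smr, 2), np.round(relative_risk, 2))
```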
65 The Non-Stationary BINARMA(1,1) Process with Poisson Innovations: An Application on Accident Data
Authors: Y. Sunecher, N. Mamode Khan, V. Jowaheer
Abstract:
This paper considers the modelling of a non-stationary bivariate integer-valued autoregressive moving average process of order one (BINARMA(1,1)) with correlated Poisson innovations. The BINARMA(1,1) model is specified using the binomial thinning operator and by assuming that the cross-correlation between the two series is induced by the innovation terms only. Based on these assumptions, the non-stationary marginal and joint moments of the BINARMA(1,1) process are derived iteratively from some initial stationary moments. As regards the estimation of the parameters of the proposed model, the conditional maximum likelihood (CML) estimation method is derived based on thinning and convolution properties. The forecasting equations of the BINARMA(1,1) model are also derived. A simulation study is proposed in which BINARMA(1,1) count data are generated using a multivariate Poisson R code for the innovation terms. The performance of the BINARMA(1,1) model is then assessed through a simulation experiment, and the mean estimates of the model parameters obtained are all efficient, based on their standard errors. The proposed model is then used to analyse real-life accident data from the motorway in Mauritius, based on some covariates: policemen, daily patrol, speed cameras, traffic lights and roundabouts. The BINARMA(1,1) model is applied to the accident data, and the CML estimates clearly indicate a significant impact of the covariates on the number of accidents on the motorway in Mauritius. The forecasting equations also provide reliable one-step-ahead forecasts.
Keywords: Non-stationary, BINARMA(1,1) model, Poisson innovations, CML.
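The bivariate specification with cross-correlated Poisson innovations is beyond a few lines, but a univariate INARMA(1,1) sketch built on the binomial thinning operator (illustrative parameter values, not the fitted ones) shows the kind of count-valued recursion the model rests on:

```python
import numpy as np

rng = np.random.default_rng(0)

def thin(alpha, x):
    """Binomial thinning: alpha o x = sum of x independent Bernoulli(alpha)."""
    return rng.binomial(x, alpha)

def simulate_inarma11(n, alpha=0.4, beta=0.3, lam=2.0):
    """Univariate INARMA(1,1) with Poisson(lam) innovations:
       X_t = alpha o X_{t-1} + eps_t + beta o eps_{t-1}."""
    x = np.zeros(n, dtype=int)
    eps_prev = rng.poisson(lam)
    for t in range(1, n):
        eps = rng.poisson(lam)
        x[t] = thin(alpha, x[t - 1]) + eps + thin(beta, eps_prev)
        eps_prev = eps
    return x

series = simulate_inarma11(500)
print(series.mean(), series.var())
```

In the bivariate case of the paper, the scalar innovation eps_t would be replaced by a correlated Poisson pair, which is what induces the cross-correlation between the two series.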
64 Analyzing the Factors Influencing Exclusive Breastfeeding Using the Generalized Poisson Regression Model
Authors: Cheika Jahangeer, Naushad Mamode Khan, Maleika Heenaye-Mamode Khan
Abstract:
Exclusive breastfeeding is the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding during the first six months of life is of fundamental importance because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, in developed countries, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. In this paper, we study the factors that influence exclusive breastfeeding and use the generalized Poisson regression model to analyze the practice of exclusive breastfeeding in Mauritius. We develop two sets of quasi-likelihood equations (QLE) to estimate the parameters.
Keywords: Exclusive breastfeeding, regression model, quasi-likelihood.
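The abstract does not state the generalized Poisson parametrization used; one common form, the Consul-Jain generalized Poisson distribution, is

```latex
P(Y = y) = \frac{\theta\,(\theta + \delta y)^{\,y-1}\, e^{-\theta - \delta y}}{y!}, \qquad y = 0, 1, 2, \dots,
```

with mean θ/(1 − δ) and variance θ/(1 − δ)³, so that δ > 0 corresponds to over-dispersion, δ = 0 to the ordinary Poisson, and δ < 0 to under-dispersion.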
63 Analysis of Testing and Operational Software Reliability in SRGM based on NHPP
Authors: S. Thirumurugan, D. R. Prince Williams
Abstract:
Software reliability is one of the key factors in the software development process. Software reliability is estimated using reliability models based on the Non-Homogeneous Poisson Process (NHPP). In most of the literature, software reliability is predicted only in the testing phase, which can lead to wrong decision-making. In this paper, two software reliability concepts, the testing and operational phases, are studied in detail. Using the S-shaped software reliability growth model (SRGM) and the exponential SRGM, the testing and operational reliability values are obtained. Finally, the two reliability values are compared and the optimal release time is investigated.
Keywords: Error detection rate, estimation of parameters, instantaneous failure rate, mean value function, Non-Homogeneous Poisson Process (NHPP), software reliability.
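The exact mean value functions used are not given in the abstract; the standard exponential (Goel-Okumoto) and delayed S-shaped NHPP forms on which such SRGMs are typically built are

```latex
m_{\mathrm{exp}}(t) = a\left(1 - e^{-bt}\right), \qquad
m_{S}(t) = a\left(1 - (1 + bt)\,e^{-bt}\right),
```

where a is the expected total number of faults, b the fault detection rate, and the failure intensity is λ(t) = dm(t)/dt.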
62 Modelling Dengue Fever (DF) and Dengue Haemorrhagic Fever (DHF) Outbreak Using Poisson and Negative Binomial Model
Authors: W. Y. Wan Fairos, W. H. Wan Azaki, L. Mohamad Alias, Y. Bee Wah
Abstract:
Dengue fever has become a major concern for health authorities all over the world, particularly in tropical countries. These countries, in particular, are experiencing the most worrying outbreaks of dengue fever (DF) and dengue haemorrhagic fever (DHF). The DF and DHF epidemics have thus become major causes of hospital admissions and deaths in Malaysia. This paper therefore attempts to examine the environmental factors that may influence the recent dengue outbreak. The aim of this study is twofold: firstly, to establish a statistical model describing the relationship between the number of dengue cases and a range of explanatory variables, and secondly, to identify the lag at which the explanatory variables affect dengue incidence the most. The explanatory variables include the level of cloud cover, percentage of relative humidity, amount of rainfall, maximum temperature, minimum temperature and wind speed. Poisson and negative binomial regression analyses were used in this study. The results of the analyses of the 915 observations (daily data from July 2006 to December 2008) reveal that the climatic factors comprising daily temperature and wind speed significantly influence the incidence of dengue fever after 2 and 3 weeks of their occurrence. The effect of humidity, on the other hand, appears to be significant only after 2 weeks.
Keywords: Dengue fever, dengue haemorrhagic fever, negative binomial regression model, Poisson regression model.
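A minimal sketch of the kind of lagged-covariate Poisson and negative binomial GLM fits described above; the data frame, column names and lag lengths are placeholders, not the study's actual variables:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical daily data; columns and values are illustrative only.
df = pd.DataFrame({
    "cases": np.random.poisson(5, 300),
    "max_temp": np.random.normal(32, 2, 300),
    "wind": np.random.gamma(2, 1.5, 300),
    "humidity": np.random.normal(80, 5, 300),
})

# Lag the weather covariates by 14 and 21 days (2 and 3 weeks), matching the
# delays at which the abstract reports significant effects.
for col, lag in [("max_temp", 14), ("wind", 21), ("humidity", 14)]:
    df[f"{col}_lag{lag}"] = df[col].shift(lag)
df = df.dropna()

X = sm.add_constant(df.filter(like="_lag"))
poisson_fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
negbin_fit = sm.GLM(df["cases"], X,
                    family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(poisson_fit.aic, negbin_fit.aic)  # compare fit of the two count models
```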
61 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups
Authors: Naushad Mamode Khan
Abstract:
The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs), as it handles equi-, over- and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify, which restricts likelihood-based estimation. The joint generalized quasi-likelihood approach (GQL-I) was considered instead, but it is rather computationally intensive and may even fail to estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments, and it is shown to yield estimates as efficient as those of GQL-I while being far more computationally stable.
Keywords: Longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL.
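For readers unfamiliar with GQL, the generic estimating equation and its Newton-type iteration take the form below, where y_i is the vector of repeated counts for subject i with mean μ_i(β) and working covariance Σ_i; the GQL-I/II/III variants differ mainly in how Σ_i (and hence the score) is specified:

```latex
\sum_{i=1}^{K} \frac{\partial \mu_i^{\top}}{\partial \beta}\,\Sigma_i^{-1}\,(y_i - \mu_i) = 0,
\qquad
\hat\beta_{(r+1)} = \hat\beta_{(r)} +
\left(\sum_{i=1}^{K} \frac{\partial \mu_i^{\top}}{\partial \beta}\,\Sigma_i^{-1}\,\frac{\partial \mu_i}{\partial \beta^{\top}}\right)^{-1}
\sum_{i=1}^{K} \frac{\partial \mu_i^{\top}}{\partial \beta}\,\Sigma_i^{-1}\,(y_i - \mu_i).
```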
60 A Study on Exclusive Breastfeeding using Over-dispersed Statistical Models
Authors: Naushad Mamode Khan, Cheika Jahangeer, Maleika Heenaye-Mamode Khan
Abstract:
Breastfeeding is an important part of a mother's life. In this paper, we focus on exclusive breastfeeding, the feeding of a baby on no milk other than breast milk. This type of breastfeeding is very important during the first six months because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and problems. Moreover, in Mauritius, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. In this paper, we give an overview of exclusive breastfeeding in Mauritius and the factors influencing it. We further analyze the local practice of exclusive breastfeeding using the generalized Poisson regression model and the negative binomial model, since the data are over-dispersed.
Keywords: Exclusive breastfeeding, regression model, generalized Poisson, negative binomial.
59 Statistical Modeling of Mobile Fading Channels Based on Triply Stochastic Filtered Marked Poisson Point Processes
Authors: Jihad S. Daba, J. P. Dubois
Abstract:
Understanding the statistics of non-isotropic scattering multipath channels that fade randomly with respect to time, frequency, and space in a mobile environment is crucial for the accurate detection of received signals in wireless and cellular communication systems. In this paper, we derive stochastic models for the probability density function (PDF) of the shift in carrier frequency caused by the Doppler effect on the received illuminating signal in the presence of a dominant line of sight. Our derivation is based on a generalized Clarke's model and a two-wave partially developed scattering model, where the statistical distribution of the frequency shift is shown to be consistent with the power spectral density of the Doppler-shifted signal.
Keywords: Doppler shift, filtered Poisson process, generalized Clarke's model, non-isotropic scattering, partially developed scattering, Rician distribution.
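As background, the classical isotropic Clarke model gives the Doppler-shift PDF below for a maximum Doppler frequency f_m; the paper's generalized, non-isotropic model with a line-of-sight component modifies this baseline form:

```latex
p(f) = \frac{1}{\pi f_m \sqrt{1 - (f/f_m)^2}}, \qquad |f| < f_m .
```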
58 Advanced Stochastic Models for Partially Developed Speckle
Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije
Abstract:
Speckled images arise when coherent microwave, optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise is complicated by the nature of the noise and is not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model where an underlying Poisson point process modulates a Gram-Charlier series of Laguerre weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise as demonstrated by the Central Limit theorem.
Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound.
57 Structural Investigation of Na2O–B2O3–SiO2 Glasses Doped with NdF3
Authors: M. S. Gaafar, S. Y. Marzouk
Abstract:
Sodium borosilicate glasses doped with different NdF3 contents (mol %) have been prepared by the rapid quenching method. Ultrasonic velocity measurements (both longitudinal and shear) were carried out at room temperature at an ultrasonic frequency of 4 MHz. The elastic moduli, Debye temperature, softening temperature and Poisson's ratio were obtained as functions of the NdF3 modifier content. Results showed that the elastic moduli, Debye temperature, softening temperature and Poisson's ratio change only very slightly with the NdF3 mol % content. Based on FTIR spectroscopy and the theoretical (bond compression) model, a quantitative analysis has been carried out in order to obtain more information about the structure of these glasses. The study indicated that the structure of these glasses is mainly composed of SiO4 units with four bridging oxygens (Q4), and with three bridging and one non-bridging oxygen (Q3).
Keywords: Borosilicate glasses, ultrasonic velocity, elastic moduli, FTIR spectroscopy, bond compression model.
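The abstract does not list the formulas; the standard isotropic relations typically used to convert the measured density ρ and ultrasonic velocities v_L, v_S into the reported moduli are

```latex
L = \rho v_L^{2}, \quad G = \rho v_S^{2}, \quad K = L - \tfrac{4}{3}G, \quad
\sigma = \frac{L - 2G}{2(L - G)}, \quad E = 2G(1 + \sigma),
```

where L, G, K and E are the longitudinal, shear, bulk and Young's moduli and σ is Poisson's ratio.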
56 Identifying Factors Contributing to the Spread of Lyme Disease: A Regression Analysis of Virginia’s Data
Authors: Fatemeh Valizadeh Gamchi, Edward L. Boone
Abstract:
This research focuses on Lyme disease, a widespread infectious condition in the United States caused by the bacterium Borrelia burgdorferi sensu stricto. It is critical to identify the environmental and economic factors that contribute to the spread of the disease. This study examined data from Virginia to identify a subset of explanatory variables significant for Lyme disease case numbers. To identify relevant variables and avoid overfitting, linear Poisson regression and regularized regression methods such as the ridge, lasso, and elastic net penalties were employed. Cross-validation was performed to select the tuning parameters. The proposed methods can automatically identify relevant disease-count covariates. The efficacy of the techniques was assessed using four criteria on three simulated datasets. Finally, using the Virginia Department of Health’s Lyme disease dataset, the study successfully identified key factors, and the results were consistent with previous studies.
Keywords: Lyme disease, Poisson generalized linear model, ridge regression, lasso regression, elastic net regression.
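A minimal sketch of regularized Poisson regression of the kind described, using statsmodels' elastic-net penalized GLM on synthetic data; the design matrix, penalty strength and coefficient values are illustrative, not the Virginia covariates:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Hypothetical design matrix; columns are placeholders for the study's covariates.
n, p = 120, 8
X = sm.add_constant(rng.normal(size=(n, p)))
beta_true = np.r_[0.5, 1.0, -0.8, np.zeros(p - 2)]   # sparse "truth"
y = rng.poisson(np.exp(X @ beta_true))

model = sm.GLM(y, X, family=sm.families.Poisson())

# L1_wt = 0 -> ridge, 1 -> lasso, in between -> elastic net; alpha is the
# penalty strength, which the study selects by cross-validation.
for name, l1_wt in [("ridge", 0.0), ("lasso", 1.0), ("elastic net", 0.5)]:
    fit = model.fit_regularized(method="elastic_net", alpha=0.05, L1_wt=l1_wt)
    print(name, np.round(fit.params, 2))
```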
55 ABURAS Index: A Statistically Developed Index for Dengue-Transmitting Vector Population Prediction
Authors: Hani M. Aburas
Abstract:
“Dengue" is an African word meaning “bone breaking" because it causes severe joint and muscle pain that feels like bones are breaking. It is an infectious disease mainly transmitted by female mosquito, Aedes aegypti, and causes four serotypes of dengue viruses. In recent years, a dramatic increase in the dengue fever confirmed cases around the equator-s belt has been reported. Several conventional indices have been designed so far to monitor the transmitting vector populations known as House Index (HI), Container Index (CI), Breteau Index (BI). However, none of them describes the adult mosquito population size which is important to direct and guide comprehensive control strategy operations since number of infected people has a direct relationship with the vector density. Therefore, it is crucial to know the population size of the transmitting vector in order to design a suitable and effective control program. In this context, a study is carried out to report a new statistical index, ABURAS Index, using Poisson distribution based on the collection of vector population in Jeddah Governorate, Saudi Arabia.Keywords: Poisson distribution, statistical index, prediction, Aedes aegypti.
54 Comparison of Stochastic Point Process Models of Rainfall in Singapore
Abstract:
Extensive rainfall disaggregation approaches have been developed and applied in climate change impact studies such as flood risk assessment and urban storm water management. In this study, five rainfall models capable of disaggregating daily rainfall data into hourly data were investigated for the rainfall record at Changi Airport, Singapore. The objectives of this study were (i) to study the temporal characteristics of hourly rainfall in Singapore, and (ii) to evaluate the performance of various disaggregation models. The models used included: (i) the Rectangular Pulse Poisson Model (RPPM), (ii) the Bartlett-Lewis Rectangular Pulse Model (BLRPM), (iii) the Bartlett-Lewis model with 2 cell types (BL2C), (iv) the Bartlett-Lewis Rectangular model with cell depth distribution dependent on duration (BLRD), and (v) the Neyman-Scott Rectangular Pulse Model (NSRPM). All of these models were fitted using hourly rainfall data from 1980 to 2005 (obtained from the Changi meteorological station). The study results indicated that a weighting scheme of inversely proportional variance delivers more accurate outputs for fitting rainfall patterns in tropical areas, and that the BLRPM performed relatively better than the other disaggregation models.
Keywords: Rainfall disaggregation, statistical properties, Poisson process, Bartlett-Lewis model, Neyman-Scott model.
53 Vickers Indentation Simulation of Buffer Layer Thickness Effect for DLC Coated Materials
Authors: Abdul Wasy, Balakrishnan G., Yi Qi Wang, Atta Ur Rehman, Jung Il Song
Abstract:
Vickers indentation is used to measure the hardness of materials. In this study, a numerical simulation of the Vickers indentation experiment was performed for diamond-like carbon (DLC) coated materials. DLC coatings were deposited on stainless steel 304 substrates with a chromium buffer layer using an RF magnetron and T-shape filtered cathodic vacuum arc dual system. The objective of this research is to understand the elastic-plastic properties, the stress-strain distribution, ring and lateral crack growth and propagation, the penetration depth of the indenter, and the delamination of the coating from the substrate as a function of buffer layer thickness. The effect of the Poisson's ratio of the DLC coating was also analyzed. Under the same conditions, indenter penetration is greater in coated materials with a thin buffer layer than in those with a thicker one. Similarly, the specimens with the thinner buffer layer failed quickly due to high residual stress, compared to the coated materials with a reasonable buffer layer thickness of 200 nm. The simulation results suggested an optimized thickness of 200 nm, among the prepared specimens, for durable and long service.
Keywords: Thin film, buffer layer, diamond-like carbon, Vickers indentation, Poisson's ratio, finite element.
52 Mathematical Study for Traffic Flow and Traffic Density in Kigali Roads
Authors: Kayijuka Idrissa
Abstract:
This work presents a mathematical study of traffic flow and traffic density on Kigali city roads, using data collected from the National Police of Rwanda in 2012. Several mathematical models were used to analyze and compare the traffic variables. The work was carried out on Kigali roads, specifically at the roundabouts from the Kigali Business Center (KBC) to Prince House, which served as our study sites. We applied the Poisson distribution to analyze the number of accidents that occurred on this section of the road, from KBC to Prince House. The results show that accidents occurred at very high rates in 2012 because this section has a very narrow single lane on each side, which leads to heavy congestion of vehicles, and consequently accidents occur very frequently. Using the speed and density data collected from this section of road, we found that an increase in density results in a decrease in vehicle speed, and at the point where the density equals the jam density, the speed becomes zero. The approach is promising for capturing sudden changes in flow patterns and is open to use in a series of intelligent management strategies, especially in non-recurrent congestion detection and control.
Keywords: Statistical methods, Poisson distribution, car moving techniques, traffic flow.
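The observed behaviour (speed falling with density and vanishing at the jam density) is consistent with the classical Greenshields relation, given here as a reference form rather than the exact model fitted in the study:

```latex
v(k) = v_f\left(1 - \frac{k}{k_j}\right), \qquad
q = k\,v(k) = v_f\,k\left(1 - \frac{k}{k_j}\right),
```

where v_f is the free-flow speed and k_j the jam density; v(k_j) = 0 and the flow q peaks at k = k_j/2.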
51 Very-high-Precision Normalized Eigenfunctions for a Class of Schrödinger Type Equations
Authors: Amna Noreen, Kare Olaussen
Abstract:
We demonstrate that it is possible to compute wave function normalization constants for a class of Schrödinger-type equations by an algorithm which scales linearly (in the number of eigenfunction evaluations) with the desired precision P in decimals.
Keywords: Eigenvalue problems, bound states, trapezoidal rule, Poisson resummation.
50 Critical Assessment of Scoring Schemes for Protein-Protein Docking Predictions
Authors: Dhananjay C. Joshi, Jung-Hsin Lin
Abstract:
Protein-protein interactions (PPI) play a crucial role in many biological processes such as cell signalling, transcription, translation, replication, signal transduction, and drug targeting. Structural information about protein-protein interactions is essential for understanding the molecular mechanisms of these processes. Structures of protein-protein complexes are still difficult to obtain by biophysical methods such as NMR and X-ray crystallography, and therefore protein-protein docking computation is considered an important approach for understanding protein-protein interactions. However, reliable prediction of protein-protein complexes is still under way. In the past decades, several grid-based docking algorithms based on the Katchalski-Katzir scoring scheme were developed, e.g., FTDock, ZDOCK, HADDOCK, RosettaDock, HEX, etc. However, the success rate of protein-protein docking prediction is still far from ideal. In this work, we first propose a more practical measure for evaluating the success of protein-protein docking predictions, the rate of first success (RFS), which is similar to the concept of mean first passage time (MFPT). Accordingly, we have assessed the ZDOCK bound and unbound benchmarks 2.0 and 3.0. We also created a new benchmark set for protein-protein docking predictions in which the complexes have experimentally determined binding affinity data. We performed free energy calculations based on the solution of the non-linear Poisson-Boltzmann equation (nlPBE) to improve the binding mode prediction. We used the well-studied barnase-barstar system to validate the parameters for the free energy calculations. In addition, nlPBE-based free energy calculations were conducted for the cases badly predicted by ZDOCK and ZRANK. We found that direct molecular mechanics energetics cannot be used to discriminate the native binding pose from the decoys. Our results indicate that nlPBE-based calculations appear to be one of the promising approaches for improving the success rate of binding pose predictions.
Keywords: Protein-protein docking, protein-protein interaction, molecular mechanics energetics, Poisson-Boltzmann calculations.
49 Challenges and Opportunities of E-Procurement in the Construction Industry
Authors: Mansur Hamma-adama, Abdul-Basit Sa’eed Ahmad
Abstract:
The construction industry is evolving amid the fourth industrial revolution. Transportation, commerce, manufacturing and many other industries have embraced current technological advancements and are striving to utilise every development in the IT sector. The procurement of construction works is known to be very conventional and slow in adopting digitalisation. The construction industry's procurement and supply chain are blamed for much of the inflated cost of construction projects, attributed mainly to a lack of transparency and trust between the industry stakeholders. This research explores the challenges of e-procurement adoption in the industry and identifies potential opportunities for its use. The data for this investigation were acquired through interviews and analysed using qualitative content analysis. The study reveals compounding challenges (i.e., corruption and lack of commitment) that lead to the failure of such efforts in Nigeria, as well as potential prospects (i.e., transparency and efficiency). This study is essential for developing a more effective and transparent procurement process so that the Nigerian construction industry is not left behind in the fast-digitalising markets.
Keywords: Challenges, construction industry, corruption, e-procurement, Nigeria, opportunities.
48 Fuzzy Estimation of Parameters in Statistical Models
Authors: A. Falsafain, S. M. Taheri, M. Mashinchi
Abstract:
Using a set of confidence intervals, we develop a common approach to constructing a fuzzy set as an estimator for unknown parameters in statistical models. We investigate a method to derive the explicit and unique membership function of such fuzzy estimators. The proposed method has been used to derive the fuzzy estimators of the parameters of a normal distribution and some functions of the parameters of two normal distributions, as well as the parameters of the exponential and Poisson distributions.
Keywords: Confidence interval, fuzzy number, fuzzy estimation.
47 Mathematical Modeling of the Influence of Hydrothermal Processes in the Water Reservoir
Authors: Alibek Issakhov
Abstract:
In this paper presents the mathematical model of hydrothermal processes in thermal power plant with different wind direction scenarios in the water reservoir, which is solved by the Navier - Stokes and temperature equations for an incompressible fluid in a stratified medium. Numerical algorithm based on the method of splitting by physical parameters. Three dimensional Poisson equation is solved with Fourier method by combination of tridiagonal matrix method (Thomas algorithm).Keywords: thermal power plant, hydrothermal process, large eddy simulation, water reservoir
46 Propagation of Nonlinear Surface Waves in Relativistically Degenerate Quantum Plasma Half-Space
Authors: Swarniv Chandra, Parthasona Maji, Basudev Ghosh
Abstract:
The nonlinear self-interaction of an electrostatic surface wave on a semi-bounded quantum plasma with relativistic degeneracy is investigated using the quantum hydrodynamic (QHD) model and Poisson's equation with appropriate boundary conditions. It is shown that a part of the second harmonic generated through self-interaction does not have a true surface-wave character but propagates obliquely away from the plasma-vacuum interface into the bulk of the plasma.
Keywords: Harmonic Generation, Quantum Plasma, Quantum Hydrodynamic Model, Relativistic Degeneracy, Surface waves.
45 Statistical Modeling of Constituents in Ash Evolved From Pulverized Coal Combustion
Authors: Esam Jassim
Abstract:
Industries using conventional fossil fuels have an interest in better understanding the mechanism of particulate formation during combustion, since it is responsible for the emission of undesired inorganic elements that directly affect the level of atmospheric pollution. Fine and ultrafine particulates tend to escape the flue gas cleaning devices into the atmosphere. They also preferentially collect on surfaces in power systems, increasing the tendency toward corrosion, reducing heat transfer in the thermal unit, and severely affecting human health. These adverse effects are manifested particularly in regions of the world where coal is the dominant source of energy. This study highlights the behavior of calcium transformation as mineral grains versus organically associated inorganic components during pulverized coal combustion. The influence of the type of calcium present on the coarse, fine and ultrafine mode formation mechanisms is also presented. The impact of two sub-bituminous coals on particle size and calcium composition evolution during combustion is assessed. Three mixed blends, named Blends 1, 2, and 3, are selected according to the ratio of coal A to coal B by weight; the calcium percentage in the original coal increases from Blend 1 to Blend 3. A mathematical model and a new approach to describing constituent distribution are proposed. The analysis of the experiments on calcium distribution in ash is also modeled using the Poisson distribution. A novel parameter, called the elemental index λ, is introduced as a measure of element distribution. Results show that calcium present in the ash that was originally in the coal as mineral grains has an index of 17, whereas organically associated calcium transformed to fly ash is best described by an elemental index λ of 7. As an alkaline-earth element, calcium is considered the fundamental element responsible for boiler deficiency, since it is the major player in the mechanism of the ash slagging process. The mechanism of particle size distribution and the mineral species of the ash particles are presented using CCSEM and size-segregated ash characteristics. Conclusions are drawn from the analysis of pulverized coal ash generated from a utility-scale boiler.
Keywords: Calcium transformation, Coal Combustion, Inorganic Element, Poisson distribution.
44 Network-Constrained AC Unit Commitment under Uncertainty Using a Bender’s Decomposition Approach
Authors: B. Janani, S. Thiruvenkadam
Abstract:
In this work, the impact of adopting a stochastic approach to the day-ahead unit commitment is evaluated. Comparisons between stochastic and deterministic unit commitment solutions are provided. The unit commitment model consists in minimizing the total operation costs while considering the units' technical constraints, such as ramping rates and minimum up and down times. Load shedding and wind power spilling are acceptable, but at inflated operational costs. The evaluation process consists in calculating the optimal unit commitment and verifying the fulfillment of the considered constraints. For the calculation of the optimal unit commitment, an algorithm based on Benders' decomposition, namely on dual dynamic programming, was developed. Two approaches were considered in the construction of stochastic solutions. Data on wind power outputs from two different operational days are considered in the analysis. Stochastic and deterministic solutions are compared against the actual measured wind power output on the operational day. Through a technique capable of finding representative wind power scenarios and their probabilities, the expected final operational cost can be analyzed in greater detail.
Keywords: Benders’ decomposition, network constrained AC unit commitment, stochastic programming, wind power uncertainty.
43 Elasto-Plastic Behavior of Rock during Temperature Drop
Authors: N. Reppas, Y. L. Gui, B. Wetenhall, C. T. Davie, J. Ma
Abstract:
A theoretical constitutive model describing the stress-strain behavior of rock subjected to different confining pressures is presented. A bounding surface plasticity model with hardening effects is proposed, which includes the effect of a temperature drop. The bounding surface is based on a mapping rule, and the temperature effect on the rock is controlled by Poisson's ratio. Validation of the results against available experimental data is also presented. The relation between deviatoric stress and axial strain is illustrated at different temperatures to analyze the effect of a temperature decrease on the stiffness of the material.
Keywords: Bounding surface, cooling of rock, plasticity model, rock deformation, elasto-plastic behavior.
42 Temperature Effect on the Mechanical Properties of Pd3Rh and PdRh3 Ordered Alloys
Authors: J. Davoodi, J. Moradi
Abstract:
The aim of this research was to calculate the mechanical properties of Pd3Rh and PdRh3 ordered alloys. The molecular dynamics (MD) simulation technique was used to obtain the temperature dependence of the energy, the Young's modulus, the shear modulus, the bulk modulus, Poisson's ratio and the elastic stiffness constants in the isobaric-isothermal (NPT) ensemble over the range 100-325 K. The interatomic potential energy and the forces on the atoms were calculated with the Quantum Sutton-Chen (Q-SC) many-body potential. Our MD simulation results show the effect of temperature on the cohesive energy and mechanical properties of the Pd3Rh and PdRh3 alloys. Our computed results show good agreement with the available experimental results.
Keywords: Pd-Rh alloy, mechanical properties, molecular dynamics simulation.
41 Transient Analysis of a Single-Server Queue with Fixed-Size Batch Arrivals
Authors: Vitalice K. Oduol, C. Ardil
Abstract:
The transient analysis of a queuing system with fixed-size batch Poisson arrivals and a single server with exponential service times is presented. The focus of the paper is on the use of the functions that arise in the analysis of the transient behaviour of the queuing system. These functions are shown to be a generalization of the modified Bessel functions of the first kind, with the batch size B as the generalizing parameter. Results for the case of single-packet arrivals are obtained first. The similarities between the two families of functions are then used to obtain results for the general case of batch arrival queue with a batch size larger than one.
Keywords: Batch arrivals, generalized Bessel functions, queue transient analysis, time-varying probabilities.
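For reference, the modified Bessel functions of the first kind that the paper generalizes are

```latex
I_n(t) = \sum_{k=0}^{\infty} \frac{(t/2)^{\,n+2k}}{k!\,(n+k)!},
```

and, per the abstract, the batch-arrival analysis replaces these with a family indexed by the batch size B that reduces to the functions above in the single-packet case.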
40 Sub-Impact Phenomenon of Elasto-Plastic Free-Free Beam during a Strike
Authors: H. Rong, X. C. Yin, J. Yang, Y. N. Shen
Abstract:
Based on Rayleigh beam theory, the sub-impacts of a free-free beam struck horizontally by a round-nosed rigid mass are simulated using the finite difference method and impact-separation conditions. In order to obtain the sub-impact force, a uniaxial compression elastic-plastic contact model is employed to analyze the local deformation field in the contact zone. It is found that the horizontal impact is a complicated process comprising a sequence of elastic-plastic sub-impacts, and that there are two sub-zones of sub-impact. In addition, it is found that the elastic energy of the free-free beam is better suited to the Poisson collision hypothesis for explaining the compression and recovery processes.
Keywords: Beam, sub-impact, elastic-plastic deformation, finite difference method.
39 Optimal Production and Maintenance Policy for a Partially Observable Production System with Stochastic Demand
Authors: Leila Jafari, Viliam Makis
Abstract:
In this paper, the joint optimization of the economic manufacturing quantity (EMQ), safety stock level, and condition-based maintenance (CBM) is presented for a partially observable, deteriorating system subject to random failure. The demand is stochastic and is described by a Poisson process. The stochastic model is developed and the optimization problem is formulated in the semi-Markov decision process framework. A modification of the policy iteration algorithm is developed to find the optimal policy. A numerical example is presented to compare the optimal policy with a policy assuming zero safety stock.
Keywords: Condition-based maintenance, economic manufacturing quantity, safety stock, stochastic demand.