Search results for: filtered Poisson process
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5533

5503 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups

Authors: Naushad Mamode Khan

Abstract:

The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs) and handles equi-, over- and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify, which restricts likelihood-based estimation. The joint generalized quasi-likelihood approach (GQL-I) was considered instead, but it is rather computationally intensive and may even fail to estimate the regression effects because of a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments and is shown to yield estimates as efficient as GQL-I while being far more computationally stable.

Keywords: Longitudinal, Com-Poisson, Ill-conditioned, INAR(1), GLMs, GQL.

5502 A Comparison of Marginal and Joint Generalized Quasi-likelihood Estimating Equations Based On the Com-Poisson GLM: Application to Car Breakdowns Data

Authors: N. Mamode Khan, V. Jowaheer

Abstract:

In this paper, we apply and compare two generalized estimating equation approaches to the analysis of car breakdown data in Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim to estimate the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over one year. We find that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and the joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
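
As an illustration of the distribution underlying this analysis (not the authors' code), the sketch below evaluates the Com-Poisson probability mass function and shows numerically that a dispersion parameter ν > 1 yields under-dispersion; the parameter values and series truncation are illustrative assumptions.

```python
import numpy as np
from scipy.special import gammaln

def cmp_pmf(y, lam, nu, max_terms=200):
    """Com-Poisson pmf P(Y=y) = lam^y / ((y!)^nu * Z(lam, nu)).

    The infinite normalizing series Z is truncated at `max_terms`,
    which is adequate for moderate lam (an illustrative assumption).
    """
    j = np.arange(max_terms)
    log_terms = j * np.log(lam) - nu * gammaln(j + 1)
    log_z = np.logaddexp.reduce(log_terms)
    return np.exp(y * np.log(lam) - nu * gammaln(y + 1) - log_z)

# Illustrative parameters: nu > 1 gives under-dispersion (variance < mean).
lam, nu = 3.0, 2.14
y = np.arange(60)
p = cmp_pmf(y, lam, nu)
mean = np.sum(y * p)
var = np.sum((y - mean) ** 2 * p)
print(f"mean = {mean:.3f}, variance = {var:.3f}")   # variance < mean
```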

Keywords: Breakdowns, under-dispersion, Com-Poisson, generalized linear model, marginal quasi-likelihood estimation, joint quasi-likelihood estimation.

5501 Effects of Various Wavelet Transforms in Dynamic Analysis of Structures

Authors: Seyed Sadegh Naseralavi, Sadegh Balaghi, Ehsan Khojastehfar

Abstract:

Time history dynamic analysis of structures is considered an exact method but is computationally intensive. Filtering earthquake strong ground motions with a wavelet transform is one approach to reducing the computational effort, particularly in the optimization of structures against seismic effects. Wavelet transforms are categorized into continuous and discrete transforms. Since an earthquake strong ground motion record is a discrete function, the discrete wavelet transform is applied in the present paper. The wavelet transform reduces analysis time by filtering out non-effective frequencies of the strong ground motion. The filtering process may be repeated several times, although each approximation introduces additional error. In this paper, the earthquake strong ground motion is filtered once with each wavelet. The strong ground motion of the Northridge earthquake is filtered with various wavelets, and dynamic analysis of sample shear and moment frames is carried out. The error associated with each wavelet is computed by comparing the dynamic response of the sample structures with the exact responses, which are obtained by dynamic analysis of the structures using the non-filtered strong ground motion.
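
The filtering step can be illustrated with a minimal discrete wavelet transform sketch in Python using PyWavelets; the synthetic record, wavelet names and the error measure (computed on the record itself rather than on the structural response) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import pywt

def wavelet_filter(accel, wavelet="db4", level=1):
    """Low-pass filter a ground-motion record by one DWT decomposition:
    keep the approximation coefficients and zero the detail coefficients."""
    coeffs = pywt.wavedec(accel, wavelet, level=level)
    filtered = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    rec = pywt.waverec(filtered, wavelet)
    return rec[: len(accel)]   # trim possible padding

# Illustrative synthetic record standing in for the Northridge accelerogram.
t = np.linspace(0.0, 40.0, 2000)
accel = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.random.randn(t.size)

for wname in ["haar", "db4", "sym8"]:
    filt = wavelet_filter(accel, wavelet=wname)
    err = np.linalg.norm(accel - filt) / np.linalg.norm(accel)
    print(f"{wname:5s}: relative filtering error = {err:.3f}")
```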

Keywords: Wavelet transform, computational error, computational duration, strong ground motion data.

5500 Air Pollution and Respiratory-Related Restricted Activity Days in Tunisia

Authors: Mokhtar Kouki Inès Rekik

Abstract:

This paper focuses on assessing the relationship between air pollution and morbidity in Tunisia. Air pollution is measured by the ozone air concentration, and morbidity is measured by the number of respiratory-related restricted activity days during the two-week period prior to the interview. Socioeconomic data are also collected in order to adjust for confounding covariates. Our sample is composed of 407 Tunisian respondents; 44.7% are women, the average age is 35.2, nearly 69% live in a house built after 1980, and 27.8% reported at least one day of respiratory-related restricted activity. The model consists of regressing the number of respiratory-related restricted activity days on the air quality measure and the socioeconomic covariates. In order to account for zero-inflation and heterogeneity, we estimate several models (Poisson, negative binomial, zero-inflated Poisson, Poisson hurdle, negative binomial hurdle and finite mixture Poisson models). Bootstrapping and post-stratification techniques are used to correct for sample bias. According to the Akaike information criterion, the hurdle negative binomial model has the best goodness of fit. The main result indicates that, after adjusting for socioeconomic covariates, the ozone concentration increases the probability of a positive number of restricted activity days.
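
A minimal sketch of the hurdle idea, assuming hypothetical data and using a zero-truncated Poisson count part for brevity (the paper's preferred specification is a hurdle negative binomial, which follows the same two-part structure):

```python
import numpy as np
import statsmodels.api as sm
from scipy.optimize import minimize
from scipy.special import gammaln

# Hypothetical data: y = restricted activity days, X = ozone + one covariate.
rng = np.random.default_rng(0)
n = 407
X = sm.add_constant(rng.normal(size=(n, 2)))        # [const, ozone, age]
y = rng.poisson(1.2, size=n) * rng.binomial(1, 0.3, size=n)

# Part 1: logistic model for P(y > 0).
zero_part = sm.Logit((y > 0).astype(int), X).fit(disp=0)

# Part 2: zero-truncated Poisson for the positive counts.
def ztp_negloglik(beta, X, y):
    mu = np.exp(X @ beta)
    # log P(Y=y | Y>0) = y*log(mu) - mu - log(y!) - log(1 - exp(-mu))
    return -np.sum(y * np.log(mu) - mu - gammaln(y + 1)
                   - np.log1p(-np.exp(-mu)))

pos = y > 0
res = minimize(ztp_negloglik, x0=np.zeros(X.shape[1]),
               args=(X[pos], y[pos]), method="BFGS")
print("hurdle zero part:", zero_part.params.round(3))
print("hurdle count part:", res.x.round(3))
```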

Keywords: Bootstrapping, hurdle negbin model, overdispersion, ozone concentration, respiratory-related restricted activity days.

5499 Zero Inflated Models for Overdispersed Count Data

Authors: Y. N. Phang, E. F. Loh

Abstract:

Zero-inflated models are usually used for modeling count data with excess zeros, where the excess zeros may be structural zeros or zeros that occur by chance. These types of data are commonly found in various disciplines such as finance, insurance, biomedicine, econometrics, ecology, and the health sciences, including sexual health and dental epidemiology. The most popular zero-inflated models used by many researchers are the zero-inflated Poisson and zero-inflated negative binomial models. In addition, zero-inflated generalized Poisson and zero-inflated double Poisson models are also discussed in some of the literature. Recently, the zero-inflated inverse trinomial and zero-inflated strict arcsine models have been advocated and shown to serve as alternative models for over-dispersed count data caused by excessive zeros and unobserved heterogeneity. The purpose of this paper is to review the related literature and provide a variety of examples from different disciplines of the application of zero-inflated models. Different model selection methods used in model comparison are discussed.
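
A minimal sketch of the model-comparison step for one of the models reviewed, fitting a zero-inflated Poisson against an ordinary Poisson on hypothetical data and comparing AIC/BIC with statsmodels (data and settings are illustrative assumptions):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# Hypothetical zero-inflated counts: a share of structural zeros plus Poisson counts.
rng = np.random.default_rng(1)
n = 500
x = rng.normal(size=n)
X = sm.add_constant(x)
structural_zero = rng.binomial(1, 0.4, size=n)        # 40% excess zeros
y = np.where(structural_zero == 1, 0, rng.poisson(np.exp(0.5 + 0.8 * x)))

poisson_fit = sm.Poisson(y, X).fit(disp=0)
zip_fit = ZeroInflatedPoisson(y, X, exog_infl=np.ones((n, 1))).fit(method="bfgs",
                                                                   maxiter=200, disp=0)

# Smaller AIC/BIC indicates the better-supported model for these data.
print(f"Poisson AIC={poisson_fit.aic:.1f}  BIC={poisson_fit.bic:.1f}")
print(f"ZIP     AIC={zip_fit.aic:.1f}  BIC={zip_fit.bic:.1f}")
```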

Keywords: Overdispersed count data, model selection methods, likelihood ratio, AIC, BIC.

5498 Dengue Disease Mapping with Standardized Morbidity Ratio and Poisson-gamma Model: An Analysis of Dengue Disease in Perak, Malaysia

Authors: N. A. Samat, S. H. Mohd Imam Ma’arof

Abstract:

Dengue disease is an infectious vector-borne viral disease commonly found in tropical and sub-tropical regions around the world, especially in urban and semi-urban areas, including Malaysia. There is currently no available vaccine or chemotherapy for the prevention or treatment of dengue disease, so prevention and treatment depend on vector surveillance and control measures. Disease risk mapping has been recognized as an important tool in prevention and control strategies. The choice of statistical model used for relative risk estimation is important, as a good model will produce a good disease risk map. The aim of this study is therefore to estimate the relative risk of dengue disease based initially on the most common statistic used in disease mapping, the Standardized Morbidity Ratio (SMR), and on one of the earliest applications of Bayesian methodology, the Poisson-gamma model. This paper begins with a review of the SMR method, which we then apply to dengue data from Perak, Malaysia. We then fit an extension of the SMR method, the Poisson-gamma model. Both sets of results are displayed and compared using graphs, tables and maps. The results of the analysis show that the latter method gives better relative risk estimates than the SMR. The Poisson-gamma model is demonstrated to overcome the problem of the SMR when there are no observed dengue cases in certain regions. However, covariate adjustment in this model is difficult, and there is no possibility of allowing for spatial correlation between risks in adjacent areas. The drawbacks of this model have motivated many researchers to propose other alternative methods for estimating the risk.
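
A minimal numerical sketch, with hypothetical observed and expected counts, of how the SMR and the Poisson-gamma (empirical Bayes) relative risk estimates differ, in particular for regions with zero observed cases; the prior values are illustrative assumptions.

```python
import numpy as np

# Hypothetical district-level counts: observed and expected dengue cases.
observed = np.array([12, 0, 7, 25, 3])
expected = np.array([8.4, 2.1, 6.9, 18.0, 4.5])

# Standardized Morbidity Ratio: unstable when observed = 0 or expected is small.
smr = observed / expected

# Poisson-gamma: O_i ~ Poisson(E_i * theta_i), theta_i ~ Gamma(a, b),
# so the posterior mean relative risk is (O_i + a) / (E_i + b).
a, b = 2.0, 2.0          # illustrative prior values; in practice estimated from the data
rr_pg = (observed + a) / (expected + b)

for o, e, s, r in zip(observed, expected, smr, rr_pg):
    print(f"O={o:3d}  E={e:5.1f}  SMR={s:5.2f}  Poisson-gamma RR={r:5.2f}")
```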

Keywords: Dengue disease, Disease mapping, Standardized Morbidity Ratio, Poisson-gamma model, Relative risk.

5497 Parallel Image Compression and Analysis with Wavelets

Authors: M. Kutila, J. Viitanen

Abstract:

This paper presents image compression with a wavelet-based method. The wavelet transformation divides an image into low- and high-pass filtered parts. The traditional JPEG compression technique requires lower computation power with acceptable losses when only compression is needed. However, there is an obvious need for wavelet-based methods in certain circumstances. These methods are intended for applications in which image analysis is done in parallel with compression. Furthermore, the high-frequency bands can be used to detect changes or edges. Wavelets enable hierarchical analysis of the low-pass filtered sub-images: the first analysis can be done on a small image, and only if anything interesting is found is the whole image processed or reconstructed.
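
A minimal sketch of the one-level 2-D wavelet decomposition described above, using PyWavelets on a hypothetical image (the wavelet choice and the image are illustrative assumptions, not the authors' implementation):

```python
import numpy as np
import pywt

# Hypothetical 8-bit grayscale image standing in for the input.
rng = np.random.default_rng(2)
image = rng.integers(0, 256, size=(256, 256)).astype(float)

# One level of the 2-D DWT: cA is the low-pass (approximation) sub-image,
# cH/cV/cD are the high-pass (detail) sub-bands useful for edge/change detection.
cA, (cH, cV, cD) = pywt.dwt2(image, "haar")

# Hierarchical analysis: inspect the small low-pass sub-image first,
# and reconstruct the full image only if something interesting is found.
print("approximation shape:", cA.shape)        # (128, 128)
reconstructed = pywt.idwt2((cA, (cH, cV, cD)), "haar")
print("max reconstruction error:", np.abs(reconstructed - image).max())
```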

Keywords: Image compression, JPEG, wavelet, VLC.

5496 FILMS based ANC System – Evaluation and Practical Implementation

Authors: Branislav Vuksanović, Dragana Nikolić

Abstract:

This paper describes the implementation and testing of a multichannel active noise control system (ANCS) based on the filtered-inverse LMS (FILMS) algorithm. The FILMS algorithm is derived from the well-known filtered-x LMS (FXLMS) algorithm, with the aim of improving the rate of convergence of the multichannel FXLMS algorithm and reducing its computational load. The laboratory setup and the techniques used to implement this system efficiently are described. Experiments performed to test the performance of the FILMS algorithm are discussed and the obtained results are presented.
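
For orientation, a minimal single-channel NumPy sketch of the baseline filtered-x LMS update from which FILMS is derived (the primary and secondary paths, filter length and step size are hypothetical; the FILMS modification with inverse filters is not shown here):

```python
import numpy as np

rng = np.random.default_rng(3)
N = 20000
x = rng.normal(size=N)                 # reference (noise) signal
p = np.array([0.0, 0.6, 0.3, 0.1])     # hypothetical primary acoustic path
s = np.array([0.5, 0.25])              # hypothetical secondary path
s_hat = s.copy()                       # assume a perfect secondary-path estimate

L, mu = 16, 0.002                      # adaptive filter length and step size
w = np.zeros(L)                        # control filter weights
d = np.convolve(x, p)[:N]              # disturbance at the error microphone

x_buf = np.zeros(L)                    # reference history (control filter input)
fx_buf = np.zeros(L)                   # filtered-reference history (for the update)
y_buf = np.zeros(s.size)               # control-output history through the secondary path
err = np.zeros(N)

for n in range(N):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = x[n]
    y_buf = np.roll(y_buf, 1)
    y_buf[0] = w @ x_buf               # anti-noise sample
    e = d[n] - s @ y_buf               # residual measured at the error microphone
    fx_buf = np.roll(fx_buf, 1)
    fx_buf[0] = s_hat @ x_buf[:s_hat.size]   # reference filtered by the path estimate
    w += mu * e * fx_buf               # FXLMS weight update
    err[n] = e

print("MSE first/last 1000 samples:",
      round(np.mean(err[:1000] ** 2), 4), round(np.mean(err[-1000:] ** 2), 4))
```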

Keywords: Active noise control, adaptive filters, inverse filters, LMS algorithm, FILMS algorithm.

5495 Analyzing the Factors Influencing Exclusive Breastfeeding Using the Generalized Poisson Regression Model

Authors: Cheika Jahangeer, Naushad Mamode Khan, Maleika Heenaye-Mamode Khan

Abstract:

Exclusive breastfeeding is the feeding of a baby on no milk other than breast milk. Exclusive breastfeeding during the first six months of life is of fundamental importance because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and health problems. Moreover, in developed countries, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. In this paper, we study the factors that influence exclusive breastfeeding and use the Generalized Poisson regression model to analyze the practice of exclusive breastfeeding in Mauritius. We develop two sets of quasi-likelihood equations (QLE) to estimate the parameters.
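
A minimal sketch of fitting a Generalized Poisson regression with statsmodels on hypothetical survey-style data (variable names and values are illustrative assumptions, not the study data):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.discrete_model import GeneralizedPoisson

# Hypothetical survey data: y = months of exclusive breastfeeding (count),
# X = intercept, maternal age, employment indicator.
rng = np.random.default_rng(4)
n = 300
age = rng.normal(28, 5, size=n)
employed = rng.binomial(1, 0.6, size=n)
X = sm.add_constant(np.column_stack([age, employed]))
y = rng.poisson(np.exp(0.2 + 0.02 * age - 0.3 * employed))

# Generalized Poisson regression; the extra dispersion parameter alpha
# allows both over- and under-dispersion relative to the Poisson model.
gp_fit = GeneralizedPoisson(y, X).fit(disp=0)
print(gp_fit.summary())
```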

Keywords: Exclusive breastfeeding, Regression model, Quasi-likelihood.

5494 Comparison of Stochastic Point Process Models of Rainfall in Singapore

Authors: Y. Lu, X. S. Qin

Abstract:

Extensive rainfall disaggregation approaches have been developed and applied in climate change impact studies such as flood risk assessment and urban storm water management. In this study, five rainfall models capable of disaggregating daily rainfall data into hourly data were investigated for the rainfall record at Changi Airport, Singapore. The objectives of this study were (i) to study the temporal characteristics of hourly rainfall in Singapore, and (ii) to evaluate the performance of various disaggregation models. The models used included: (i) the Rectangular Pulse Poisson Model (RPPM), (ii) the Bartlett-Lewis Rectangular Pulse Model (BLRPM), (iii) the Bartlett-Lewis model with 2 cell types (BL2C), (iv) the Bartlett-Lewis Rectangular model with cell depth distribution dependent on duration (BLRD), and (v) the Neyman-Scott Rectangular Pulse Model (NSRPM). All of these models were fitted using hourly rainfall data from 1980 to 2005 obtained from the Changi meteorological station. The study results indicated that a weighting scheme with weights inversely proportional to the variance could deliver more accurate outputs for fitting rainfall patterns in tropical areas, and that BLRPM performed relatively better than the other disaggregation models.

Keywords: Rainfall disaggregation, statistical properties, Poisson process, Bartlett-Lewis model, Neyman-Scott model.

5493 Modelling Dengue Fever (DF) and Dengue Haemorrhagic Fever (DHF) Outbreak Using Poisson and Negative Binomial Model

Authors: W. Y. Wan Fairos, W. H. Wan Azaki, L. Mohamad Alias, Y. Bee Wah

Abstract:

Dengue fever has become a major concern for health authorities all over the world, particularly in tropical countries, which are experiencing the most worrying outbreaks of dengue fever (DF) and dengue haemorrhagic fever (DHF). The DF and DHF epidemics have thus become main causes of hospital admissions and deaths in Malaysia. This paper therefore attempts to examine the environmental factors that may influence the recent dengue outbreak. The aim of this study is twofold: firstly, to establish a statistical model describing the relationship between the number of dengue cases and a range of explanatory variables, and secondly, to identify the lags of the explanatory variables that affect dengue incidence the most. The explanatory variables include the level of cloud cover, percentage of relative humidity, amount of rainfall, maximum temperature, minimum temperature and wind speed. Poisson and Negative Binomial regression analyses were used in this study. The results of the analyses of 915 observations (daily data from July 2006 to December 2008) reveal that the climatic factors comprising daily temperature and wind speed significantly influence the incidence of dengue fever after 2 and 3 weeks of their occurrence. The effect of humidity, on the other hand, appears to be significant only after 2 weeks.
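
A minimal sketch, on hypothetical daily series, of how lagged climatic covariates can be entered into Poisson and negative binomial regressions with statsmodels (the 14- and 21-day lags mirror the 2- and 3-week lags discussed above; all data are simulated placeholders):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical daily series standing in for the 915-day study period.
rng = np.random.default_rng(5)
n = 915
df = pd.DataFrame({
    "cases": rng.poisson(5, size=n),
    "max_temp": rng.normal(32, 2, size=n),
    "wind": rng.gamma(2.0, 1.5, size=n),
    "humidity": rng.uniform(60, 95, size=n),
})

# Lag the weather covariates by 2 and 3 weeks (14 and 21 days).
for col in ["max_temp", "wind", "humidity"]:
    df[f"{col}_lag14"] = df[col].shift(14)
    df[f"{col}_lag21"] = df[col].shift(21)
df = df.dropna()

X = sm.add_constant(df[[c for c in df.columns if "lag" in c]])
poisson_fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
nb_fit = sm.GLM(df["cases"], X, family=sm.families.NegativeBinomial()).fit()
print(poisson_fit.summary())
print("Poisson AIC:", round(poisson_fit.aic, 1), " NB AIC:", round(nb_fit.aic, 1))
```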

Keywords: Dengue Fever, Dengue Hemorrhagic Fever, Negative Binomial Regression model, Poisson Regression model.

5492 Performance Analysis of 5G for Low Latency Transmission Based on Universal Filtered Multi-Carrier Technique and Interleave Division Multiple Access

Authors: A. Asgharzadeh, M. Maroufi

Abstract:

The 5G mobile communication system has drawn more and more attention. The 5G system needs to provide three different types of services: enhanced Mobile BroadBand (eMBB), massive machine-type communication (mMTC), and ultra-reliable and low-latency communication (URLLC). Universal Filtered Multi-Carrier (UFMC), Filter Bank Multicarrier (FBMC), and Filtered Orthogonal Frequency Division Multiplexing (f-OFDM) are well-known candidate waveforms for the coming 5G system. Machine-to-machine (M2M) communication is one of the essential applications in 5G, and it involves exchanging concise messages with a very short latency. In UFMC systems the subcarriers are grouped into subbands, whereas in f-OFDM a single subband covers the entire band. Furthermore, in FBMC a subband includes only one subcarrier, so the number of subbands equals the number of subcarriers. This paper mainly discusses the performance of UFMC with different parameters. It also shows that UFMC is the best choice, outperforming OFDM in every case and FBMC for very short packets, while performing similarly to FBMC for long sequences, when channel estimation techniques are used in Interleave Division Multiple Access (IDMA) systems.

Keywords: UFMC, IDMA, 5G, subband.

5491 A Study on Exclusive Breastfeeding using Over-dispersed Statistical Models

Authors: Naushad Mamode Khan, Cheika Jahangeer, Maleika Heenaye-Mamode Khan

Abstract:

Breastfeeding is an important part of maternal life. In this paper, we focus on exclusive breastfeeding, the feeding of a baby on no milk other than breast milk. This type of breastfeeding is very important during the first six months because it supports optimal growth and development during infancy and reduces the risk of debilitating diseases and health problems. Moreover, in Mauritius, exclusive breastfeeding has decreased the incidence and/or severity of diarrhea, lower respiratory infection and urinary tract infection. In this paper, we give an overview of exclusive breastfeeding in Mauritius and the factors influencing it. We further analyze the local practice of exclusive breastfeeding using the Generalized Poisson regression model and the negative binomial model, since the data are over-dispersed.

Keywords: Exclusive breastfeeding, regression model, generalized Poisson, negative binomial.

5490 Reliability Modeling and Data Analysis of Vacuum Circuit Breaker Subject to Random Shocks

Authors: Rafik Medjoudj, Rabah Medjoudj, D. Aissani

Abstract:

Electrical substation components are often subject to degradation due to over-voltage or over-current caused by a short circuit or lightning. Particular interest is given to the circuit breaker, in view of the importance of its function and the danger of its failure. This component degrades gradually with use, and it is also subject to a shock process resulting from the stress of isolating the fault when a short circuit occurs in the system. In this paper, based on the development of failure mechanisms, the wear of the circuit breaker contacts is modeled. The aim of this work is to evaluate its reliability and consequently its residual lifetime. The shock process is based on two random variables: the arrival times of shocks and their magnitudes. The arrival of shocks was modeled using a homogeneous Poisson process (HPP). By simulation, the dates of short-circuit arrivals were generated together with their magnitudes. The same simulation principle is applied to the amount of cumulative contact wear. The objective is to formulate the wear function as a function of the number of solicitations of the circuit breaker.
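
A minimal simulation sketch of the shock model: homogeneous Poisson arrivals of short circuits with random magnitudes accumulated through a wear function (the rate, magnitude distribution, wear law and wear limit are all illustrative assumptions, not the paper's fitted values).

```python
import numpy as np

rng = np.random.default_rng(6)

# Homogeneous Poisson process for short-circuit arrivals:
# exponential inter-arrival times with rate lam (events per year).
lam = 4.0                     # illustrative short-circuit rate
horizon = 30.0                # years of simulated operation
inter_arrivals = rng.exponential(1.0 / lam, size=500)
arrival_times = np.cumsum(inter_arrivals)
arrival_times = arrival_times[arrival_times <= horizon]

# Random shock magnitudes (illustrative lognormal fault currents, kA)
# converted to contact wear through a hypothetical wear function.
magnitudes = rng.lognormal(mean=2.0, sigma=0.5, size=arrival_times.size)
wear_per_shock = 1e-3 * magnitudes ** 1.5       # assumed wear law
cumulative_wear = np.cumsum(wear_per_shock)

wear_limit = 1.0                                # assumed admissible contact wear
if cumulative_wear.size and cumulative_wear[-1] >= wear_limit:
    k = int(np.argmax(cumulative_wear >= wear_limit))
    print(f"wear limit reached after {arrival_times[k]:.1f} years and {k + 1} shocks")
else:
    print("wear limit not reached within the simulated horizon")
```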

Keywords: reliability, short-circuit, models of shocks.

5489 Optimal Production and Maintenance Policy for a Partially Observable Production System with Stochastic Demand

Authors: Leila Jafari, Viliam Makis

Abstract:

In this paper, the joint optimization of the economic manufacturing quantity (EMQ), safety stock level, and condition-based maintenance (CBM) is presented for a partially observable, deteriorating system subject to random failure. The demand is stochastic and it is described by a Poisson process. The stochastic model is developed and the optimization problem is formulated in the semi-Markov decision process framework. A modification of the policy iteration algorithm is developed to find the optimal policy. A numerical example is presented to compare the optimal policy with the policy considering zero safety stock.

Keywords: Condition-based maintenance, economic manufacturing quantity, safety stock, stochastic demand.

5488 Structural Investigation of Na2O–B2O3–SiO2 Glasses Doped with NdF3

Authors: M. S. Gaafar, S. Y. Marzouk

Abstract:

Sodium borosilicate glasses doped with different NdF3 contents (mol %) were prepared by the rapid quenching method. Ultrasonic velocity measurements (both longitudinal and shear) were carried out at room temperature at an ultrasonic frequency of 4 MHz. Elastic moduli, Debye temperature, softening temperature and Poisson's ratio were obtained as functions of the NdF3 modifier content. Results showed that the elastic moduli, Debye temperature, softening temperature and Poisson's ratio change only very slightly with NdF3 content. Based on FTIR spectroscopy and the theoretical bond compression model, a quantitative analysis was carried out in order to obtain more information about the structure of these glasses. The study indicated that the structure of these glasses is mainly composed of SiO4 units with four bridging oxygens (Q4), and with three bridging and one non-bridging oxygen (Q3).
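
The elastic quantities mentioned above follow from the measured velocities and density through standard relations; a minimal sketch with purely illustrative input values (not the measured data) is given below.

```python
import numpy as np

def elastic_properties(v_l, v_s, rho):
    """Elastic moduli and Poisson's ratio from longitudinal (v_l) and
    shear (v_s) ultrasonic velocities (m/s) and density rho (kg/m^3)."""
    L = rho * v_l ** 2                       # longitudinal modulus
    G = rho * v_s ** 2                       # shear modulus
    K = L - (4.0 / 3.0) * G                  # bulk modulus
    sigma = (v_l ** 2 - 2 * v_s ** 2) / (2 * (v_l ** 2 - v_s ** 2))  # Poisson's ratio
    E = 2 * G * (1 + sigma)                  # Young's modulus
    return L, G, K, E, sigma

# Illustrative values only (not the glasses studied in the paper).
L, G, K, E, sigma = elastic_properties(v_l=5500.0, v_s=3200.0, rho=2600.0)
print(f"L = {L/1e9:.1f} GPa, G = {G/1e9:.1f} GPa, K = {K/1e9:.1f} GPa, "
      f"E = {E/1e9:.1f} GPa, Poisson's ratio = {sigma:.3f}")
```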

Keywords: Borosilicate glasses, ultrasonic velocity, elastic moduli, FTIR spectroscopy, bond compression model.

5487 Optimal Parameters of Double Moving Average Control Chart

Authors: Y. Areepong

Abstract:

The objective of this paper is to present explicit analytical formulas for evaluating important characteristics of the Double Moving Average (DMA) control chart for the Poisson distribution. The most popular characteristics of a control chart are the Average Run Length (ARL0), the mean number of observations taken before the system is signaled to be out-of-control when it is actually still in-control, and the Average Delay time (ARL1), the mean delay before a true alarm. An important property required of ARL0 is that it should be sufficiently large when the process is in-control, in order to reduce the number of false alarms. On the other hand, if the process is actually out-of-control, then ARL1 should be as small as possible. In particular, the explicit analytical formulas for evaluating ARL0 and ARL1 make it possible to obtain a set of optimal parameters, depending on the width of the moving average (w) and the width of the control limit (H), for designing a DMA chart with minimum ARL1.

Keywords: Optimal parameters, Average Run Length, Average Delay time, Double Moving Average chart.

5486 Mathematical Modeling of the Influence of Hydrothermal Processes in the Water Reservoir

Authors: Alibek Issakhov

Abstract:

This paper presents a mathematical model of the hydrothermal processes at a thermal power plant under different wind direction scenarios in the water reservoir, based on the Navier-Stokes and temperature equations for an incompressible fluid in a stratified medium. The numerical algorithm is based on the method of splitting by physical parameters. The three-dimensional Poisson equation is solved with a Fourier method combined with the tridiagonal matrix method (Thomas algorithm).
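
A minimal sketch of the tridiagonal matrix (Thomas) algorithm referred to above, checked here on a small one-dimensional Poisson problem (the full method in the paper combines it with a Fourier transform for the three-dimensional equation; the test system below is an illustrative assumption).

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c and right-hand side d (Thomas algorithm, O(n))."""
    n = len(d)
    cp = np.zeros(n)
    dp = np.zeros(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1-D Poisson equation u'' = f with Dirichlet boundaries as a toy check.
n, h = 50, 1.0 / 51
a = np.ones(n); b = -2.0 * np.ones(n); c = np.ones(n)
f = -np.ones(n) * h ** 2                     # constant source term
u = thomas(a, b, c, f)
print("max |residual|:", np.abs(np.diff(u, 2) - f[1:-1]).max())
```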

Keywords: thermal power plant, hydrothermal process, large eddy simulation, water reservoir

5485 Classification of Earthquake Distribution in the Banda Sea Collision Zone with Point Process Approach

Authors: Henry J. Wattimanela, Udjianna S. Pasaribu, Nanang T. Puspito, Sapto W. Indratno

Abstract:

The Banda Sea Collision Zone (BSCZ) is the result of the interaction and convergence of the Indo-Australian, Eurasian and Pacific plates. It is located in eastern Indonesia and has very high seismic activity. In this research, we calculate the rate (λ) and the Mean Square Error (MSE), and based on these results we classify the earthquake distribution in the BSCZ using the point process approach. A chi-square test is used to determine the type of earthquake distribution in each sub-region of the BSCZ. The data used in this research are earthquakes with magnitude ≥ 6 SR for the period 1964-2013, sourced from BMKG Jakarta. This research is expected to help the Moluccas Province and surrounding local governments in preparing spatial planning documents related to disaster management.
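
A minimal sketch of the chi-square goodness-of-fit step for a Poisson hypothesis on hypothetical yearly earthquake counts (the simulated counts and rate are placeholders, not the BMKG data; in practice, cells with small expected frequencies should be merged).

```python
import numpy as np
from scipy import stats

# Hypothetical yearly counts of M >= 6 earthquakes in one BSCZ sub-region (1964-2013).
rng = np.random.default_rng(7)
counts = rng.poisson(2.3, size=50)

lam_hat = counts.mean()                          # estimated rate
values = np.arange(counts.max() + 1)
observed = np.array([(counts == v).sum() for v in values])

# Expected frequencies under Poisson(lam_hat); the last cell absorbs the upper tail.
expected = stats.poisson.pmf(values, lam_hat) * counts.size
expected[-1] += (1.0 - stats.poisson.cdf(values[-1], lam_hat)) * counts.size

# One degree of freedom is lost for the estimated rate (ddof=1).
chi2, p_value = stats.chisquare(observed, f_exp=expected, ddof=1)
print(f"lambda = {lam_hat:.2f}, chi2 = {chi2:.2f}, p-value = {p_value:.3f}")
```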

Keywords: Banda sea collision zone, earthquakes, mean square error, Poisson distribution, chi-square test.

5484 Identifying Factors Contributing to the Spread of Lyme Disease: A Regression Analysis of Virginia’s Data

Authors: Fatemeh Valizadeh Gamchi, Edward L. Boone

Abstract:

This research focuses on Lyme disease, a widespread infectious condition in the United States caused by the bacterium Borrelia burgdorferi sensu stricto. It is critical to identify environmental and economic elements that contribute to the spread of the disease. This study examined data from Virginia to identify a subset of explanatory variables significant for Lyme disease case numbers. To identify relevant variables and avoid overfitting, linear Poisson regression and regularization methods such as the ridge, lasso, and elastic net penalties were employed. Cross-validation was performed to obtain the tuning parameters. The proposed methods can automatically identify relevant disease count covariates. The efficacy of the techniques was assessed using four criteria on three simulated datasets. Finally, using the Virginia Department of Health's Lyme disease dataset, the study successfully identified key factors, and the results were consistent with previous studies.
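
A minimal sketch of penalized Poisson regression with ridge, lasso and elastic net penalties using statsmodels on simulated data (the penalty weight would in practice be chosen by cross-validation, as in the paper; the data and settings here are illustrative assumptions).

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical county-level data: Lyme case counts and candidate covariates.
rng = np.random.default_rng(8)
n, p = 120, 10
X = rng.normal(size=(n, p))
true_beta = np.array([0.6, -0.4, 0.3] + [0.0] * (p - 3))   # only 3 relevant covariates
y = rng.poisson(np.exp(0.5 + X @ true_beta))
X = sm.add_constant(X)

model = sm.GLM(y, X, family=sm.families.Poisson())
# L1_wt = 1 -> lasso, 0 -> ridge, values in between -> elastic net.
for l1_wt, label in [(1.0, "lasso"), (0.0, "ridge"), (0.5, "elastic net")]:
    fit = model.fit_regularized(alpha=0.05, L1_wt=l1_wt)
    n_selected = int(np.sum(np.abs(fit.params[1:]) > 1e-6))
    print(f"{label:12s}: {n_selected} covariates retained")
```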

Keywords: Lyme disease, Poisson generalized linear model, ridge regression, lasso regression, elastic net regression.

5483 ABURAS Index: A Statistically Developed Index for Dengue-Transmitting Vector Population Prediction

Authors: Hani M. Aburas

Abstract:

"Dengue" is an African word meaning "bone breaking", because the disease causes severe joint and muscle pain that feels like bones are breaking. It is an infectious disease caused by four serotypes of dengue virus and mainly transmitted by the female mosquito Aedes aegypti. In recent years, a dramatic increase in confirmed dengue fever cases around the equator's belt has been reported. Several conventional indices have been designed to monitor the transmitting vector populations, such as the House Index (HI), Container Index (CI) and Breteau Index (BI). However, none of them describes the adult mosquito population size, which is important for directing and guiding comprehensive control strategy operations, since the number of infected people has a direct relationship with the vector density. Therefore, it is crucial to know the population size of the transmitting vector in order to design a suitable and effective control program. In this context, a study was carried out to report a new statistical index, the ABURAS Index, using the Poisson distribution, based on the collection of vector populations in Jeddah Governorate, Saudi Arabia.

Keywords: Poisson distribution, statistical index, prediction, Aedes aegypti.

5482 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a real-time visualization technique and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR datasets. We explain the used data structure and data management, which enables real-time presentation of layered LiDAR data. Real-time visualization is achieved with LOD optimization based on the distance from the observer without loss of quality. The filtering process is done in two steps and is entirely executed on the GPU and implemented using programmable shaders.

Keywords: Filtering, graphics, level-of-details, LiDAR, real-time visualization.

5481 ASLT Method for Beer Accelerated Shelf-Life Determination

Authors: Tatjana Rakcejeva, Valentina Skorina, Daina Karklina, Liga Skudra

Abstract:

The aim of the current research was to investigate the suitability of the ASLT method for accelerated determination of beer shelf-life. The research was carried out on popular Latvian beers: light filtered and unfiltered pasteurized beer with an alcohol content of 5.2%, and dark filtered pasteurized beer with an alcohol content of 4.2%, each with a shelf-life of five months. Beer samples bottled in dark glass bottles were stored for 20 weeks at several temperature regimes: +10±1 °C, +20±1 °C, +30±1 °C and +40±1 °C. Physico-chemical and microbiological quality parameters of the samples were tested every two weeks using standard methods. Storage at +30±1 °C allows beer shelf-life to be determined more rapidly, by a factor of 2.5 for filtered pasteurized light beer, 1.4 for unfiltered pasteurized light beer and 1.7 for filtered pasteurized dark beer. The present experiments showed that beer shelf-life can be determined rapidly using the ASLT method if the beer storage temperature is increased by +10±1 °C.

Keywords: Beer, shelf-life, ASLT method.

5480 Using Artificial Neural Network to Predict Collisions on Horizontal Tangents of 3D Two-Lane Highways

Authors: Omer F. Cansiz, Said M. Easa

Abstract:

The purpose of this study is mainly to predict collision frequency on horizontal tangents combined with vertical curves using artificial neural network methods. The proposed ANN models are compared with existing regression models. First, the variables that affect collision frequency were investigated. It was found that only the annual average daily traffic, section length, access density, the rate of vertical curvature, and the smaller curve radius before and after the tangent were statistically significant for the relevant combinations. Second, three statistical models (negative binomial, zero-inflated Poisson and zero-inflated negative binomial) were developed using the significant variables for three alignment combinations. Third, ANN models were developed using the same variables for each combination. The results clearly show that the ANN models have lower mean square error values than the statistical models. Similarly, the AIC values of the ANN models are smaller than those of the regression models for all combinations. Consequently, the ANN models have better statistical performance than the statistical models for estimating collision frequency. The ANN models presented in this paper are recommended for evaluating the safety impacts of 3D alignment elements on horizontal tangents.

Keywords: Collision frequency, horizontal tangent, 3D two-lane highway, negative binomial, zero inflated Poisson, artificial neural network.

5479 Sub-Impact Phenomenon of Elasto-Plastic Free-Free Beam during a Strike

Authors: H. Rong, X. C. Yin, J. Yang, Y. N. Shen

Abstract:

Based on Rayleigh beam theory, the sub-impacts of a free-free beam struck horizontally by a round-nosed rigid mass are simulated by the finite difference method together with impact-separation conditions. In order to obtain the sub-impact force, a uniaxial compression elastic-plastic contact model is employed to analyze the local deformation field in the contact zone. It is found that the horizontal impact is a complicated process including a sequence of elastic-plastic sub-impacts, with two sub-zones of sub-impact. In addition, it is found that the elastic energy of the free-free beam is better suited to the Poisson collision hypothesis for explaining the compression and recovery processes.

Keywords: beam, sub-impact, elastic-plastic deformation, finite difference method.

5478 Mathematical Study for Traffic Flow and Traffic Density in Kigali Roads

Authors: Kayijuka Idrissa

Abstract:

This work presents a mathematical study of traffic flow and traffic density on Kigali city roads, using data collected from the national police of Rwanda in 2012. Several mathematical models were used in order to analyze and compare traffic variables. The work was carried out on Kigali roads, specifically at the roundabouts from the Kigali Business Center (KBC) to Prince House, as the study sites. We used mathematical tools to analyze the collected data and to understand the relationships between traffic variables. We applied the Poisson distribution to analyze the number of accidents that occurred on the section of road from KBC to Prince House. The results show that accidents occurred at a very high rate in 2012 because this section has a very narrow single lane on each side, which leads to high congestion of vehicles and, consequently, frequent accidents. Using the speed and density data collected from this section of road, we found that an increase in density results in a decrease in vehicle speed, and at the point where the density equals the jam density the speed becomes zero. The approach is promising for capturing sudden changes in flow patterns and is open to use in a series of intelligent management strategies, especially in the detection and control of non-recurrent congestion effects.

Keywords: Statistical methods, Poisson distribution, car moving techniques, traffic flow.

5477 Very-high-Precision Normalized Eigenfunctions for a Class of Schrödinger Type Equations

Authors: Amna Noreen, Kare Olaussen

Abstract:

We demonstrate that it is possible to compute wave function normalization constants for a class of Schrödinger type equations by an algorithm which scales linearly (in the number of eigenfunction evaluations) with the desired precision P in decimals.

Keywords: Eigenvalue problems, bound states, trapezoidal rule, Poisson resummation.

5476 Detection of Keypoint in Press-Fit Curve Based on Convolutional Neural Network

Authors: Shoujia Fang, Guoqing Ding, Xin Chen

Abstract:

The quality of press-fit assembly is closely related to the reliability and safety of the product. This paper proposes a keypoint detection method based on a convolutional neural network to improve the accuracy of keypoint detection in press-fit curves, providing an auxiliary basis for judging the quality of press-fit assembly. A press-fit curve is a curve of press-fit force against displacement. Both the force data and the displacement data are time series, so a one-dimensional convolutional neural network is used to process the press-fit curve. After the acquired press-fit data are filtered, a multi-layer one-dimensional convolutional neural network performs automatic learning of the press-fit curve features, which are then passed to a multi-layer perceptron that finally outputs the keypoint of the curve. We used data from press-fit assembly equipment in the actual production process to train the CNN model, and different data from the same equipment to evaluate the detection performance. Compared with existing research results, the detection performance was significantly improved. This method can provide a reliable basis for the judgment of press-fit quality.
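
A minimal sketch of a one-dimensional CNN with a perceptron head for keypoint regression, written in PyTorch with hypothetical input dimensions (curve length, channels and layer sizes are illustrative assumptions, not the paper's architecture).

```python
import torch
import torch.nn as nn

# Hypothetical setup: each press-fit curve is resampled to 512 points with
# two channels (force, displacement); the target is the keypoint expressed
# as a normalized position in [0, 1].
class KeypointCNN(nn.Module):
    def __init__(self, n_points=512):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
        )
        self.head = nn.Sequential(                 # multi-layer perceptron head
            nn.Flatten(),
            nn.Linear(64 * (n_points // 8), 128), nn.ReLU(),
            nn.Linear(128, 1), nn.Sigmoid(),       # normalized keypoint position
        )

    def forward(self, x):
        return self.head(self.features(x))

model = KeypointCNN()
curves = torch.randn(8, 2, 512)                   # a dummy batch of filtered curves
positions = model(curves)
print(positions.shape)                             # torch.Size([8, 1])
```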

Keywords: Keypoint detection, curve feature, convolutional neural network, press-fit assembly.

5475 Implementation of a Web-Based Wireless ECG Measuring and Recording System

Authors: Onder Yakut, Serdar Solak, Emine Dogru Bolat

Abstract:

Measuring the electrocardiogram (ECG) signal is an essential process for the diagnosis of heart diseases. The ECG signal carries information about how well the heart performs its functions. In medical diagnosis and treatment systems, decision support systems that process the ECG signal are being developed for use by clinicians during medical examination. In this study, a modular wireless ECG (WECG) measuring and recording system using a single-board computer and the e-Health sensor platform is developed. In this modular system, the ECG signal is first acquired from the body surface by the electrodes, then filtered and converted to digital form. It is then recorded to the health database using Wi-Fi communication technology. Real-time access to the ECG data is provided through the internet via the developed web interface.
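
A minimal sketch of the filtering step on a hypothetical raw ECG trace, using a band-pass filter with commonly used cut-offs (the actual filter used in the described system is not specified; the sampling rate, cut-offs and signal here are assumptions).

```python
import numpy as np
from scipy import signal

# Hypothetical raw ECG samples (mV) acquired at 250 Hz from the sensor shield.
fs = 250.0
t = np.arange(0, 10, 1 / fs)
raw_ecg = (np.sin(2 * np.pi * 1.2 * t)            # slow cardiac-like component
           + 0.5 * np.sin(2 * np.pi * 50 * t)      # mains interference
           + 0.1 * np.random.randn(t.size))        # broadband noise

# A common choice for diagnostic ECG: 0.5-40 Hz band-pass to remove baseline
# wander and mains interference.
b, a = signal.butter(4, [0.5, 40.0], btype="bandpass", fs=fs)
filtered_ecg = signal.filtfilt(b, a, raw_ecg)

print("raw std:", round(float(raw_ecg.std()), 3),
      "filtered std:", round(float(filtered_ecg.std()), 3))
```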

Keywords: ECG, e-Health sensor shield, Raspberry Pi, Wi-Fi technology.

5474 Statistical Modeling of Constituents in Ash Evolved From Pulverized Coal Combustion

Authors: Esam Jassim

Abstract:

Industries using conventional fossil fuels have an interest in better understanding the mechanism of particulate formation during combustion, since it is responsible for the emission of undesired inorganic elements that directly impact the atmospheric pollution level. Fine and ultrafine particulates tend to escape flue gas cleaning devices into the atmosphere. They also preferentially collect on surfaces in power systems, resulting in increased corrosion, reduced heat transfer in the thermal unit, and severe impacts on human health. These adverse effects manifest particularly in regions of the world where coal is the dominant source of energy for consumption. This study highlights the behavior of calcium transformation as mineral grains versus organically associated inorganic components during pulverized coal combustion. The influence of the existing type of calcium on the coarse, fine and ultrafine mode formation mechanisms is also presented. The impact of two sub-bituminous coals on particle size and calcium composition evolution during combustion is assessed. Three mixed blends, named Blends 1, 2 and 3, are selected according to the ratio of coal A to coal B by weight; the calcium percentage in the original coal increases from Blend 1 to Blend 3. A mathematical model and a new approach to describing constituent distribution are proposed. The analysis of experiments on calcium distribution in ash is also modeled using the Poisson distribution. A novel parameter, called the elemental index λ, is introduced as a measure of element distribution. Results show that calcium present in the coal as mineral grains has an index of 17, whereas organically associated calcium transferred to fly ash is best described by an elemental index λ of 7. As an alkaline-earth element, calcium is considered the fundamental element responsible for boiler deficiency, since it is the major player in the ash slagging process. The mechanism of particle size distribution and the mineral species of ash particles are presented using CCSEM and size-segregated ash characteristics. Conclusions are drawn from the analysis of pulverized coal ash generated in a utility-scale boiler.

 

Keywords: Calcium transformation, Coal Combustion, Inorganic Element, Poisson distribution.
