Search results for: background count
656 Numbers and Biomass of Bacteria and Fungi Obtained by the Direct Microscopic Count Method
Authors: Ayuko Itsuki, Sachiyo Aburatani
Abstract:
The soil ecology of the organic and mineral soil layers of laurel-leaved and Cryptomeria japonica forests in the Kasuga-yama Hill Primeval Forest (Nara, Japan) was assessed. The number of bacteria obtained by the dilution plate count method was less than 0.05% of that obtained by the direct microscopic count. Forest soil therefore contains large numbers of non-culturable bacteria compared with agricultural soils. The numbers of bacteria and fungi obtained by both the dilution plate count and the direct microscopic count were larger in the deeper horizons (F and H) of the organic layer than in the mineral soil layer, suggesting that active microbial metabolism takes place in the organic layer. The numbers of bacteria and the length of fungal hyphae obtained by the direct count method were greater in the H horizon than in the F horizon. The direct microscopic count revealed numerous non-culturable bacteria and fungi in the soil. The ratio of fungal to bacterial biomass was lower in the laurel-leaved forest soil; the fungal biomass was therefore relatively low in the laurel-leaved forest soil owing to differences in forest vegetation.
Keywords: Bacterial number, Dilution plate count, Direct microscopic count, Forest soil.
655 Spatial Econometric Approaches for Count Data: An Overview and New Directions
Authors: Paula Simões, Isabel Natário
Abstract:
This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches available to model data collected with reference to location in space, from the classical spatial econometrics approaches to the recent developments in spatial econometrics for modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from hierarchical modelling and analysis of spatial data, in order to look for possible new directions in the processing of count data in a spatial hierarchical Bayesian econometric context.
Keywords: Spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data.
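As a hedged illustration of the class of models such reviews cover (an assumption for exposition, not this paper's own specification), a Bayesian hierarchical spatial Poisson model with a conditional autoregressive (CAR) random effect can be written as

\begin{aligned}
y_i \mid \mu_i &\sim \mathrm{Poisson}(\mu_i), \\
\log \mu_i &= \mathbf{x}_i^{\top}\boldsymbol{\beta} + \phi_i, \\
\phi_i \mid \boldsymbol{\phi}_{-i} &\sim \mathrm{N}\!\left(\rho\,\frac{\sum_{j} w_{ij}\,\phi_j}{\sum_{j} w_{ij}},\; \frac{\tau^2}{\sum_{j} w_{ij}}\right),
\end{aligned}

where the $w_{ij}$ are spatial weights, $\rho$ governs the strength of spatial autocorrelation and $\tau^2$ is the variance of the spatially structured random effect.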
654 Design, Construction and Performance Evaluation of a HPGe Detector Shield
Authors: M. Sharifi, M. Mirzaii, F. Bolourinovin, H. Yousefnia, M. Akbari, K. Yousefi-Mojir
Abstract:
A multilayer passive shield composed of low-activity lead (Pb), copper (Cu), tin (Sn) and iron (Fe) was designed and manufactured for a coaxial HPGe detector placed in a surface laboratory, in order to reduce the background radiation and the radiation dose to personnel. The performance of the shield was evaluated, and efficiency curves of the detector were plotted using various standard sources at different distances. Monte Carlo simulations and a set of TLD chips were used for dose estimation at two distances, 20 and 40 cm. The results show that the shield reduced the background spectrum and the personnel dose by more than 95%.
Keywords: HPGe shield, background count, personnel dose, efficiency curve.
653 Reducing Test Vectors Count Using Fault Based Optimization Schemes in VLSI Testing
Authors: Vinod Kumar Khera, R. K. Sharma, A. K. Gupta
Abstract:
Power dissipation increases exponentially during test mode as compared to normal operation of the circuit. In extreme cases, test power is more than twice the power consumed during normal operation. The test vector generation scheme is a key component in determining how power-hungry a circuit is during testing. Test vector count and the consequent leakage current are functions of the test vector generation scheme. A fault-based test vector count optimization is presented in this work; it helps in reducing both the test vector count and the leakage current. In the presented scheme, test vectors are reduced by extracting essential child vectors. The scheme has been tested experimentally using stuck-at fault models, and the results confirm the reduction in test vector count.
Keywords: Low power VLSI testing, independent fault, essential faults, test vector reduction.
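As a rough illustration of the general idea of fault-based vector reduction (keep vectors that are the sole detectors of some fault, then cover the remaining faults greedily), the sketch below uses a hypothetical fault coverage table; it is not the paper's algorithm for extracting essential child vectors.

def reduce_test_vectors(coverage):
    """coverage: dict mapping test vector id -> set of faults it detects.
    Returns a reduced vector set that still detects every detectable fault."""
    all_faults = set().union(*coverage.values())
    # Essential vectors: the only vector detecting some fault must be kept.
    essential = set()
    for fault in all_faults:
        detectors = [v for v, faults in coverage.items() if fault in faults]
        if len(detectors) == 1:
            essential.add(detectors[0])
    selected = set(essential)
    covered = set().union(*(coverage[v] for v in selected)) if selected else set()
    # Greedy covering of the remaining faults.
    while covered != all_faults:
        best = max(coverage, key=lambda v: len(coverage[v] - covered))
        if not coverage[best] - covered:
            break
        selected.add(best)
        covered |= coverage[best]
    return selected

# Hypothetical stuck-at fault coverage table for illustration.
cov = {"v1": {"f1", "f2"}, "v2": {"f2", "f3"}, "v3": {"f3"}, "v4": {"f4"}}
print(reduce_test_vectors(cov))   # {'v1', 'v2', 'v4'} (set order may vary)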
652 Zero Inflated Strict Arcsine Regression Model
Authors: Y. N. Phang, E. F. Loh
Abstract:
The zero inflated strict arcsine model is a newly developed model that has been found appropriate for modeling overdispersed count data. In this study, we extend the zero inflated strict arcsine model to a zero inflated strict arcsine regression model by taking into consideration the extra variability caused by extra zeros and by covariates in count data. The maximum likelihood estimation method is used to estimate the parameters of this zero inflated strict arcsine regression model.
Keywords: Overdispersed count data, maximum likelihood estimation, simulated annealing.
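For reference, the generic zero-inflated structure being extended here, written with a generic count pmf $f(y;\theta)$ standing in for the strict arcsine distribution (whose pmf is not reproduced in this abstract), is

P(Y=0) = \pi + (1-\pi)\,f(0;\theta), \qquad P(Y=y) = (1-\pi)\,f(y;\theta), \quad y = 1, 2, \ldots

with covariates typically entering through a log link for the count mean and, optionally, a logit link for the zero-inflation probability $\pi$.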
651 Statistical Analysis for Overdispersed Medical Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Many researchers have suggested the use of zero inflated Poisson (ZIP) and zero inflated negative binomial (ZINB) models in modeling overdispersed medical count data with extra variation caused by extra zeros and unobserved heterogeneity. These studies indicate that ZIP and ZINB always provide a better fit than the standard Poisson and negative binomial models when modeling overdispersed medical count data. In this study, we propose the use of the zero inflated inverse trinomial (ZIIT), zero inflated Poisson inverse Gaussian (ZIPIG) and zero inflated strict arcsine (ZISA) models in modeling overdispersed medical count data. These proposed models are not widely used by researchers, especially in the medical field. The results show that these three suggested models can serve as alternative models in modeling overdispersed medical count data, as supported by their application to a real-life medical data set. The inverse trinomial, Poisson inverse Gaussian and strict arcsine distributions are discrete distributions with a cubic variance function of the mean. Therefore, ZIIT, ZIPIG and ZISA are able to accommodate data with excess zeros and very heavy tails, and they are recommended for modeling overdispersed medical count data when ZIP and ZINB are inadequate.
Keywords: Zero inflated, inverse trinomial distribution, Poisson inverse Gaussian distribution, strict arcsine distribution, Pearson’s goodness of fit.
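As a hedged, self-contained sketch of how such a zero-inflated fit proceeds by maximum likelihood, the example below fits a zero-inflated Poisson model (used as a stand-in, since the ZIIT/ZIPIG/ZISA probability functions are not given in the abstract) to simulated counts with scipy.

import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, expit

rng = np.random.default_rng(0)
# Simulated overdispersed counts: structural zeros with probability 0.3, Poisson(2) otherwise.
y = np.where(rng.random(500) < 0.3, 0, rng.poisson(2.0, 500))

def zip_negloglik(params, y):
    """Negative log-likelihood of the zero-inflated Poisson model."""
    pi, lam = expit(params[0]), np.exp(params[1])   # keep 0 < pi < 1 and lam > 0
    logp_pois = -lam + y * np.log(lam) - gammaln(y + 1)
    ll_zero = np.log(pi + (1 - pi) * np.exp(-lam))  # zero can be structural or sampling
    ll_pos = np.log(1 - pi) + logp_pois
    return -np.sum(np.where(y == 0, ll_zero, ll_pos))

fit = minimize(zip_negloglik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
pi_hat, lam_hat = expit(fit.x[0]), np.exp(fit.x[1])
print(f"pi = {pi_hat:.3f}, lambda = {lam_hat:.3f}")   # close to 0.3 and 2.0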
650 Zero Inflated Models for Overdispersed Count Data
Authors: Y. N. Phang, E. F. Loh
Abstract:
Zero inflated models are usually used in modeling count data with excess zeros, where the excess zeros may be structural zeros or zeros that occur by chance. These types of data are commonly found in various disciplines such as finance, insurance, biomedicine, econometrics, ecology, and health sciences, including dental epidemiology. The most popular zero inflated models used by many researchers are the zero inflated Poisson and zero inflated negative binomial models. In addition, zero inflated generalized Poisson and zero inflated double Poisson models are also discussed and found in the literature. Recently, the zero inflated inverse trinomial and zero inflated strict arcsine models have been advocated and shown to serve as alternative models in modeling count data that are overdispersed because of excess zeros and unobserved heterogeneity. The purpose of this paper is to review the related literature and provide a variety of examples from different disciplines of the application of zero inflated models. Different model selection methods used in model comparison are discussed.
Keywords: Overdispersed count data, model selection methods, likelihood ratio, AIC, BIC.
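The model selection criteria named in the keywords take their usual forms: for a fitted model with maximized log-likelihood $\hat{\ell}$, $k$ parameters and $n$ observations,

\mathrm{AIC} = 2k - 2\hat{\ell}, \qquad \mathrm{BIC} = k\ln(n) - 2\hat{\ell},

and for nested models the likelihood ratio statistic $-2(\hat{\ell}_0 - \hat{\ell}_1)$ is compared with a chi-square reference distribution.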
649 Zero Truncated Strict Arcsine Model
Authors: Y. N. Phang, E. F. Loh
Abstract:
The zero truncated model is usually used in modeling count data without zeros; it is the counterpart of the zero inflated model. Zero truncated Poisson and zero truncated negative binomial models have been discussed and used by researchers in analyzing the abundance of rare species and hospital length of stay. Zero truncated models also serve as the basis for developing hurdle models. In this study, we develop a new model, the zero truncated strict arcsine model, which can be used as an alternative model for count data without zeros and with extra variation. Two simulated data sets and one real-life data set are fitted to the developed model, and the results show that the model provides a good fit to the data. The maximum likelihood estimation method is used to estimate the parameters.
Keywords: Hurdle models, maximum likelihood estimation method, positive count data.
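The zero truncation referred to here conditions a count distribution on a positive outcome; with a generic count pmf $f(y;\theta)$ standing in for the strict arcsine distribution,

P(Y = y \mid Y > 0) = \frac{f(y;\theta)}{1 - f(0;\theta)}, \qquad y = 1, 2, \ldots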
648 Effects of Dry Period Length on Milk Production and Composition, Blood Metabolites and Complete Blood Count in Subsequent Lactation of Holstein Dairy Cows
Authors: Akbar Soleimani, Alireza Heravi Moussavi, Mohsen Danesh Mesgaran, Abolqasem Golian
Abstract:
Twenty-nine Holstein cows were used to evaluate the effects of different dry period (DP) lengths on milk yield and composition, some blood metabolites, and the complete blood count (CBC). Cows were assigned to one of two treatments: 1) a 60-d dry period, or 2) a 35-d DP. Milk yield from calving to 60 days was not different between the treatments (p = 0.130). Cows in the 35-d DP group produced more milk protein and SNF compared with cows in treatment 1 (p ≤ 0.05). Serum glucose, non-esterified fatty acids (NEFA), beta-hydroxybutyric acid (BHBA), blood urea nitrogen (BUN), urea, and glutamic oxaloacetic transaminase (GOT) were all similar between the treatments. Body condition score (BCS), body weight (BW), complete blood count (CBC) and health problems were also similar between the treatments. The results of this study demonstrate that the dry period length can be reduced to 35 days without adverse effects.
Keywords: complete blood count, dairy cows, dry period, milk yield
647 Density of Hydrocarbonoclastic Bacteria and Polycyclic Aromatic Hydrocarbon Accumulation in Iko River Mangrove Ecosystem, Nigeria
Authors: Ime R. Udotong, Samuel I. Eduok, Joseph P. Essien, Basil N. Ita
Abstract:
Sediment and mangrove root samples from the Iko River Estuary, Nigeria were analyzed for microbial and polycyclic aromatic hydrocarbon (PAH) content. The total heterotrophic bacterial (THB) count ranged from 1.1×10⁷ to 5.1×10⁷ cfu/g, the total fungal (TF) count from 1.0×10⁶ to 2.7×10⁶ cfu/g, the total coliform (TC) count from 2.0×10⁴ to 8.0×10⁴ cfu/g, and the hydrocarbon utilizing bacterial (HUB) count from 1.0×10⁵ to 5.0×10⁵ cfu/g. There was a positive correlation (r = 0.72 to 0.93) between the THB count and the total HUB count. The organisms were Staphylococcus aureus, Bacillus cereus, Flavobacterium breve, Pseudomonas aeruginosa, Erwinia amylovora, Escherichia coli, Enterobacter sp., Desulfovibrio sp., Acinetobacter lwoffii, Chromobacterium violaceum, Micrococcus sedentarius, Corynebacterium sp., and Pseudomonas putrefaciens. The PAHs were Naphthalene, 2-Methylnaphthalene, Acenaphthylene, Acenaphthene, Fluorene, Phenanthrene, Anthracene, Fluoranthene, Pyrene, Benzo(a)anthracene, Chrysene, Benzo(b)fluoranthene, Benzo(k)fluoranthene, Benzo(a)pyrene, Dibenzo(a,h)anthracene, Benzo(g,h,i)perylene and Indeno(1,2,3-cd)pyrene, with individual PAH concentrations ranging from 0.20 mg/kg to 1.02 mg/kg, 0.20 mg/kg to 1.07 mg/kg and 0.2 mg/kg to 4.43 mg/kg in the benthic sediment, epipellic sediment and mangrove roots, respectively. Total PAH ranged from 6.30 to 9.93 mg/kg, 6.30 to 9.13 mg/kg and 9.66 to 16.68 mg/kg in the benthic sediment, epipellic sediment and mangrove roots, respectively. The high concentrations in the mangrove roots indicate bioaccumulation of the pollutant in the plant tissue. The microorganisms are of ecological significance, and the detectable quantities of polycyclic aromatic hydrocarbons could be partitioned and accumulated in the tissues of infaunal and epifaunal organisms in the study area.
Keywords: Hydrocarbonoclastic bacteria, Iko River estuary, Mangrove, Polycyclic aromatic hydrocarbon.
646 Analyzing the Factors Effecting the Passenger Car Breakdowns using Com-Poisson GLM
Authors: N. Mamode Khan, V. Jowaheer
Abstract:
The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of the explanatory factors has been a challenging problem. In this paper, we aim at estimating the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over a year. We remark that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model. We use a quasi-likelihood estimation approach to estimate the parameters of the model. The under-dispersion parameter is estimated to be 2.14, justifying the appropriateness of the Com-Poisson distribution in modelling the under-dispersed count responses recorded in this study.
Keywords: Breakdowns, under-dispersion, com-poisson, generalized linear model, quasi-likelihood estimation
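For reference, the Conway-Maxwell-Poisson (Com-Poisson) probability mass function underlying the regression model is

P(Y = y) = \frac{\lambda^{y}}{(y!)^{\nu}\, Z(\lambda,\nu)}, \qquad Z(\lambda,\nu) = \sum_{j=0}^{\infty} \frac{\lambda^{j}}{(j!)^{\nu}},

where $\nu > 1$ corresponds to under-dispersion (as with the estimate of about 2.14 reported here), $\nu = 1$ recovers the Poisson distribution, and $\nu < 1$ gives over-dispersion.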
645 Dynamic Background Updating for Lightweight Moving Object Detection
Authors: Kelemewerk Destalem, Jungjae Cho, Jaeseong Lee, Ju H. Park, Joonhyuk Yoo
Abstract:
Background subtraction and temporal difference are often used for moving object detection in video. Both approaches are computationally simple and easy to deploy in real-time image processing. However, while background subtraction is highly sensitive to dynamic backgrounds and illumination changes, the temporal difference approach is poor at extracting all the relevant pixels of a moving object and at detecting stopped or slowly moving objects in the scene. In this paper, we propose a simple moving object detection scheme based on adaptive background subtraction and temporal difference, exploiting dynamic background updates. The proposed technique consists of histogram equalization and a linear combination of background and temporal differences, followed by novel frame-based and pixel-based background updating techniques. Finally, morphological operations are applied to the output images. Experimental results show that the proposed algorithm can overcome the drawbacks of both the background subtraction and temporal difference methods and can provide better performance than either method alone.
Keywords: Background subtraction, background updating, real time and lightweight algorithm, temporal difference.
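A minimal NumPy sketch of this kind of pipeline is given below (histogram equalization, a linear combination of background subtraction and temporal difference, a selective running background update, and a morphological opening); the weights, learning rate and threshold are illustrative assumptions, not the paper's tuned values.

import numpy as np
from scipy import ndimage

def equalize(frame):
    """Simple global histogram equalization for an 8-bit grayscale frame."""
    hist = np.bincount(frame.ravel(), minlength=256)
    cdf = hist.cumsum() / frame.size
    return (cdf[frame] * 255).astype(np.uint8)

def detect_moving(frames, alpha=0.05, w_bg=0.6, w_td=0.4, thresh=30):
    """frames: iterable of uint8 grayscale images of identical shape."""
    frames = [equalize(f) for f in frames]
    background = frames[0].astype(np.float32)
    prev = frames[0].astype(np.float32)
    masks = []
    for f in frames[1:]:
        cur = f.astype(np.float32)
        bg_diff = np.abs(cur - background)          # background subtraction
        td_diff = np.abs(cur - prev)                # temporal difference
        combined = w_bg * bg_diff + w_td * td_diff  # linear combination of the two cues
        mask = combined > thresh
        # Pixel-based background update only where no motion was detected.
        background = np.where(mask, background, (1 - alpha) * background + alpha * cur)
        masks.append(ndimage.binary_opening(mask))  # remove isolated noise pixels
        prev = cur
    return masks

# Usage with two synthetic frames: a gradient scene and the same scene with a bright square.
f0 = np.tile(np.arange(64, dtype=np.uint8), (64, 1))
f1 = f0.copy()
f1[20:30, 20:30] = 255
print(detect_moving([f0, f1])[0].sum())  # pixels flagged as moving in the square region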
644 The Evaluation of Complete Blood Cell Count-Based Inflammatory Markers in Pediatric Obesity and Metabolic Syndrome
Authors: Mustafa M. Donma, Orkide Donma
Abstract:
Obesity is defined as a severe chronic disease characterized by a low-grade inflammatory state. Therefore, inflammatory markers have gained utmost importance in the evaluation of obesity and metabolic syndrome (MetS), a condition characterized by central obesity, elevated blood pressure, increased fasting blood glucose and elevated triglycerides or reduced high density lipoprotein cholesterol (HDL-C) values. Several inflammatory markers based upon the complete blood cell count (CBC) are available. In this study, it was questioned which inflammatory marker best evaluates the differences between various obesity groups. 514 pediatric individuals were recruited: 132 children with MetS, 155 morbidly obese (MO), 90 obese (OB), 38 overweight (OW) and 99 children with normal BMI (N-BMI) were included in the study. Obesity groups were constituted using age- and sex-dependent body mass index (BMI) percentiles tabulated by the World Health Organization, and MetS components were determined in order to identify children with MetS. CBC parameters were determined using an automated hematology analyzer, and HDL-C analysis was performed. Using the CBC parameters and HDL-C values, ratio markers of inflammation were calculated, covering the neutrophil-to-lymphocyte ratio (NLR), derived neutrophil-to-lymphocyte ratio (dNLR), platelet-to-lymphocyte ratio (PLR), lymphocyte-to-monocyte ratio (LMR) and monocyte-to-HDL-C ratio (MHR). Statistical analyses were performed, with p < 0.05 considered statistically significant. There was no statistically significant difference among the groups in terms of platelet count, neutrophil count, lymphocyte count, monocyte count, or NLR. PLR differed significantly between OW and N-BMI as well as MetS. The monocyte-to-HDL-C ratio exhibited statistical significance between MetS and the N-BMI, OB, and MO groups. HDL-C values differed between MetS and the N-BMI, OW, OB and MO groups. MHR was the ratio that exhibited the best performance among the CBC-based inflammatory markers. On the other hand, when MHR was compared to HDL-C alone, HDL-C gave much more valuable information; this parameter therefore still keeps its value from the diagnostic point of view. Our results suggest that MHR can be used as an inflammatory marker in the evaluation of pediatric MetS, but its predictive value is not superior to that of HDL-C in the evaluation of obesity.
Keywords: Children, complete blood cell count, high density lipoprotein cholesterol, metabolic syndrome, obesity.
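A small sketch of how these CBC-based ratio markers are computed is given below; the input values are illustrative, not patient data from the study, and dNLR is assumed to follow the common neutrophils / (total leukocytes − neutrophils) definition.

def inflammatory_ratios(neutrophils, lymphocytes, monocytes, platelets, leukocytes, hdl_c):
    """Cell counts in 10^3/µL (platelets likewise), HDL-C in mg/dL."""
    return {
        "NLR":  neutrophils / lymphocytes,
        "dNLR": neutrophils / (leukocytes - neutrophils),  # derived NLR (assumed definition)
        "PLR":  platelets / lymphocytes,
        "LMR":  lymphocytes / monocytes,
        "MHR":  monocytes / hdl_c,                         # monocyte-to-HDL-C ratio
    }

# Illustrative values only.
print(inflammatory_ratios(neutrophils=4.2, lymphocytes=2.8, monocytes=0.5,
                          platelets=300, leukocytes=7.8, hdl_c=45))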
643 Moving Vehicles Detection Using Automatic Background Extraction
Authors: Saad M. Al-Garni, Adel A. Abdennour
Abstract:
Vehicle detection is the critical step in highway monitoring. In this paper we propose a background subtraction and edge detection technique for vehicle detection that combines the advantages of both approaches, and practical applications have confirmed its effectiveness. The method consists of two procedures: first, an automatic background extraction procedure, in which the background is extracted automatically from successive frames; and second, a vehicle detection procedure, which depends on edge detection and background subtraction. Experimental results show the effective application of this algorithm, with a vehicle detection rate higher than 91%.
Keywords: Image processing, Automatic background extraction, Moving vehicle detection.
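One common way to extract a background automatically from successive frames, in the spirit of the first procedure, is a per-pixel temporal median followed by subtraction and thresholding; the sketch below is an illustrative stand-in rather than the paper's exact procedure (edge detection could then be applied to the resulting mask).

import numpy as np

def extract_background(frames):
    """Per-pixel temporal median over a stack of grayscale frames of shape (H, W)."""
    return np.median(np.stack(frames).astype(np.float32), axis=0)

def detect_vehicles(frame, background, thresh=25):
    """Foreground mask from background subtraction."""
    return np.abs(frame.astype(np.float32) - background) > thresh

# Synthetic example: a static road with a bright 'vehicle' passing through.
rng = np.random.default_rng(1)
road = rng.integers(40, 60, size=(48, 64)).astype(np.uint8)
frames = []
for x in range(0, 60, 10):                       # vehicle moves left to right
    f = road.copy()
    f[20:28, x:x + 6] = 220
    frames.append(f)
bg = extract_background(frames)
mask = detect_vehicles(frames[3], bg)
print(mask.sum())                                # pixels flagged on frame 3 (the synthetic vehicle)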
642 The Effect of Goat Milk Fractions Supplementation on Serum IgE Response and Leukocytes Count in Dinitrochlorobenzene Sensitized Rat
Authors: Nurliyani, E. Harmayani, MHNE. Soesatyo
Abstract:
In Indonesia, goat milk is often consumed and believed to be anti-allergenic. The objective of this research was to study the effect of supplementation with goat milk and its fractions (casein and whey) on total serum IgE concentrations and leukocyte counts in rats sensitized with the contact allergen dinitrochlorobenzene (DNCB). Female Wistar rats 6-8 weeks old were divided into four groups: 1) whey, 2) casein, 3) whole milk supplementation and 4) phosphate-buffered saline/PBS (control). The results showed that goat milk supplementation did not affect total serum IgE concentrations or the number of leukocytes in the rats. After sensitization with DNCB, the monocyte percentage was higher (P < 0.01) than before. In conclusion, supplementation with goat milk or its fractions was unable to decrease total serum IgE concentrations and had no effect on leukocyte counts; however, 1% DNCB increased the number of monocytes but did not induce an IgE response.
Keywords: Dinitrochlorobenzene, Goat Milk Fractions, IgE, Leukocytes.
641 Effect of Zidovudine on Hematological and Virologic Parameters among Female Sex Workers Receiving Antiretroviral Therapy (ART) in North-Western Nigeria
Authors: N. M. Sani, E. D. Jatau, O. S. Olonitola, M. Y. Gwarzo, P. Moodley, N. S. Mujahid
Abstract:
Hemoglobin (HB) indicates the level of anemia and by extension may reflect the nutritional level and perhaps the immunity of an individual. Some antiretroviral drugs, such as zidovudine, are known to cause anemia in people living with HIV/AIDS (PLWHA). A cross-sectional study using demographic data and blood specimens from 218 female commercial sex workers attending antiretroviral therapy (ART) clinics was conducted between December 2009 and July 2011 to assess the effect of zidovudine on hematologic parameters and RNA viral load in female sex workers receiving antiretroviral treatment in north-western Nigeria. Anemia is a common and serious complication of both HIV infection and its treatment. In the setting of HIV infection, anemia has been associated with decreased quality of life, functional status, and survival. Antiretroviral therapy, particularly highly active antiretroviral therapy (HAART), has been associated with a decrease in the incidence and severity of anemia in HIV-infected patients who have received a HAART regimen for at least one year. In this study, the results showed that of the 218 patients, the 26 with hemoglobin between 5.1 and 10 g/dl were observed to have the highest viral load counts of 300,000-350,000 copies/ml. It was also observed that most patients (190), with HB of 10.1-15.0 g/dl, had viral load counts of 200,000-250,000 copies/ml. An inverse relationship therefore exists, i.e. the lower the hemoglobin level, the higher the viral load count, even though the test statistic did not show significance between the two (P = 0.206). Multivariate logistic regression analysis demonstrated that anemia was associated with a CD4+ cell count below 50/μL and a viral load above 100,000 copies/mL among female sex workers who use zidovudine. Severe anemia was less prevalent in this study population than in historical comparators; however, mild to moderate anemia rates remain high. The study therefore recommends that hematological and virologic parameters be monitored closely in patients receiving a first-line ART regimen.
Keywords: Female sex worker, Zidovudine, Hemoglobin, Anemia.
640 Bootstrap Confidence Intervals and Parameter Estimation for Zero Inflated Strict Arcsine Model
Authors: Y. N. Phang, E. F. Loh
Abstract:
The zero inflated strict arcsine model is a newly developed model that has been found appropriate for modeling overdispersed count data. In this study, the maximum likelihood estimation method is used to estimate the parameters of the zero inflated strict arcsine model. Bootstrapping is then employed to compute confidence intervals for the estimated parameters.
Keywords: overdispersed count data, maximum likelihood estimation, simulated annealing, BCa confidence intervals.
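A hedged sketch of the bootstrap step is shown below, using a nonparametric percentile interval around a simple Poisson-rate MLE as a stand-in (the zero inflated strict arcsine likelihood is not given in the abstract, and BCa intervals additionally require bias and acceleration corrections).

import numpy as np

rng = np.random.default_rng(0)
data = rng.poisson(3.0, size=200)          # stand-in count data

def mle(sample):
    """Stand-in estimator; for the zero inflated strict arcsine model this would be the full MLE routine."""
    return sample.mean()                   # Poisson MLE of the rate

B = 2000
boot = np.array([mle(rng.choice(data, size=data.size, replace=True)) for _ in range(B)])
lo, hi = np.percentile(boot, [2.5, 97.5])  # 95% percentile bootstrap interval
print(f"estimate = {mle(data):.3f}, 95% CI = ({lo:.3f}, {hi:.3f})")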
639 A Recognition Method for Spatio-Temporal Background in Korean Historical Novels
Authors: Seo-Hee Kim, Kee-Won Kim, Seung-Hoon Kim
Abstract:
The most important elements of a novel are its characters, events and background. The background represents the time, place and situation in which a character appears, and conveys events and atmosphere more realistically. If readers have proper knowledge about the background of a novel, it may help them understand the atmosphere of the novel and choose novels they want to read. In this paper, we target Korean historical novels, because the spatio-temporal background plays an especially important role in historical novels among the genres of Korean novels. To the best of our knowledge, there is no previous study aimed at Korean novels. In this paper, we build a Korean historical national dictionary. Our dictionary contains historical place names and the temple names of kings over many generations, as well as currently existing spatial and temporal words in Korean history. We also present a method for recognizing the spatio-temporal background based on patterns of phrasal words in Korean sentences. Our rules utilize postpositions for spatial background recognition and temple names for temporal background recognition. Knowledge of the recognized background can help readers understand the flow of events and the atmosphere, and can be used to visualize the elements of novels.
Keywords: Data mining, Korean historical novels, Korean linguistic feature, spatio-temporal background.
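A minimal sketch of the rule idea (a place name followed by a locative postposition signals a spatial background; a king's temple name signals a temporal background) is given below; the mini-dictionaries are hypothetical stand-ins for the Korean historical national dictionary built in the paper, and the real rules are certainly richer.

# Hypothetical mini-dictionaries; the paper builds a full historical national dictionary.
PLACE_NAMES = {"한양", "평양", "개경"}             # historical place names
TEMPLE_NAMES = {"세종", "정조", "숙종"}             # kings' temple names (era markers)
SPATIAL_POSTPOSITIONS = ("에서", "으로", "에")      # locative postpositions

def recognize_background(sentence):
    """Return (spatial, temporal) background cues found in a sentence."""
    spatial, temporal = [], []
    for token in sentence.split():
        for place in PLACE_NAMES:
            # Spatial rule: place name immediately followed by a locative postposition.
            if token.startswith(place) and token[len(place):] in SPATIAL_POSTPOSITIONS:
                spatial.append(place)
        for king in TEMPLE_NAMES:
            # Temporal rule: a temple name marks the reign in which events occur.
            if token.startswith(king):
                temporal.append(king)
    return spatial, temporal

print(recognize_background("세종 시절 한양에서 벌어진 일이다"))  # (['한양'], ['세종'])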
638 Effect of Gating Sprue Height on Mechanical Properties of Thin Wall Ductile Iron
Authors: E. F. Ochulor, S. O. Adeosun, S. A. Balogun
Abstract:
The effect of sprue/metal head height on mould filling, microstructure and mechanical properties of thin wall ductile iron (TWDI) castings is studied. Results show that a metal/sprue height of 50 mm is not sufficient to push the melt through the gating channel, but as the height is increased from 100 to 350 mm, proper mould filling is achieved. However, at heights between 200 mm and 350 mm, defects associated with incomplete solidification, carbide precipitation and turbulent flow are evident. This research shows that superior UTS, hardness, nodularity and nodule count are obtained at a sprue height of 100 mm.
Keywords: Melt pressure and velocity, nodularity, nodule count, sprue height.
637 Plug and Play Interferometer Configuration using Single Modulator Technique
Authors: Norshamsuri Ali, Hafizulfika, Salim Ali Al-Kathiri, Abdulla Al-Attas, Suhairi Saharudin, Mohamed Ridza Wahiddin
Abstract:
We demonstrate single-photon interference over 10 km using a plug and play system for quantum key distribution. The quality of the interferometer is measured using the interferometer visibility. The signal is encoded by phase coding, and the visibility is derived from the interference effect, which results in a photon count. The setup gives full control of the polarization inside the interferometer. The quality measurement of the interferometer is based on the number of counts per second, and the system achieves 94% visibility in one of the detectors.
Keywords: single photon, interferometer, quantum key distribution.
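The visibility figure quoted (94%) is conventionally computed from the maximum and minimum interference count rates as

V = \frac{N_{\max} - N_{\min}}{N_{\max} + N_{\min}},

so a visibility of 0.94 means the destructive-interference count rate is about 3% of the constructive one.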
636 Video Matting based on Background Estimation
Authors: J.-H. Moon, D.-O Kim, R.-H. Park
Abstract:
This paper presents a video matting method that extracts the foreground and an alpha matte from a video sequence. The objective of video matting is to find the foreground and composite it with a background that is different from the one in the original image. By finding the motion vectors (MVs) using a sliced block matching algorithm (SBMA), we can extract moving regions from the video sequence under the assumption that the foreground is moving and the background is stationary. In practice, foreground areas do not move in every frame of an image sequence, so we accumulate moving regions through the image sequence. The boundaries of moving regions are found by a Canny edge detector, and the foreground region is separated in each frame of the sequence; the remaining regions are defined as background regions. The backgrounds extracted from each frame are combined and reframed as a single integrated background. Based on the estimated background, we compute the frame difference (FD) of each frame. Regions with an FD larger than the threshold are defined as foreground regions, the boundaries of foreground regions are defined as unknown regions, and the rest are defined as background. Segmentation information that classifies an image into foreground, background, and unknown regions is called a trimap. The matting process can extract an alpha matte in the unknown region using pixel information in the foreground and background regions, and estimate the values of foreground and background pixels in unknown regions. The proposed video matting approach is adaptive and convenient for extracting a foreground automatically and compositing it with a background different from the original one.
Keywords: Background estimation, Object segmentation, Block matching algorithm, Video matting.
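The compositing relation in which the extracted alpha matte is used is the standard one: for each pixel,

I = \alpha F + (1 - \alpha) B,

where $I$ is the observed color, $F$ and $B$ are the foreground and background colors, and $\alpha \in [0,1]$ is the matte; $\alpha = 1$ in the known foreground, $\alpha = 0$ in the known background, and $\alpha$, $F$ and $B$ are estimated in the unknown region of the trimap.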
635 Fast Algorithm of Infrared Point Target Detection in Fluctuant Background
Authors: Yang Weiping, Zhang Zhilong, Li Jicheng, Chen Zengping, He Jun
Abstract:
A background estimation approach using a small-window median filter is presented on the basis of analyzing the IR point target, noise and clutter models. After simplifying the two-dimensional filter, a simple method adopting a one-dimensional median filter is illustrated for estimating the background according to the characteristics of the IR scanning system. An adaptive threshold is then used to segment the background-cancelled image. Experimental results show that the algorithm achieves good performance and satisfies the requirement of real-time processing of large images.
Keywords: Point target, background estimation, median filter, adaptive threshold, target detection.
634 Destination Port Detection for Vessels: An Analytic Tool for Optimizing Port Authorities Resources
Authors: Lubna Eljabu, Mohammad Etemad, Stan Matwin
Abstract:
Port authorities face many challenges in congested ports when allocating their resources to provide a safe and secure loading/unloading procedure for cargo vessels. Selecting a destination port is the decision of a vessel's master, based on many factors such as weather, wavelength and changes of priorities. Having access to a tool that leverages Automatic Identification System (AIS) messages to monitor vessels' movements and accurately predict their next destination port promotes an effective resource allocation process for port authorities. In this research, we propose a method, namely Reference Route of Trajectory (RRoT), to assist port authorities in predicting inflow and outflow traffic in their local environment by monitoring AIS messages. Our RRoT method creates a reference route based on historical AIS messages and utilizes trajectory similarity measures to identify the destination of a vessel using its recent movement. We evaluated five different similarity measures: Discrete Fréchet Distance (DFD), Dynamic Time Warping (DTW), Partial Curve Mapping (PCM), Area between two curves (Area) and Curve Length (CL). Our experiments show that our method identifies the destination port with an accuracy of 98.97% and an f-measure of 99.08% using the Dynamic Time Warping (DTW) similarity measure.
Keywords: Spatial temporal data mining, trajectory mining, trajectory similarity, resource optimization.
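Dynamic Time Warping, the measure reported as best here, can be sketched as follows for two 2-D trajectories; this minimal version uses Euclidean point distances with no windowing or normalization, refinements the actual RRoT implementation may well add.

import math

def dtw(traj_a, traj_b):
    """Dynamic Time Warping distance between two trajectories of (x, y) points."""
    n, m = len(traj_a), len(traj_b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(traj_a[i - 1], traj_b[j - 1])     # local point distance
            cost[i][j] = d + min(cost[i - 1][j],            # insertion
                                 cost[i][j - 1],            # deletion
                                 cost[i - 1][j - 1])        # match
    return cost[n][m]

# Toy trajectories: a straight track and a slightly delayed copy of it.
a = [(0, 0), (1, 0), (2, 0), (3, 0)]
b = [(0, 0), (0, 0), (1, 0), (2, 0), (3, 0)]
print(dtw(a, b))   # 0.0 -- identical shapes despite different lengths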
633 A Comparison of Marginal and Joint Generalized Quasi-likelihood Estimating Equations Based On the Com-Poisson GLM: Application to Car Breakdowns Data
Authors: N. Mamode Khan, V. Jowaheer
Abstract:
In this paper, we apply and compare two generalized estimating equation approaches in the analysis of car breakdown data from Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of the explanatory factors has been a challenging problem. In this paper, we aim at estimating the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over a year. We remark that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using the Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and the joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution in modelling the under-dispersed count responses recorded in this study.
Keywords: Breakdowns, under-dispersion, com-poisson, generalized linear model, marginal quasi-likelihood estimation, joint quasi-likelihood estimation.
632 Scholar Index for Research Performance Evaluation Using Multiple Criteria Decision Making Analysis
Authors: C. Ardil
Abstract:
This paper presents an objective quantitative methodology for evaluating an individual's scholarly research output using multiple criteria decision analysis. A multiple criteria decision making analysis (MCDMA) methodological process is adopted to build a multiple criteria evaluation model. The scholar index gives significant information about a researcher's productivity and the scholarly impact of his or her publications in a single number (s is the number of publications with at least s citations each); together with the cumulative research citation index, it is included in citation databases to cover the multidimensional complexity of scholarly research performance and to support objective evaluations. The scholar index, one of the publication activity indexes, is considered an appropriate scientometric indicator because it smooths over many drawbacks of assessing scholarly output by mere counts of publications (quantity) and citations (quality). Hence, this study uses a set of scholar-index-based indicators for evaluating scholarly researchers. The Google Scholar open science database was used to assess and discuss the scholarly productivity and impact of researchers. Based on the experiment of computing the scholar index and its derivative indexes for a set of researchers on an open research database platform, quantitative methods of assessing scholarly research output were successfully applied to rank researchers. The proposed methodology covers the ranking, the selection of the data on which the scholarly research performance evaluation is based, the analysis of the data, and the presentation of the multiple criteria analysis results.
Keywords: Multiple Criteria Decision Making Analysis, MCDMA, Research Performance Evaluation, Scholar Index, h index, Science Citation Index, Science Efficiency, Cumulative Citation Index, Sciencemetrics
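A short sketch of the scholar index as defined above (the largest s such that at least s publications have at least s citations each), computed from a list of citation counts; the counts below are purely illustrative.

def scholar_index(citations):
    """Largest s such that s publications have at least s citations each."""
    counts = sorted(citations, reverse=True)
    s = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            s = rank
        else:
            break
    return s

# Illustrative citation counts for one researcher's publications.
print(scholar_index([25, 8, 5, 3, 3, 1]))   # 3: three papers have at least 3 citations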
631 A Markov Chain Approximation for ATS Modeling for the Variable Sampling Interval CCC Control Charts
Authors: Y. K. Chen, K. C. Chiou, C. Y. Chen
Abstract:
Cumulative conformance count (CCC) charts are widespread in process monitoring of high-yield manufacturing. Recently, it has been found that the use of a variable sampling interval (VSI) scheme can further enhance the efficiency of the standard CCC chart. The average time to signal (ATS) a shift in the defect rate has become the traditional measure of the efficiency of a chart with the VSI scheme, but determining the ATS is frequently a difficult and tedious task. A simple method based on a finite Markov chain approach for modeling the ATS is developed, and numerical results are given.
Keywords: Cumulative conformance count, variable sampling interval, Markov Chain, average time to signal, control chart.
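A hedged NumPy sketch of the finite Markov chain calculation is given below (the transition values are made up, not derived from a CCC chart): with transition matrix Q among the transient (non-signal) states, a vector h of sampling intervals assigned to those states, and an initial distribution b, the ATS is b'(I − Q)⁻¹h.

import numpy as np

def ats(Q, h, b):
    """Average time to signal for an absorbing Markov chain model of a VSI chart.
    Q: transitions among transient (non-signal) states, h: sampling interval used
    in each state, b: initial probability distribution over the transient states."""
    n = Q.shape[0]
    return b @ np.linalg.solve(np.eye(n) - Q, h)

# Illustrative two-state example (short/long sampling intervals), not chart-derived values.
Q = np.array([[0.60, 0.30],
              [0.20, 0.70]])       # remaining 0.10 in each row is the signal probability
h = np.array([0.5, 2.0])           # hours until the next sample in each state
b = np.array([1.0, 0.0])           # start in the short-interval state
print(ats(Q, h, b))                # expected hours to signal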
630 Military Fighter Aircraft Selection Using Multiplicative Multiple Criteria Decision Making Analysis Method
Authors: C. Ardil
Abstract:
The multiplicative multiple criteria decision making analysis (MCDMA) method is a systematic decision support approach that helps decision makers reach appropriate decisions. The application of multiplicative MCDMA to the military aircraft selection problem is significant for a proper decision making process, which is the decisive factor in minimizing expenditures and increasing defense capability and capacity. Nine military fighter aircraft alternatives were evaluated against ten decision criteria to solve the decision making problem. In this study, the multiplicative MCDMA model aims to evaluate and select an appropriate military fighter aircraft for Air Force fleet planning. The ranking results of the multiplicative MCDMA model were compared with the ranking results of the additive MCDMA, logarithmic MCDMA, and regrettive MCDMA models under the L2 norm data normalization technique to substantiate the robustness of the proposed method. The final ranking results indicate the military fighter aircraft Su-57 as the best available solution.
Keywords: Aircraft Selection, Military Fighter Aircraft Selection, Air Force Fleet Planning, Multiplicative MCDMA, Additive MCDMA, Logarithmic MCDMA, Regrettive MCDMA, Mean Weight, Multiple Criteria Decision Making Analysis, Sensitivity Analysis
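A hedged sketch of the multiplicative (weighted product) scoring step with L2-norm normalization, the combination named in the abstract, is given below; the decision matrix, weights and cost-criterion handling (taking reciprocals of normalized cost values) are illustrative assumptions, not the study's aircraft data or exact procedure.

import numpy as np

def multiplicative_mcdma(matrix, weights, benefit):
    """matrix: alternatives x criteria, weights sum to 1,
    benefit[j] is True for benefit criteria and False for cost criteria."""
    X = matrix / np.linalg.norm(matrix, axis=0)          # L2 norm normalization per criterion
    X = np.where(benefit, X, 1.0 / X)                    # invert cost criteria so larger is better
    return np.prod(X ** weights, axis=1)                 # weighted product aggregation

# Hypothetical decision matrix: 3 alternatives, 3 criteria (range, payload, unit cost).
M = np.array([[1500.0, 8.0, 85.0],
              [1200.0, 7.5, 60.0],
              [1800.0, 9.0, 110.0]])
w = np.array([0.4, 0.3, 0.3])
benefit = np.array([True, True, False])
s = multiplicative_mcdma(M, w, benefit)
print(np.argsort(-s))    # ranking of alternatives, best first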
629 Motion Detection Techniques Using Optical Flow
Authors: A. A. Shafie, Fadhlan Hafiz, M. H. Ali
Abstract:
Motion detection is very important in image processing, and one way of detecting motion is using optical flow. Optical flow cannot be computed locally, since only one independent measurement is available from the image sequence at a point, while the flow velocity has two components; a second constraint is therefore needed. The method used for finding the optical flow in this project assumes that the apparent velocity of the brightness pattern varies smoothly almost everywhere in the image. This technique is later used in developing motion detection software capable of carrying out four types of motion detection. The motion detection software presented in this project can also highlight motion regions, count the motion level, and count the number of objects. Many objects, such as vehicles and humans, can be recognized in video streams by applying the optical flow technique.
Keywords: Background modeling, Motion detection, Optical flow, Velocity smoothness constraint, motion trajectories.
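The two constraints referred to are the brightness constancy equation and the smoothness of the flow field; in the usual Horn-Schunck style formulation,

I_x u + I_y v + I_t = 0, \qquad \min_{u,v} \iint \Big[ (I_x u + I_y v + I_t)^2 + \alpha^2 \big( \lVert \nabla u \rVert^2 + \lVert \nabla v \rVert^2 \big) \Big] \, dx\, dy,

where $(u, v)$ is the flow field, $I_x$, $I_y$, $I_t$ are the spatial and temporal image derivatives, and $\alpha$ weights the smoothness term.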
628 Semi-automatic Background Detection in Microscopic Images
Authors: Alessandro Bevilacqua, Alessandro Gherardi, Ludovico Carozza, Filippo Piccinini
Abstract:
Recent years have seen an increasing use of image analysis techniques in the field of biomedical imaging, in particular in microscopic imaging. The basic step of most image analysis techniques relies on a background image free of objects of interest, whether they are cells or histological samples, to perform further analysis such as segmentation or mosaicing. Commonly, this image consists of an empty field acquired in advance. However, acquiring an empty field is often not feasible; alternatively, it could differ from the background region of the sample actually being studied because of interaction with the organic matter; finally, it could be expensive, for instance in the case of live cell analyses. We propose a non-parametric, general purpose approach in which the background is built automatically from a sequence of images that may even contain objects of interest. The amount of object-free area in each image only affects the overall speed of obtaining the background. Experiments with different kinds of microscopic images prove the effectiveness of our approach.
Keywords: Microscopy, flat field correction, background estimation, image segmentation.
627 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano
Abstract:
A practical and simple self-indexing data structure, the Partitioned Elias-Fano (PEF) Compressed Suffix Array (CSA), is built in linear time for the CSA based on PEF indexes. The PEF-CSA is compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on these data in terms of compression ratio, count time and locate time, except for evenly distributed data such as protein data. The experiments show that the distribution of φ is more important than the alphabet size for the compression ratio: an unevenly distributed φ yields a better compression effect, and the larger the number of hits, the longer the count and locate times.
Keywords: Compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA.
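For background, the sketch below shows plain (non-partitioned) Elias-Fano encoding of a monotone integer sequence, the building block that PEF splits into blocks; it illustrates the representation only, not the PEF-CSA construction itself.

import math

def elias_fano_encode(values, universe):
    """Encode a sorted list of integers in [0, universe) as (low_bits, high_bits, l)."""
    n = len(values)
    l = max(0, int(math.floor(math.log2(universe / n))))  # number of low bits per value
    low = [v & ((1 << l) - 1) for v in values]             # low parts, l bits each
    # High parts in unary: bit (v >> l) + i is set for the i-th value.
    high = [0] * (n + (universe >> l) + 1)
    for i, v in enumerate(values):
        high[(v >> l) + i] = 1
    return low, high, l

def elias_fano_access(i, low, high, l):
    """Recover the i-th value: locate the (i+1)-th set bit in the high bitvector."""
    ones = -1
    for pos, bit in enumerate(high):
        ones += bit
        if ones == i:
            return ((pos - i) << l) | low[i]

seq = [3, 4, 7, 13, 14, 15, 21, 43]
low, high, l = elias_fano_encode(seq, universe=64)
print([elias_fano_access(i, low, high, l) for i in range(len(seq))])  # recovers seq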