Density Estimation using Generalized Linear Model and a Linear Combination of Gaussians
Authors: Aly Farag, Ayman El-Baz, Refaat Mohamed
Abstract:
In this paper we present a novel approach to density estimation. The proposed approach uses the logistic regression model to obtain an initial estimate of the given empirical density. Since the empirical data do not exactly follow the logistic regression model, there will be a deviation between the empirical density and the density estimated by the logistic regression model. This deviation may be positive and/or negative. We model this deviation with a linear combination of Gaussians (LCG) with positive and negative components, and we use the expectation-maximization (EM) algorithm to estimate the LCG parameters. Experiments on real images demonstrate the accuracy of our approach.
Keywords: Logistic regression model, Expectation-maximization, Segmentation.
Digital Object Identifier (DOI): doi.org/10.5281/zenodo.1076444
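The abstract's second stage — fitting a combination of Gaussians to a residual density by expectation-maximization — can be illustrated with classical EM for a 1-D Gaussian mixture. This is a simplified sketch, not the paper's method: it assumes ordinary non-negative mixing weights, whereas the paper's LCG model additionally allows negative components. All function and variable names are illustrative.

```python
import numpy as np

def em_gaussian_mixture(x, k=2, iters=50):
    """Fit a k-component 1-D Gaussian mixture to samples x via EM.

    Simplified illustration with non-negative weights; the LCG model
    in the paper also admits negative-weight components.
    """
    n = x.size
    w = np.full(k, 1.0 / k)                              # mixing weights
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))      # spread-out initial means
    var = np.full(k, x.var())                            # broad initial variances

    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        d = x[:, None] - mu[None, :]
        pdf = np.exp(-0.5 * d**2 / var) / np.sqrt(2.0 * np.pi * var)
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk
    return w, mu, var
```

With a clearly bimodal sample, the recovered means land near the two modes and the weights reflect the relative mass of each mode; the signed-component extension changes only the M-step constraints, not this overall E/M structure.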