Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30075
Self Organizing Mixture Network in Mixture Discriminant Analysis: An Experimental Study

Authors: Nazif Çalış, Murat Erişoğlu, Hamza Erol, Tayfun Servi

Abstract:

In recent work on mixture discriminant analysis (MDA), the expectation-maximization (EM) algorithm is used to estimate the parameters of Gaussian mixtures. However, the final parameter estimates depend on the initial values supplied to the EM algorithm: two runs of EM on the same data set can yield different parameter estimates, which in turn affects the classification accuracy of MDA. To overcome this problem, we use the Self Organizing Mixture Network (SOMN) algorithm to estimate the parameters of the Gaussian mixtures in MDA, since SOMN is more robust when the initial parameter values are chosen at random [5]. We demonstrate the effectiveness of this method on the popular simulated waveform data sets and the real glass data set.
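To illustrate the initialization sensitivity the abstract refers to, the following is a minimal sketch (not the authors' implementation) of EM for a two-component one-dimensional Gaussian mixture; the data set, function name, and parameter choices are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data from two Gaussians with means -2 and 2, unit variance.
data = np.concatenate([rng.normal(-2, 1, 200), rng.normal(2, 1, 200)])

def em_gmm(x, mu_init, n_iter=100):
    """Basic EM for a two-component 1-D Gaussian mixture.

    mu_init: initial component means; mixing weights and variances
    start uniform. The fixed point EM converges to depends on mu_init,
    which is the sensitivity the SOMN approach aims to avoid.
    """
    pi = np.array([0.5, 0.5])        # mixing weights
    sigma = np.array([1.0, 1.0])     # component standard deviations
    mu = np.asarray(mu_init, dtype=float)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
                  / (sigma * np.sqrt(2 * np.pi))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return mu

# A reasonable initialization recovers means near -2 and 2; a poor or
# random initialization can converge to a different local optimum.
print(np.sort(em_gmm(data, [-1.0, 1.0])))
```

Running EM from several random starts and comparing the resulting log-likelihoods is the usual workaround; the paper's point is that SOMN sidesteps this restart procedure.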

Keywords: Self Organizing Mixture Network, Mixture Discriminant Analysis, Waveform Datasets, Glass Identification, Mixture of Multivariate Normal Distributions

Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1061316


References:


[1] Hastie T. and Tibshirani R. (1996), Discriminant Analysis by Gaussian Mixtures, Journal of the Royal Statistical Society. Series B (Methodological), Vol. 58, No. 1, pp. 155-176
[2] Halbe Z. and Aladjem M. (2005), Model-based mixture discriminant analysis: an experimental study, Pattern Recognition, 38, 437-440
[3] Bashir S. and Carter E. M. (2005), Robust Reduced Rank Mixture Discriminant Analysis, Communications in Statistics Theory and Methods, 34, 135-145
[4] Bashir S. and Carter E. M. (2005), High breakdown mixture discriminant analysis, Journal of Multivariate Analysis, 93, 102-111
[5] Yin H. and Allinson N. M. (2001), Self-Organizing Mixture Networks for Probability Density Estimation, IEEE Transactions on Neural Networks, Vol. 12, No. 2, pp. 405-411
[6] A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," J. Roy. Statist. Soc. B, vol. 39, pp. 1-38, 1977.
[7] S. Kullback and R. A. Leibler, "On information and sufficiency," Ann. Math. Statist., vol. 22, pp. 79-86, 1951.
[8] H. Robbins and S. Monro, "A stochastic approximation method," Ann. Math. Statist., vol. 22, pp. 400-407, 1951.
[9] Murphy, P. M., Aha, D. W. (1995). UCI Repository of Machine Learning Databases. Irvine, CA: University of California, Dept. of Information and Computer Science.
[10] Breiman, L., Friedman, J., Olshen, R. and Stone, C. (1984), Classification and Regression Trees Belmont; Wadsworth
[11] Xu, L. and Jordan, M. I. (1993). Unsupervised learning by EM algorithm based on finite mixture of Gaussians. In Proc. World Congress Neural Networks, vol. 2, pp. 431-434, Portland, OR, USA.
[12] McLachlan, G.J., Peel, D., Basford, K.E., and Adams, P. (1999). The EMMIX software for the fitting of mixtures of normal and tcomponents. Journal of Statistical Software 4, No. 2.
[13] Yin, H. and Allinson, N. M. (1997). Comparison of a Bayesian SOM with the EM algorithm for Gaussian mixtures. Proc. Workshop on Self- Organising Maps (WSOM'97), pp. 118-123.