{"title":"Principle Components Updates via Matrix Perturbations","authors":"Aiman Elragig, Hanan Dreiwi, Dung Ly, Idriss Elmabrook","volume":128,"journal":"International Journal of Computer and Information Engineering","pagesStart":968,"pagesEnd":977,"ISSN":"1307-6892","URL":"https:\/\/publications.waset.org\/pdf\/10007847","abstract":"This paper presents a new approach to online principal component analysis (OPCA). Given a data matrix X ∈ R^{m×n}, we characterise the online updates of its covariance as a matrix perturbation problem. Up to the principal components, it turns out that online updates of the batch PCA can be captured by a symmetric matrix perturbation of the batch covariance matrix. We show that as n → n0 ≫ 1, the batch covariance and its update become almost identical. Finally, we utilize our new setup of online updates to derive a bound on the angle distance between the principal
components of X and those of its update.","references":"[1] J. Weng, Y. Zhang, W. S. Hwang, Candid covariance-free incremental principal component analysis. IEEE Trans. Pattern Anal. Mach. Intell. 25(8) (2003) 1034-1040.
\r\n[2] Y. Zhang, J. Weng, Convergence analysis of complementary candid incremental principal component analysis. Technical Report MSU-CSE-01-23, Department of Computer Science and Engineering, Michigan State University, East Lansing, MI.
\r\n[3] G. Stewart, J. Sun, Matrix Perturbation Theory. New York: Academic Press, 1990.
\r\n[4] R. A. Horn, C. Johnson, Matrix Analysis. Cambridge University Press, 1990.
\r\n[5] O. Fontenla-Romero, et al., Online machine learning. Efficiency and Scalability Methods for Computational Intellect 27 (2013).
\r\n[6] S. Provatas, An online machine learning algorithm for heat load forecasting in district heating systems. (2014).
\r\n[7] H. Wang, D. Pi, Y. Sun, Online SVM regression algorithm-based adaptive inverse control. Neurocomputing 70(4) (2007) 952-959.
\r\n[8] W. Barbakh, C. Fyfe, Online clustering algorithms. International Journal of Neural Systems 18(3) (2008) 185-194.
\r\n[9] H. Cardot, D. Degras, Online principal component analysis in high dimension: which algorithm to choose? Technical report arXiv:1511.03688 (2015).
\r\n[10] Z. Karnin, E. Liberty, Online PCA with spectral bounds. JMLR: Workshop and Conference Proceedings 40 (2015) 1-12.
\r\n[11] A. Balsubramani, S. Dasgupta, Y. Freund, The fast convergence of incremental PCA. Advances in Neural Information Processing Systems 26 (2013) 3174-3182.
\r\n[12] H. Zhao, P. C. Yuen, J. T. Kwok, A novel incremental principal component analysis and its application for face recognition. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 36(4) (2006) 873-886.
\r\n[13] A. D. Iodice, A. Markos, Low-dimensional tracking of association structures in categorical data. Stat. Comput.
25(5) (2015) 1009-1022.
\r\n[14] J. Tang, W. Yu, T. Chai, L. Zhao, Online principal component analysis with applications to process modeling. Neurocomputing 82 (2012) 167-178.
\r\n[15] T. Budavari, V. Wild, A. Dobos, C. Yip, Reliable eigenspectra for new generation surveys. Mon. Not. R. Astron. Soc. 394 (2009) 1496-1502.
\r\n[16] H. Zha, H. D. Simon, On updating problems in latent semantic indexing. SIAM J. Sci. Comput. 21(2) (1999) 782-791.
\r\n[17] R. A. Fisher, The use of multiple measurements in taxonomic problems. Annals of Eugenics 7(2) (1936) 179-188.
\r\n[18] J. Nie, W. Kotlowski, M. K. Warmuth, Online PCA with optimal regret. Journal of Machine Learning Research 17(173) (2016) 1-49.
\r\n[19] D. Garber, E. Hazan, T. Ma, Online learning of eigenvectors. In International Conference on Machine Learning (2015) 560-568.
\r\n[20] A. Allahyar, H. S. Yazdi, Online discriminative component analysis feature extraction from stream data with domain knowledge. Intelligent Data Analysis 18(5) (2014) 927-951.
\r\n[21] M. Herbster, S. Pasteris, M. Pontil, Predicting a switching sequence of graph labelings. Journal of Machine Learning Research 16 (2015) 2003-2022.
\r\n[22] G. H. Golub, C. F. Van Loan, Matrix Computations, fourth edition. Johns Hopkins Studies in the Mathematical Sciences, Johns Hopkins University Press, Baltimore, MD, 2013.
\r\n[23] J. B. Tenenbaum, V. de Silva, J. C. Langford, A global geometric framework for nonlinear dimensionality reduction. Science 290(5500) (2000) 2319-2323.
\r\n[24] G. E. Hinton, R. R. Salakhutdinov, Reducing the dimensionality of data with neural networks. Science 313(5786) (2006) 504-507.
\r\n[25] E. Bingham, H. Mannila, Random projection in dimensionality reduction: applications to image and text data. In Proceedings of the Seventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (2001) 245-250.
\r\n[26] S. T. Roweis, L. K. Saul, Nonlinear dimensionality reduction by locally linear embedding. Science 290(5500) (2000) 2323-2326.
\r\n[27] O. Fontenla-Romero, B. Guijarro-Berdinas, D. Martinez-Rego, B. Perez-Sanchez, D. Peteiro-Barral, Online machine learning. Efficiency and Scalability Methods for Computational Intellect 27 (2013).
\r\n[28] G. H. Golub, Some modified matrix eigenvalue problems. SIAM Review 15 (1973) 318-334.
\r\n[29] M. Gu, S. Eisenstat, A stable and efficient algorithm for the rank-one modification of the symmetric eigenproblem. SIAM J. Matrix Anal. Appl. 15 (1994) 1266-1276.
\r\n[30] W. Li, H. Yue, S. Valle-Cervantes, S. Qin, Recursive PCA for adaptive process monitoring. J. Process Control 10 (2000) 471-486.
\r\n[31] A. Hegde, J. Principe, D. Erdogmus, U. Ozertem, Y. Rao, H. Peddaneni, Perturbation-based eigenvector updates for on-line principal components analysis and canonical correlation analysis. J. VLSI Sign. Process. 45 (2006) 85-95.
\r\n[32] J. Tang, W. Yu, T. Chai, L. Zhao, On-line principal component analysis with application to process modeling. Neurocomputing 82 (2012) 167-178.
\r\n[33] T. D. Sanger, Optimal unsupervised learning in a single-layer linear feedforward neural network. Neural Netw. 2 (1989) 459-473.
\r\n[34] E. Oja, J. Karhunen, On stochastic approximation of the eigenvectors and eigenvalues of the expectation of a random matrix. Journal of Mathematical Analysis and Applications 106(1) (1985) 69-84.","publisher":"World Academy of Science, Engineering and Technology","index":"Open Science Index 128, 2017"}