Random Projections for Dimensionality Reduction in ICA
Authors: Sabrina Gaito, Andrea Greppi, Giuliano Grossi
Abstract:
In this paper we present a technique to speed up ICA based on the idea of reducing the dimensionality of the data set while preserving the quality of the results. In particular, we refer to the FastICA algorithm, which uses kurtosis as the statistical property to be maximized. By performing a Johnson-Lindenstrauss-like projection of the data set, we find the minimum dimensionality reduction rate ρ, defined as the ratio between the size k of the reduced space and the original size d, which guarantees a narrow confidence interval for the kurtosis estimator with a high confidence level. The derived dimensionality reduction rate depends on a system control parameter β that is easily computed a priori from the observations alone. Extensive simulations have been carried out on different sets of real-world signals. They show that the achievable dimensionality reduction is substantial, that it preserves the quality of the decomposition, and that it dramatically speeds up FastICA. Conversely, a set of signals on which the estimated reduction rate is greater than 1 exhibits poor decomposition results when reduced, validating the reliability of the parameter β. We are confident that our method will lead to a better approach to real-time applications.
Keywords: Independent Component Analysis, FastICA algorithm, higher-order statistics, Johnson-Lindenstrauss lemma.
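A minimal sketch of the overall pipeline in Python, assuming scikit-learn's FastICA with the kurtosis-based contrast (fun="cube") and a Gaussian Johnson-Lindenstrauss-style projection along the sample axis. The projection matrix, the reduction rate ρ = k/d used below, and the synthetic mixtures are illustrative assumptions, and the paper's β-based criterion for choosing ρ is not reproduced here:

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic demonstration data: n mixed signals, d samples each.
n, d = 4, 20000
t = np.linspace(0, 8, d)
S = np.stack([np.sin(2 * t),
              np.sign(np.cos(3 * t)),
              rng.laplace(size=d),
              np.sin(5 * t) * np.cos(t)])
A = rng.normal(size=(n, n))               # unknown mixing matrix
X = A @ S                                 # observed mixtures, shape (n, d)
X -= X.mean(axis=1, keepdims=True)        # center each mixture

# Johnson-Lindenstrauss-style projection: compress the d samples
# down to k, i.e. reduction rate rho = k / d (value assumed here).
k = 2000
R = rng.normal(scale=1.0 / np.sqrt(k), size=(d, k))
X_red = X @ R                             # shape (n, k)

# Estimate the unmixing matrix on the *reduced* data using the
# kurtosis-based contrast; this is the expensive step being sped up.
ica = FastICA(n_components=n, fun="cube", random_state=0)
ica.fit(X_red.T)                          # scikit-learn expects (samples, features)

# Apply the unmixing matrix learned on the sketch to the full data.
S_hat = ica.components_ @ X               # recovered sources, shape (n, d)

The saving comes from the fit step: each FastICA sweep scales with the number of samples it sees, so estimating the unmixing matrix from k rather than d samples cuts the per-iteration cost by roughly the factor ρ, while the recovered sources are still computed at full resolution.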
Digital Object Identifier (DOI): https://doi.org/10.5281/zenodo.1055411
References:
[1] P. Comon, "Independent component analysis - a new concept?" Signal Processing, vol. 36, pp. 287-314, 1994.
[2] C. Jutten and J. Herault, "Blind separation of sources, Part I: An adaptive algorithm based on neuromimetic architecture," Signal Processing, vol. 24, pp. 1-10, 1991.
[3] A. Cichocki and S. Amari, Adaptive Blind Signal and Image Processing: Learning Algorithms and Applications. John Wiley & Sons, 2002.
[4] A. Hyvärinen and E. Oja, "A fast fixed-point algorithm for independent component analysis," Neural Computation, vol. 9, pp. 1483-1492, 1997.
[5] W. B. Johnson and J. Lindenstrauss, "Extensions of Lipschitz mappings into a Hilbert space," Contemporary Mathematics, vol. 26, pp. 189-206, 1984.
[6] S. Amari and A. Cichocki, "Recurrent neural networks for blind separation of sources," in Proceedings of the International Symposium on Nonlinear Theory and Applications, vol. I, 1995, pp. 37-42.