Matrix Completion with Heterogeneous Observation Cost Using Sparsity-Number of Column-Space

Authors: Ilqar Ramazanli

Abstract:

The matrix completion problem has been studied broadly under many underlying conditions. In many real-life scenarios, however, entries in distinct columns or at distinct positions can have different observation costs. In this paper, we explore this generalization in the adaptive setting. We study the problem under two cost models. In the first, entries in different columns have different observation costs, but every entry within a given column has the same cost. In the second, any two entries may have different observation costs, whether they lie in the same column or in different ones. We provide a complexity analysis of our algorithms and give tightness guarantees.
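To make the first cost model concrete, the following is a minimal sketch of cost-aware adaptive column-space sampling, in the spirit of adaptive completion methods: cheap columns are probed first, and a column the current basis cannot explain is purchased in full. The function name `adaptive_complete`, the probe size `rank + 2`, and the tolerance are illustrative assumptions, not the algorithm analyzed in this paper.

```python
import numpy as np

def adaptive_complete(M, costs, rank, tol=1e-6, seed=0):
    """Cost-aware adaptive column-space completion (illustrative sketch).

    Columns are probed in increasing order of per-column cost. A probed
    column the current basis cannot explain is observed in full and added
    to the basis; otherwise it is reconstructed from the probed entries.
    """
    rng = np.random.default_rng(seed)
    n, m = M.shape
    probe = rank + 2                    # entries sampled per probe (assumption)
    basis = []                          # fully observed spanning columns
    Mhat = np.zeros_like(M, dtype=float)
    spent = 0.0
    for j in np.argsort(costs):         # probe cheap columns first
        rows = rng.choice(n, size=probe, replace=False)
        partial = M[rows, j].astype(float)
        spent += probe * costs[j]       # pay for the probed entries
        if basis:
            B = np.column_stack(basis)[rows, :]
            coef, *_ = np.linalg.lstsq(B, partial, rcond=None)
            residual = np.linalg.norm(B @ coef - partial)
        else:
            residual = np.linalg.norm(partial)
        if residual > tol:              # new direction: buy the whole column
            spent += (n - probe) * costs[j]
            basis.append(M[:, j].astype(float))
            Mhat[:, j] = M[:, j]
        elif basis:                     # in span: reconstruct from the probe
            Mhat[:, j] = np.column_stack(basis) @ coef
        # else: the column probed as all zeros and is left as zero
    return Mhat, spent
```

On an exactly low-rank matrix this recovers every column while paying full price for only about `rank` columns, so the total cost is far below observing every entry; the heterogeneous costs enter only through the probing order here, which is what the per-column cost model changes relative to uniform-cost adaptive sampling.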

Keywords: Matrix completion, adaptive learning, heterogeneous cost, matroid optimization.

