Boosting Method for Automated Feature Space Discovery in Supervised Quantum Machine Learning Models

Authors: Vladimir Rastunkov, Jae-Eun Park, Abhijit Mitra, Brian Quanz, Steve Wood, Christopher Codella, Heather Higgins, Joseph Broz

Abstract:

Quantum Support Vector Machines (QSVM) have become an important tool in research and applications of quantum kernel methods. In this work we propose a boosting approach for building ensembles of QSVM models and assess the resulting performance improvement across multiple datasets. The approach is derived from ensemble-building practices that have worked well in traditional machine learning and should therefore push the limits of quantum model performance even further. We find that in some cases a single QSVM model with tuned hyperparameters is sufficient to model the data, while in others an ensemble of QSVMs that are forced to explore the feature space via the proposed method is beneficial.

Keywords: QSVM, Quantum Support Vector Machines, quantum kernel, boosting, ensemble.
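
To picture the kind of boosting the abstract describes, the sketch below runs a generic AdaBoost-style loop over QSVM base learners. It is illustrative only and is not the authors' exact feature-space-exploration procedure; it assumes a recent qiskit-machine-learning (QSVC, FidelityQuantumKernel) and scikit-learn for a toy dataset, and the feature map, dataset, and number of boosting rounds are arbitrary choices made for the example.

import numpy as np
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

# Toy 2-feature dataset (2 qubits), for illustration only.
X, y = make_moons(n_samples=60, noise=0.2, random_state=7)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=7)
y_tr_pm, y_te_pm = 2 * y_tr - 1, 2 * y_te - 1      # relabel to {-1, +1}

# Quantum kernel from a ZZ feature map; reps/entanglement are the kind of
# hyperparameters a single tuned QSVM baseline would sweep over.
kernel = FidelityQuantumKernel(feature_map=ZZFeatureMap(feature_dimension=2, reps=2))

n_rounds = 5
weights = np.full(len(X_tr), 1.0 / len(X_tr))       # uniform sample weights
models, alphas = [], []

for _ in range(n_rounds):
    qsvm = QSVC(quantum_kernel=kernel)               # fresh base learner each round
    qsvm.fit(X_tr, y_tr_pm, sample_weight=weights)
    pred = qsvm.predict(X_tr)
    err = np.sum(weights * (pred != y_tr_pm)) / np.sum(weights)
    err = np.clip(err, 1e-10, 1 - 1e-10)             # guard against log(0) / division by zero
    alpha = 0.5 * np.log((1 - err) / err)            # learner weight (AdaBoost)
    weights *= np.exp(-alpha * y_tr_pm * pred)       # up-weight misclassified samples
    weights /= weights.sum()
    models.append(qsvm)
    alphas.append(alpha)

# Weighted-vote ensemble prediction on the held-out set.
scores = sum(a * m.predict(X_te) for a, m in zip(alphas, models))
y_hat = np.where(scores >= 0, 1, -1)
print("Ensemble test accuracy:", np.mean(y_hat == y_te_pm))

Because QSVC subclasses scikit-learn's SVC, it accepts per-sample weights in fit(), which is what allows a boosting loop of this form to be layered on top of the quantum model without modifying the model itself.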

