Accelerating Quantum Chemistry Calculations: Machine Learning for Efficient Evaluation of Electron-Repulsion Integrals

Authors: Nishant Rodrigues, Nicole Spanedda, Chilukuri K. Mohan, Arindam Chakraborty

Abstract:

A crucial objective in quantum chemistry is the computation of the energy levels of chemical systems. This task requires electron-repulsion integrals as inputs, and the steep computational cost of evaluating these integrals poses a major numerical challenge to the efficient implementation of quantum chemistry software. This work presents a moment-based machine-learning approach for the efficient evaluation of electron-repulsion integrals. The integrals were approximated as linear combinations of a small number of moments, and machine-learning algorithms were applied to estimate the coefficients of these combinations. A random forest combined with recursive feature elimination was used to identify promising features; this approach performed best for learning the sign of each coefficient, but not its magnitude. A neural network with two hidden layers was then used to learn the coefficient magnitudes, together with an iterative feature-masking approach that compresses the input vector by identifying a small subset of orbitals whose coefficients suffice for the quantum-state energy computation. Finally, a small ensemble of neural networks, fused with a median rule, was shown to improve on the results of a single network.
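The two-stage pipeline described in the abstract (sign classification via random forest with recursive feature elimination, then magnitude regression via an ensemble of two-hidden-layer networks with median fusion) can be sketched on synthetic data. The feature descriptors, network sizes, ensemble size, and scikit-learn models below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for moment/orbital descriptors and target coefficients;
# the paper's actual features are not specified here.
X = rng.normal(size=(500, 20))
true_coef = rng.normal(size=20) * (rng.random(20) > 0.5)
y = X @ true_coef + 0.01 * rng.normal(size=500)

# Stage 1: random forest + recursive feature elimination learns the
# coefficient sign and identifies a small subset of promising features.
sign = (y > 0).astype(int)
rfe = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
          n_features_to_select=8)
rfe.fit(X, sign)
X_sel = X[:, rfe.support_]  # compressed input vector

# Stage 2: a small ensemble of two-hidden-layer networks learns the
# coefficient magnitude; predictions are fused with the median rule.
mag = np.abs(y)
ensemble = [MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500,
                         random_state=s).fit(X_sel, mag) for s in range(3)]
pred_mag = np.median([m.predict(X_sel) for m in ensemble], axis=0)

# Recombine predicted sign and magnitude into the coefficient estimate.
pred = np.where(rfe.predict(X) == 1, 1.0, -1.0) * pred_mag
```

Splitting the problem this way lets a classifier handle the discrete sign decision, where tree ensembles excel, while the regressors only need to fit the smooth magnitude surface over the compressed feature set.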

Keywords: Quantum energy calculations, atomic orbitals, electron-repulsion integrals, ensemble machine learning, random forests, neural networks, feature extraction.

