An Accelerated Stochastic Gradient Method with Momentum
Authors: Liang Liu, Xiaopeng Luo
Abstract:
In this paper, we propose an accelerated stochastic gradient method with momentum. The momentum term is a weighted average of the generated gradients, with weights that decay in inverse proportion to the number of iterations. By contrast, stochastic gradient descent with momentum (SGDM) generates its momentum term with weights that decay exponentially with the number of iterations. Variants of SGDM that retain exponential decay weights but adopt complicated, hard-to-interpret update rules have been proposed to achieve better performance, whereas the momentum update rule of our method remains as simple as that of SGDM. We provide theoretical convergence analyses, which show that both the exponential decay weights and our inverse proportional decay weights confine the variance of the parameters' moving direction to a region. Experimental results show that our method works well on many practical problems and outperforms SGDM.
Keywords: exponential decay rate weight, gradient descent, inverse proportional decay rate weight, momentum
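The following is a minimal, self-contained sketch (not the authors' code) of the contrast the abstract draws: classical SGDM, whose weights on past gradients decay exponentially, versus a momentum rule whose averaging coefficient shrinks roughly like 1/t. The schedule beta_t = beta / (beta + t - 1), the noisy quadratic objective, and all function names are illustrative assumptions, not the paper's exact formulation.

import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x):
    # Stochastic gradient of f(x) = 0.5 * ||x||^2 (toy oracle with Gaussian noise).
    return x + 0.1 * rng.standard_normal(x.shape)

def sgdm(x0, lr=0.1, beta=0.9, steps=200):
    # Classical SGDM: the weight of a gradient from a steps ago decays like beta**a.
    x, m = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        m = beta * m + (1.0 - beta) * noisy_grad(x)
        x -= lr * m
    return x

def inverse_decay_momentum(x0, lr=0.1, beta=2.0, steps=200):
    # Illustrative assumption: the averaging coefficient beta_t = beta / (beta + t - 1)
    # shrinks like 1/t, so past gradients are down-weighted polynomially rather than
    # geometrically. With beta = 1 the momentum term reduces to the plain running
    # mean of all gradients seen so far.
    x, m = x0.copy(), np.zeros_like(x0)
    for t in range(1, steps + 1):
        beta_t = beta / (beta + t - 1.0)
        m = (1.0 - beta_t) * m + beta_t * noisy_grad(x)
        x -= lr * m
    return x

x0 = np.full(10, 5.0)
print("SGDM distance to optimum:         ", np.linalg.norm(sgdm(x0)))
print("inverse-decay distance to optimum:", np.linalg.norm(inverse_decay_momentum(x0)))

Both routines share the same per-step cost and state (one momentum vector); the only difference is whether the averaging coefficient is constant or shrinks with t, which matches the abstract's claim that the proposed update rule is as simple as SGDM's.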