Convergence of Gradient Method with Momentum for Back-Propagation Neural Networks


Abstract

In this work, a gradient method with momentum for back-propagation (BP) neural networks is considered. The momentum coefficient is chosen adaptively to accelerate and stabilize the learning of the network weights. Corresponding convergence results are proved.
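The abstract describes a gradient method with momentum in which the momentum coefficient adapts during training. The paper's exact adaptive rule is not stated here, so the sketch below uses an illustrative choice (shrinking the momentum coefficient when the gradient is large) on a simple 1-D quadratic rather than a full BP network; the function names and the rule itself are assumptions for illustration only.

```python
# Minimal sketch: gradient descent with an adaptively chosen momentum
# coefficient, applied to f(w) = 0.5 * (w - 3)^2. The adaptive rule
# mu = mu_max / (1 + |g|) is a hypothetical stand-in for the paper's
# scheme: it damps momentum when gradients are large, which tends to
# stabilize the early iterations.

def grad(w):
    """Gradient of f(w) = 0.5 * (w - 3)**2."""
    return w - 3.0

def train(w0=0.0, eta=0.1, mu_max=0.9, steps=200):
    w, delta_prev = w0, 0.0
    for _ in range(steps):
        g = grad(w)
        # Adaptive momentum coefficient (illustrative, not the paper's rule)
        mu = mu_max / (1.0 + abs(g))
        # Momentum update: new step = gradient step + scaled previous step
        delta = -eta * g + mu * delta_prev
        w += delta
        delta_prev = delta
    return w

print(train())  # converges toward the minimizer w* = 3
```

With a fixed large momentum coefficient the early iterates can overshoot; scaling the coefficient by the gradient magnitude, as above, is one common way to trade acceleration near the optimum against stability far from it.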


How to Cite

Convergence of Gradient Method with Momentum for Back-Propagation Neural Networks. (2018). Journal of Computational Mathematics, 26(4), 613-623. https://gsp.tricubic.dev/JCM/article/view/11901