This work is devoted to the construction and analysis of a stochastic gradient descent method with momentum inspired by the Runge–Kutta–Chebyshev method. The stability interval of this type of explicit scheme can be substantially enlarged in comparison with standard methods, which opens the way to optimization methods with better stability. The study focuses on unconstrained convex problems. Theoretical results are presented for sufficiently smooth convex functions under general assumptions: sufficient conditions that guarantee the convergence of the proposed method are formulated, and an expression for the convergence rate is obtained. The theoretical results are supported by numerical experiments on quadratic and non-quadratic convex functions, and the numerical results are compared with those of other methods. The proposed method with the smallest possible number of stages requires the fewest epochs and the least time to obtain the minimizer with the prescribed accuracy.
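The abstract does not reproduce the paper's stage coefficients, so as a minimal sketch, the classical heavy-ball momentum iteration underlying such methods can be written as follows; the step size `lr`, momentum factor `beta`, and the quadratic test function are illustrative assumptions, not the paper's Runge–Kutta–Chebyshev-derived parameters.

```python
import numpy as np

def momentum_gd(grad, x0, lr=0.05, beta=0.9, steps=200):
    """Generic heavy-ball momentum iteration:
        v_{k+1} = beta * v_k - lr * grad(x_k)
        x_{k+1} = x_k + v_{k+1}
    The RKC-inspired method of the paper uses stage-dependent
    coefficients chosen to enlarge the stability interval;
    those coefficients are not reproduced here.
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v - lr * grad(x)
        x = x + v
    return x

# Illustrative ill-conditioned quadratic f(x) = 0.5 * x^T A x,
# whose minimizer is the origin.
A = np.diag([1.0, 50.0])
x_min = momentum_gd(lambda x: A @ x, x0=[1.0, 1.0])
```

With these (assumed) parameters the iterates contract toward the origin; enlarging the stability interval is precisely what allows explicit schemes of this family to tolerate larger effective step sizes on stiff (ill-conditioned) problems.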
Original language: English
Journal: Numerical Algorithms
Status: E-pub ahead of print - Oct 2025

ID: 143011546