Research output: Contribution to journal › Article › Peer review
Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method. / Кривовичев, Герасим Владимирович; Кириллов, Роман Борисович.
In: Journal of Computational and Applied Mathematics, Vol. 483, 117434, 01.09.2026.
TY - JOUR
T1 - Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method
AU - Кривовичев, Герасим Владимирович
AU - Кириллов, Роман Борисович
PY - 2026/9/1
Y1 - 2026/9/1
N2 - This paper is devoted to the construction of iterative gradient methods for unconstrained convex optimization based on the explicit second-order non-standard Lagrange–Burmann Runge–Kutta method. The preconditioned gradient descent method and its modification with momentum are constructed. Convergence conditions are obtained for the case of a strongly convex quadratic function and its perturbation, together with analytical expressions for the parameters that guarantee the optimal convergence rate. The theoretical analysis is supported by numerical examples from different applications (2D and 3D problems for the Poisson equation, minimization of an integral functional, a problem for a nonlinear integro-differential equation, regularized logistic regression on different datasets, and minimization of the sum of a quadratic function and a smoothed Huber loss function). The methods are compared with well-known optimal methods for convex unconstrained optimization (gradient descent, the Polyak heavy ball, and Nesterov methods). As demonstrated on the considered examples, the proposed optimal method with momentum converges faster and requires less computation time than the other methods.
AB - This paper is devoted to the construction of iterative gradient methods for unconstrained convex optimization based on the explicit second-order non-standard Lagrange–Burmann Runge–Kutta method. The preconditioned gradient descent method and its modification with momentum are constructed. Convergence conditions are obtained for the case of a strongly convex quadratic function and its perturbation, together with analytical expressions for the parameters that guarantee the optimal convergence rate. The theoretical analysis is supported by numerical examples from different applications (2D and 3D problems for the Poisson equation, minimization of an integral functional, a problem for a nonlinear integro-differential equation, regularized logistic regression on different datasets, and minimization of the sum of a quadratic function and a smoothed Huber loss function). The methods are compared with well-known optimal methods for convex unconstrained optimization (gradient descent, the Polyak heavy ball, and Nesterov methods). As demonstrated on the considered examples, the proposed optimal method with momentum converges faster and requires less computation time than the other methods.
KW - Convex optimization
KW - Iterative methods
KW - Lagrange–Burmann expansion
KW - Runge–Kutta methods
UR - https://www.mendeley.com/catalogue/7dea730d-f775-3255-ba21-00689e8b1e53/
U2 - 10.1016/j.cam.2026.117434
DO - 10.1016/j.cam.2026.117434
M3 - Article
VL - 483
JO - Journal of Computational and Applied Mathematics
JF - Journal of Computational and Applied Mathematics
SN - 0377-0427
M1 - 117434
ER -
ID: 149090301
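The abstract compares the proposed scheme against classical baselines, among them the Polyak heavy ball. For orientation only, a minimal sketch of that baseline (not the paper's Lagrange–Burmann-based method, which is described only in the full article) on a strongly convex quadratic, using the classical optimal step size and momentum for a spectrum in [mu, L]; the test function and parameter values here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def heavy_ball(grad, x0, lr, beta, n_iter=1000):
    """Polyak heavy ball: x_{k+1} = x_k - lr*grad(x_k) + beta*(x_k - x_{k-1})."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(n_iter):
        x_next = x - lr * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

# Illustrative strongly convex quadratic f(x) = 0.5 x^T A x,
# eigenvalues mu = 1, L = 10, minimizer x* = 0.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
mu, L = 1.0, 10.0

# Classical optimal heavy-ball parameters for quadratics
lr = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

x_final = heavy_ball(grad, np.array([5.0, -3.0]), lr, beta)
print(np.linalg.norm(x_final))  # distance to the minimizer x* = 0
```

With these parameters the iterates contract at the rate sqrt(beta) = (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)) per step, which is why the momentum term pays off on ill-conditioned quadratics compared with plain gradient descent.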