Standard

Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method. / Кривовичев, Герасим Владимирович; Кириллов, Роман Борисович.

In: Journal of Computational and Applied Mathematics, Vol. 483, 117434, 01.09.2026.

Research output: Contribution to journal › Article › Peer review

Harvard

Кривовичев, ГВ & Кириллов, РБ 2026, 'Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method', Journal of Computational and Applied Mathematics, vol. 483, 117434. https://doi.org/10.1016/j.cam.2026.117434

APA

Кривовичев, Г. В., & Кириллов, Р. Б. (2026). Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method. Journal of Computational and Applied Mathematics, 483, [117434]. https://doi.org/10.1016/j.cam.2026.117434

Vancouver

Кривовичев ГВ, Кириллов РБ. Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method. Journal of Computational and Applied Mathematics. 2026 Sep 1;483:117434. https://doi.org/10.1016/j.cam.2026.117434

Author

Кривовичев, Герасим Владимирович ; Кириллов, Роман Борисович. / Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method. In: Journal of Computational and Applied Mathematics. 2026 ; Vol. 483.

BibTeX

@article{cd80d735355944c5a093db98ee605160,
title = "Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method",
abstract = "This paper is devoted to the construction of iterative gradient methods for unconstrained convex optimization based on the explicit second-order non-standard Lagrange–Burmann Runge–Kutta method. The preconditioned gradient descent method and its modification with momentum are constructed. Convergence conditions for the case of a strongly convex quadratic function and its perturbation are obtained. Analytical expressions for optimal parameters, which guarantee the optimal convergence rate, are obtained. Theoretical analysis is supported by numerical examples from different applications (2D and 3D problems for Poisson equation, minimization of integral functional, problem for nonlinear integro-differential equation, regularized logistic regression for different datasets, minimization of the sum of quadratic function and smoothed Huber loss function). Methods are compared with well-known optimal methods for convex unconstrained optimization (gradient descent, Polyak heavy ball, Nesterov methods). As it is demonstrated for the considered examples, the optimal method with momentum, proposed in the presented paper, provides faster convergence and less computation time in comparison with other methods.",
keywords = "Convex optimization, Iterative methods, Lagrange–Burmann expansion, Runge–Kutta methods",
author = "Кривовичев, {Герасим Владимирович} and Кириллов, {Роман Борисович}",
year = "2026",
month = sep,
day = "1",
doi = "10.1016/j.cam.2026.117434",
language = "English",
volume = "483",
journal = "Journal of Computational and Applied Mathematics",
issn = "0377-0427",
publisher = "Elsevier",
}
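The abstract compares the proposed methods against standard baselines, including the Polyak heavy-ball method. As context, a minimal sketch of that baseline on a strongly convex quadratic follows; the matrix `A`, vector `b`, and iteration counts are illustrative choices, not values from the paper, and this is the classical heavy-ball scheme with its well-known optimal parameters for quadratics, not the Lagrange–Burmann-based method the article constructs (whose coefficients are not given in this record).

```python
import numpy as np

# Strongly convex quadratic f(x) = 0.5 x^T A x - b^T x; minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)

def heavy_ball(x0, alpha, beta, iters=200):
    """Polyak heavy ball: x_{k+1} = x_k - alpha * grad f(x_k) + beta * (x_k - x_{k-1})."""
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(iters):
        grad = A @ x - b
        x, x_prev = x - alpha * grad + beta * (x - x_prev), x
    return x

# Classical optimal parameters for quadratics, from the extreme eigenvalues mu <= L:
#   alpha = 4 / (sqrt(L) + sqrt(mu))^2,  beta = ((sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)))^2
mu, L = np.linalg.eigvalsh(A)[[0, -1]]
alpha = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))) ** 2

x = heavy_ball(np.zeros(2), alpha, beta)
print(np.allclose(x, x_star, atol=1e-8))  # converges to the minimizer
```

The momentum term `beta * (x_k - x_{k-1})` is what distinguishes this from plain gradient descent and gives the accelerated linear rate on quadratics.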

RIS

TY - JOUR

T1 - Gradient methods based on non-standard Lagrange–Burmann Runge–Kutta method

AU - Кривовичев, Герасим Владимирович

AU - Кириллов, Роман Борисович

PY - 2026/9/1

Y1 - 2026/9/1

N2 - This paper is devoted to the construction of iterative gradient methods for unconstrained convex optimization based on the explicit second-order non-standard Lagrange–Burmann Runge–Kutta method. The preconditioned gradient descent method and its modification with momentum are constructed. Convergence conditions for the case of a strongly convex quadratic function and its perturbation are obtained. Analytical expressions for optimal parameters, which guarantee the optimal convergence rate, are obtained. Theoretical analysis is supported by numerical examples from different applications (2D and 3D problems for Poisson equation, minimization of integral functional, problem for nonlinear integro-differential equation, regularized logistic regression for different datasets, minimization of the sum of quadratic function and smoothed Huber loss function). Methods are compared with well-known optimal methods for convex unconstrained optimization (gradient descent, Polyak heavy ball, Nesterov methods). As it is demonstrated for the considered examples, the optimal method with momentum, proposed in the presented paper, provides faster convergence and less computation time in comparison with other methods.

AB - This paper is devoted to the construction of iterative gradient methods for unconstrained convex optimization based on the explicit second-order non-standard Lagrange–Burmann Runge–Kutta method. The preconditioned gradient descent method and its modification with momentum are constructed. Convergence conditions for the case of a strongly convex quadratic function and its perturbation are obtained. Analytical expressions for optimal parameters, which guarantee the optimal convergence rate, are obtained. Theoretical analysis is supported by numerical examples from different applications (2D and 3D problems for Poisson equation, minimization of integral functional, problem for nonlinear integro-differential equation, regularized logistic regression for different datasets, minimization of the sum of quadratic function and smoothed Huber loss function). Methods are compared with well-known optimal methods for convex unconstrained optimization (gradient descent, Polyak heavy ball, Nesterov methods). As it is demonstrated for the considered examples, the optimal method with momentum, proposed in the presented paper, provides faster convergence and less computation time in comparison with other methods.

KW - Convex optimization

KW - Iterative methods

KW - Lagrange–Burmann expansion

KW - Runge–Kutta methods

UR - https://www.mendeley.com/catalogue/7dea730d-f775-3255-ba21-00689e8b1e53/

U2 - 10.1016/j.cam.2026.117434

DO - 10.1016/j.cam.2026.117434

M3 - Article

VL - 483

JO - Journal of Computational and Applied Mathematics

JF - Journal of Computational and Applied Mathematics

SN - 0377-0427

M1 - 117434

ER -
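The record's central idea — deriving a gradient method from an explicit second-order Runge–Kutta scheme — rests on viewing minimization as integration of the gradient-flow ODE x'(t) = -∇f(x(t)). A minimal sketch of that general technique is below, using the standard Heun (explicit RK2) scheme on a hypothetical quadratic; the paper's non-standard Lagrange–Burmann variant and its optimal parameters are not reproduced here, as they are not specified in this record.

```python
import numpy as np

# Hypothetical test problem: f(x) = 0.5 x^T A x - b^T x, gradient A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b

def rk2_gradient_descent(x0, h, iters=300):
    """One Heun (explicit RK2) step per iteration on the gradient flow x' = -grad f(x)."""
    x = x0.copy()
    for _ in range(iters):
        k1 = -grad(x)            # slope at the current point
        k2 = -grad(x + h * k1)   # slope after a trial Euler step
        x = x + 0.5 * h * (k1 + k2)
    return x

x = rk2_gradient_descent(np.zeros(2), h=0.3)
print(np.allclose(x, np.linalg.solve(A, b), atol=1e-8))
```

With two gradient evaluations per step, each iteration follows the flow more accurately than plain gradient descent (which is explicit Euler on the same ODE); the paper's contribution, per the abstract, is replacing the standard RK2 construction with a Lagrange–Burmann-based one and deriving optimal parameters and convergence conditions.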

ID: 149090301