Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review
A Sequential Subspace Quasi-Newton Method for Large-Scale Convex Optimization. / Senov, Aleksandr; Granichin, Oleg; Granichina, Olga.
American Control Conference, ACC 2020. Institute of Electrical and Electronics Engineers Inc., 2020. p. 3627-3632, Article 9147989 (Proceedings of the American Control Conference).
TY - GEN
T1 - A Sequential Subspace Quasi-Newton Method for Large-Scale Convex Optimization
AU - Senov, Aleksandr
AU - Granichin, Oleg
AU - Granichina, Olga
PY - 2020/7
Y1 - 2020/7
N2 - Large-scale optimization plays an important role in many control and learning problems. Sequential subspace optimization is a novel approach particularly suited to large-scale optimization problems. It is based on the sequential reduction of the initial optimization problem to optimization problems in a low-dimensional space. In this paper we consider the problem of minimizing a multidimensional convex real-valued function. Within the framework of sequential subspace optimization, we develop a new method based on a combination of quasi-Newton and conjugate gradient steps. We provide its formal justification and derive several of its theoretical properties. In particular, for the quadratic programming problem we prove linear convergence in a finite number of steps. We demonstrate the superiority of the proposed algorithm over common state-of-the-art methods through a comparative analysis on both simulated and real-world optimization problems.
AB - Large-scale optimization plays an important role in many control and learning problems. Sequential subspace optimization is a novel approach particularly suited to large-scale optimization problems. It is based on the sequential reduction of the initial optimization problem to optimization problems in a low-dimensional space. In this paper we consider the problem of minimizing a multidimensional convex real-valued function. Within the framework of sequential subspace optimization, we develop a new method based on a combination of quasi-Newton and conjugate gradient steps. We provide its formal justification and derive several of its theoretical properties. In particular, for the quadratic programming problem we prove linear convergence in a finite number of steps. We demonstrate the superiority of the proposed algorithm over common state-of-the-art methods through a comparative analysis on both simulated and real-world optimization problems.
KW - Radio frequency
KW - Gradient methods
KW - minimization
KW - Convex functions
KW - Convergence
UR - http://www.scopus.com/inward/record.url?scp=85089600019&partnerID=8YFLogxK
UR - https://www.mendeley.com/catalogue/f8cd5650-69e2-3771-be93-236488256acb/
U2 - 10.23919/ACC45564.2020.9147989
DO - 10.23919/ACC45564.2020.9147989
M3 - Conference contribution
AN - SCOPUS:85089600019
SN - 9781538682661
T3 - Proceedings of the American Control Conference
SP - 3627
EP - 3632
BT - American Control Conference, ACC 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 American Control Conference, ACC 2020
Y2 - 1 July 2020 through 3 July 2020
ER -
ID: 62023361
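
As context for the abstract above: its core idea is to replace one large minimization with a sequence of low-dimensional subspace problems. The Python sketch below is a minimal illustration of that idea for a convex quadratic, not the authors' algorithm from the cited paper (which combines quasi-Newton and conjugate gradient steps); here the subspace is simply the span of the current gradient and the most recent update directions, and each reduced problem is solved exactly. All names and parameters are illustrative.

import numpy as np

def subspace_minimize_quadratic(A, b, x0, n_iter=100, memory=2, tol=1e-10):
    """Sequential subspace minimization of f(x) = 0.5 x^T A x - b^T x,
    with A symmetric positive definite. Each iteration restricts f to
    the span of the current gradient and the last `memory` steps and
    solves that low-dimensional quadratic exactly."""
    x = np.asarray(x0, dtype=float).copy()
    steps = []                                      # memory of previous update directions
    for _ in range(n_iter):
        g = A @ x - b                               # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        D = np.column_stack([g] + steps[-memory:])  # n x k subspace basis
        # Reduced problem: min_a  0.5 a^T (D^T A D) a + (D^T g)^T a,
        # a k-dimensional quadratic solved via its normal equations.
        H = D.T @ A @ D                             # k x k reduced Hessian
        a = np.linalg.lstsq(H, -D.T @ g, rcond=None)[0]
        step = D @ a
        x += step
        steps.append(step)
    return x

# Example: a 200-dimensional strongly convex quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((200, 200))
A = M @ M.T + np.eye(200)                           # symmetric positive definite
b = rng.standard_normal(200)
x = subspace_minimize_quadratic(A, b, np.zeros(200))
print(np.linalg.norm(A @ x - b))                    # near zero at the optimum

Because the previous step is kept in the subspace and each reduced problem is solved exactly, on a quadratic this procedure generates conjugate-gradient-like iterates, which is consistent with the finite-step convergence claim in the abstract.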