Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review
Large-scale optimization plays an important role in many control and learning problems. Sequential subspace optimization is a novel approach particularly well suited to large-scale optimization problems. It is based on the sequential reduction of the initial optimization problem to optimization problems in a low-dimensional space. In this paper we consider the problem of optimizing a multidimensional convex real-valued function. Within the framework of sequential subspace optimization, we develop a new method based on a combination of quasi-Newton and conjugate gradient steps. We provide its formal justification and derive several of its theoretical properties. In particular, for the quadratic programming problem we prove linear convergence in a finite number of steps. We demonstrate the superiority of the proposed algorithm over common state-of-the-art methods through a comparative analysis on both modelled and real-world optimization problems.
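The abstract only outlines the approach, so the following Python sketch illustrates the general sequential subspace optimization idea under stated assumptions: the names `sesop`, `f`, `grad_f`, and the particular subspace (current gradient, previous step as conjugate-gradient-like memory, and a quasi-Newton-scaled gradient) are illustrative choices, not the paper's exact construction.

```python
# A minimal sketch of a sequential subspace optimization loop for a smooth convex
# objective f with gradient grad_f. This is an assumed, simplified variant, not the
# method proposed in the paper.
import numpy as np
from scipy.optimize import minimize

def sesop(f, grad_f, x0, n_iter=50, tol=1e-8):
    """Repeatedly minimize f over a low-dimensional affine subspace around the iterate."""
    x = np.asarray(x0, dtype=float)
    prev_step = None
    s_prev, y_prev = None, None        # most recent curvature pair (step, gradient change)

    for _ in range(n_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break

        # Subspace directions: steepest descent, previous step (conjugate-gradient-like
        # memory), and a quasi-Newton-scaled gradient built from the last curvature pair.
        dirs = [-g]
        if prev_step is not None:
            dirs.append(prev_step)
        if y_prev is not None and y_prev @ y_prev > 0:
            gamma = (s_prev @ y_prev) / (y_prev @ y_prev)   # Barzilai-Borwein-type scaling
            dirs.append(-gamma * g)
        D = np.column_stack(dirs)                            # n x k basis, k <= 3

        # Inner low-dimensional problem: minimize f restricted to {x + D @ alpha}.
        inner = lambda a: f(x + D @ a)
        inner_grad = lambda a: D.T @ grad_f(x + D @ a)
        res = minimize(inner, np.zeros(D.shape[1]), jac=inner_grad, method="BFGS")

        step = D @ res.x
        x_new = x + step
        s_prev, y_prev = step, grad_f(x_new) - g
        prev_step = step
        x = x_new
    return x

# Usage on a small convex quadratic: f(x) = 0.5 * x^T A x - b^T x
A = np.diag(np.arange(1.0, 6.0))
b = np.ones(5)
x_star = sesop(lambda x: 0.5 * x @ A @ x - b @ x, lambda x: A @ x - b, np.zeros(5))
```

The key design point, reflected in the inner call above, is that each iteration solves only a k-dimensional problem (here k ≤ 3) regardless of the ambient dimension n, which is what makes the approach attractive for large-scale problems.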
Original language | English |
---|---|
Title of host publication | American Control Conference, ACC 2020 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 3627-3632 |
Number of pages | 6 |
ISBN (Electronic) | 9781538682661 |
ISBN (Print) | 9781538682661 |
DOIs | |
State | Published - Jul 2020 |
Event | 2020 American Control Conference, ACC 2020 - Denver, United States; Duration: 1 Jul 2020 → 3 Jul 2020 |
Name | Proceedings of the American Control Conference |
---|---|
ISSN (Print) | 0743-1619 |
Conference | 2020 American Control Conference, ACC 2020 |
---|---|
Country/Territory | United States |
City | Denver |
Period | 1/07/20 → 3/07/20 |
ID: 62023361