
Large-scale optimization plays an important role in many control and learning problems. Sequential subspace optimization is a novel approach particularly well suited to large-scale optimization problems: it sequentially reduces the initial optimization problem to optimization problems in a low-dimensional space. In this paper we consider the problem of optimizing a multidimensional convex real-valued function. Within the sequential subspace optimization framework, we develop a new method based on a combination of quasi-Newton and conjugate gradient steps. We provide its formal justification and derive several of its theoretical properties; in particular, for the quadratic programming problem we prove linear convergence in a finite number of steps. We demonstrate the superiority of the proposed algorithm over common state-of-the-art methods through a comparative analysis on both modelled and real-world optimization problems.
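The abstract does not give the authors' algorithm in detail, but the general sequential subspace idea it builds on can be illustrated on a quadratic objective. The sketch below is a generic, hypothetical illustration (not the paper's method): at each iteration it minimizes f(x) = ½xᵀAx − bᵀx exactly over a two-dimensional subspace spanned by the current gradient and the previous step, which for quadratics reproduces conjugate-gradient-like finite convergence.

```python
import numpy as np

def sesop_quadratic(A, b, x0, iters=50):
    """Illustrative subspace scheme for f(x) = 0.5 x^T A x - b^T x:
    at each step, minimize f exactly over the subspace spanned by
    the current gradient and the previous step (NOT the authors'
    exact algorithm, just the generic sequential subspace idea)."""
    x = x0.astype(float)
    prev_step = None
    for _ in range(iters):
        g = A @ x - b                          # gradient of the quadratic
        dirs = [g] if prev_step is None else [g, prev_step]
        D = np.column_stack(dirs)              # subspace basis, n x k
        # Reduced k x k problem: minimize f(x + D @ alpha) over alpha.
        H = D.T @ A @ D
        rhs = -D.T @ g
        # lstsq tolerates the singular case at convergence (g = 0).
        alpha = np.linalg.lstsq(H, rhs, rcond=None)[0]
        step = D @ alpha
        x = x + step
        prev_step = step
    return x
```

For a strictly convex quadratic, minimizing over {gradient, previous step} at every iteration is the classical memory-gradient construction and reaches the exact minimizer in at most n iterations, which is consistent with the finite-convergence claim stated in the abstract.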

Original language: English
Title of host publication: American Control Conference, ACC 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3627-3632
Number of pages: 6
ISBN (electronic): 9781538682661
ISBN (print): 9781538682661
State: Published - Jul 2020
Event: 2020 American Control Conference, ACC 2020 - Denver, United States
Duration: 1 Jul 2020 - 3 Jul 2020

Publication series

Name: Proceedings of the American Control Conference
ISSN (print): 0743-1619

Conference

Conference: 2020 American Control Conference, ACC 2020
Country/Territory: United States
City: Denver
Period: 1/07/20 - 3/07/20

Scopus subject areas

  • Electrical and Electronic Engineering

ID: 62023361