Large-scale optimization plays an important role in many control and learning problems. Sequential subspace optimization is a novel approach particularly well suited to large-scale optimization problems: it sequentially reduces the initial optimization problem to optimization problems in a low-dimensional space. In this paper we consider the problem of optimizing a multidimensional convex real-valued function. Within the framework of sequential subspace optimization, we develop a new method based on a combination of quasi-Newton and conjugate gradient steps. We provide its formal justification and derive several of its theoretical properties. In particular, for the quadratic programming problem we prove convergence in a finite number of steps. We demonstrate the superiority of the proposed algorithm over common state-of-the-art methods through a comparative analysis on both modelled and real-world optimization problems.
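As a rough illustration of the subspace idea (a minimal sketch, not the paper's exact algorithm), the Python snippet below minimizes a convex quadratic f(x) = 0.5 xᵀAx − bᵀx by repeatedly solving the problem restricted to a low-dimensional subspace spanned by the current gradient and the previous step; the function name sesop_quadratic and all parameter choices here are hypothetical.

```python
import numpy as np

def sesop_quadratic(A, b, x0, iters=100, tol=1e-10):
    """Subspace-descent sketch for f(x) = 0.5 x^T A x - b^T x, A positive definite."""
    x = np.asarray(x0, dtype=float).copy()
    prev_step = None
    for _ in range(iters):
        g = A @ x - b                                # gradient of f at x
        if np.linalg.norm(g) < tol:
            break
        # Basis of the low-dimensional search subspace: the current gradient,
        # plus the previous step once one exists.
        cols = [g] if prev_step is None else [g, prev_step]
        P, _ = np.linalg.qr(np.column_stack(cols))   # orthonormalize the basis
        # Reduced problem in subspace coordinates a:
        #   minimize 0.5 a^T (P^T A P) a + (P^T g)^T a,
        # whose minimizer solves (P^T A P) a = -P^T g.
        a = np.linalg.solve(P.T @ A @ P, -(P.T @ g))
        step = P @ a
        x += step
        prev_step = step
    return x

# Tiny usage example on a random positive-definite system.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x = sesop_quadratic(A, b, np.zeros(20))
print(np.linalg.norm(A @ x - b))  # ~0: x solves Ax = b
```

For quadratics, minimizing over the span of the current gradient and the previous step reproduces the conjugate gradient iterates, which is consistent with the finite-step convergence claim; richer subspaces (e.g., adding a quasi-Newton direction, as the paper's method does) plug into the same loop.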

Original language: English
Title of host publication: American Control Conference, ACC 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 3627-3632
Number of pages: 6
ISBN (Electronic): 9781538682661
ISBN (Print): 9781538682661
DOIs
State: Published - Jul 2020
Event: 2020 American Control Conference, ACC 2020 - Denver, United States
Duration: 1 Jul 2020 - 3 Jul 2020

Publication series

Name: Proceedings of the American Control Conference
ISSN (Print): 0743-1619

Conference

Conference: 2020 American Control Conference, ACC 2020
Country/Territory: United States
City: Denver
Period: 1/07/20 - 3/07/20

Scopus subject areas

• Electrical and Electronic Engineering

Research areas

• Radio frequency, Gradient methods, minimization, Convex functions, Convergence
