A Sequential Subspace Quasi-Newton Method for Large-Scale Convex Optimization

Aleksandr Senov, Oleg Granichin, Olga Granichina

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review


Large-scale optimization plays an important role in many control and learning problems. Sequential subspace optimization is a novel approach particularly suited to large-scale optimization problems: it sequentially reduces the initial optimization problem to optimization problems in a low-dimensional space. In this paper we consider the problem of minimizing a multidimensional convex real-valued function. Within the sequential subspace optimization framework we develop a new method based on a combination of quasi-Newton and conjugate gradient steps. We provide its formal justification and derive several of its theoretical properties. In particular, for the quadratic programming problem we prove linear convergence in a finite number of steps. We demonstrate the superiority of the proposed algorithm over common state-of-the-art methods through a comparative analysis on both simulated and real-world optimization problems.
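The core idea described in the abstract — repeatedly restricting a high-dimensional problem to a small subspace and solving the reduced problem there — can be sketched for the quadratic case. The snippet below is a minimal illustrative sketch of sequential subspace optimization, not the authors' SSQN method: it spans each subspace with just the current gradient and the previous step, and minimizes the quadratic exactly over that subspace.

```python
import numpy as np

def subspace_minimize(A, b, x0, iters=50, tol=1e-10):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by exact minimization over low-dimensional subspaces.

    Illustrative sketch only: the subspace here is spanned by the current
    gradient and the previous step, and the reduced problem is the small
    linear system (D^T A D) a = -D^T g.
    """
    x = x0.astype(float)
    prev_step = None
    for _ in range(iters):
        g = A @ x - b                              # gradient of f at x
        if np.linalg.norm(g) < tol:
            break
        # Low-dimensional subspace: gradient plus previous step (if any).
        dirs = [g] if prev_step is None else [g, prev_step]
        D = np.column_stack(dirs)
        # Reduced problem: minimize f(x + D a) over a.
        H = D.T @ A @ D
        a = np.linalg.lstsq(H, -D.T @ g, rcond=None)[0]
        step = D @ a
        x = x + step
        prev_step = step
    return x
```

For a quadratic objective this particular two-direction subspace recovers conjugate-gradient behavior, so the iterate reaches the exact minimizer in at most n steps, consistent with the finite-step convergence the paper proves for the quadratic programming case.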

Original language: English
Title of host publication: American Control Conference, ACC 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781538682661
ISBN (Print): 9781538682661
State: Published - Jul 2020
Event: 2020 American Control Conference, ACC 2020 - Denver, United States
Duration: 1 Jul 2020 - 3 Jul 2020

Publication series

Name: Proceedings of the American Control Conference
ISSN (Print): 0743-1619


Conference: 2020 American Control Conference, ACC 2020
Country: United States

Scopus subject areas

  • Electrical and Electronic Engineering


  • Radio frequency
  • Gradient methods
  • Minimization
  • Convex functions
  • Convergence

