Abstract

In recent applications, first-order optimization methods are often applied in a non-stationary setting where the minimum point drifts over time, giving rise to a so-called parameter tracking, or non-stationary optimization (NSO), problem. In this paper, we propose a new method for NSO derived from Nesterov's Fast Gradient. We derive theoretical bounds on the expected estimation error. We illustrate our results with simulations showing that, under deterministic drift, the proposed method gives more accurate estimates of the minimum points than the unmodified Fast Gradient or Stochastic Gradient, while under a purely random-walk drift all methods behave similarly. The proposed method can be used to train convolutional neural networks to obtain super-resolution of digital surface models.
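To make the tracking setting concrete, the sketch below applies a generic Nesterov-style accelerated gradient update to a toy drifting quadratic objective with noisy gradients. This is an illustrative assumption-laden example, not the paper's algorithm: the objective f_t(x) = 0.5 * ||x - theta_t||^2, the deterministic drift rate, and the step-size and momentum values are all invented for the sketch.

```python
import numpy as np

def track_drifting_minimum(steps=200, alpha=0.5, beta=0.9, drift=0.01):
    """Toy NSO tracking: Nesterov-style updates chasing a drifting minimum.

    Hypothetical setup: f_t(x) = 0.5 * ||x - theta_t||^2, where theta_t
    drifts deterministically and only noisy gradients are observed.
    """
    rng = np.random.default_rng(0)
    theta = np.zeros(2)      # drifting minimum point (unknown to the method)
    x = np.zeros(2)          # current estimate
    x_prev = np.zeros(2)
    errors = []
    for _ in range(steps):
        theta = theta + drift                               # deterministic drift
        y = x + beta * (x - x_prev)                         # Nesterov look-ahead
        grad = (y - theta) + 0.01 * rng.standard_normal(2)  # noisy gradient at y
        x_prev, x = x, y - alpha * grad                     # accelerated step
        errors.append(float(np.linalg.norm(x - theta)))
    return errors

errors = track_drifting_minimum()
```

With these (assumed) parameter values the iterates remain stable and the tracking error settles to a small steady-state lag rather than converging to zero, which is the qualitative behavior one expects when the minimum keeps moving.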

Original language: English
Title of host publication: 2019 American Control Conference, ACC 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1476-1481
ISBN (Print): 9781538679265
Publication status: Published - 1 Jul 2019
Event: 2019 American Control Conference, ACC 2019 - Philadelphia
Duration: 10 Jul 2019 - 12 Jul 2019

Publication series

Name: Proceedings of the American Control Conference
Volume: 2019-July
ISSN (Print): 0743-1619

Conference

Conference: 2019 American Control Conference, ACC 2019
Country: United States
City: Philadelphia
Period: 10/07/19 - 12/07/19

Scopus subject areas

  • Electrical and Electronic Engineering


  • Cite this

Kosaty, D., Vakhitov, A., Granichin, O., & Yuchi, M. (2019). Stochastic fast gradient for tracking. In 2019 American Control Conference, ACC 2019 (pp. 1476-1481). [8815070] (Proceedings of the American Control Conference; Vol. 2019-July). Institute of Electrical and Electronics Engineers Inc.