Standard

Stochastic fast gradient for tracking. / Kosaty, Dmitry; Vakhitov, Alexander; Granichin, Oleg; Yuchi, Ming.

2019 American Control Conference, ACC 2019. Institute of Electrical and Electronics Engineers Inc., 2019. pp. 1476-1481 8815070 (Proceedings of the American Control Conference; Vol. 2019-July).

Research output: Publications in books, reports, collections, conference proceedings › Conference contribution › Peer-reviewed

Harvard

Kosaty, D, Vakhitov, A, Granichin, O & Yuchi, M 2019, Stochastic fast gradient for tracking. in 2019 American Control Conference, ACC 2019., 8815070, Proceedings of the American Control Conference, Vol. 2019-July, Institute of Electrical and Electronics Engineers Inc., pp. 1476-1481, 2019 American Control Conference, ACC 2019, Philadelphia, United States, 10/07/19.

APA

Kosaty, D., Vakhitov, A., Granichin, O., & Yuchi, M. (2019). Stochastic fast gradient for tracking. In 2019 American Control Conference, ACC 2019 (pp. 1476-1481). [8815070] (Proceedings of the American Control Conference; Vol. 2019-July). Institute of Electrical and Electronics Engineers Inc.

Vancouver

Kosaty D, Vakhitov A, Granichin O, Yuchi M. Stochastic fast gradient for tracking. In 2019 American Control Conference, ACC 2019. Institute of Electrical and Electronics Engineers Inc. 2019. p. 1476-1481. 8815070. (Proceedings of the American Control Conference).

Author

Kosaty, Dmitry ; Vakhitov, Alexander ; Granichin, Oleg ; Yuchi, Ming. / Stochastic fast gradient for tracking. 2019 American Control Conference, ACC 2019. Institute of Electrical and Electronics Engineers Inc., 2019. pp. 1476-1481 (Proceedings of the American Control Conference).

BibTeX

@inproceedings{adb98274795a4cb99618240052ae92d3,
title = "Stochastic fast gradient for tracking",
abstract = "In recent applications, first-order optimization methods are often applied in the non-stationary setting when the minimum point is drifting in time, addressing a so-called parameter tracking, or non-stationary optimization (NSO) problem. In this paper, we propose a new method for NSO derived from Nesterov's Fast Gradient. We derive theoretical bounds on the expected estimation error. We illustrate our results with simulation showing that the proposed method gives more accurate estimates of the minimum points than the unmodified Fast Gradient or Stochastic Gradient in case of deterministic drift while in purely random walk all methods behave similarly. The proposed method can be used to train convolutional neural networks to obtain super-resolution of digital surface models.",
author = "Dmitry Kosaty and Alexander Vakhitov and Oleg Granichin and Ming Yuchi",
year = "2019",
month = jul,
day = "1",
language = "English",
isbn = "9781538679265",
series = "Proceedings of the American Control Conference",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
pages = "1476--1481",
booktitle = "2019 American Control Conference, ACC 2019",
address = "United States",
note = "2019 American Control Conference, ACC 2019 ; Conference date: 10-07-2019 Through 12-07-2019",

}

RIS

TY - GEN

T1 - Stochastic fast gradient for tracking

AU - Kosaty, Dmitry

AU - Vakhitov, Alexander

AU - Granichin, Oleg

AU - Yuchi, Ming

PY - 2019/7/1

Y1 - 2019/7/1

N2 - In recent applications, first-order optimization methods are often applied in the non-stationary setting when the minimum point is drifting in time, addressing a so-called parameter tracking, or non-stationary optimization (NSO) problem. In this paper, we propose a new method for NSO derived from Nesterov's Fast Gradient. We derive theoretical bounds on the expected estimation error. We illustrate our results with simulation showing that the proposed method gives more accurate estimates of the minimum points than the unmodified Fast Gradient or Stochastic Gradient in case of deterministic drift while in purely random walk all methods behave similarly. The proposed method can be used to train convolutional neural networks to obtain super-resolution of digital surface models.

AB - In recent applications, first-order optimization methods are often applied in the non-stationary setting when the minimum point is drifting in time, addressing a so-called parameter tracking, or non-stationary optimization (NSO) problem. In this paper, we propose a new method for NSO derived from Nesterov's Fast Gradient. We derive theoretical bounds on the expected estimation error. We illustrate our results with simulation showing that the proposed method gives more accurate estimates of the minimum points than the unmodified Fast Gradient or Stochastic Gradient in case of deterministic drift while in purely random walk all methods behave similarly. The proposed method can be used to train convolutional neural networks to obtain super-resolution of digital surface models.

UR - http://www.scopus.com/inward/record.url?scp=85072273133&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85072273133

SN - 9781538679265

T3 - Proceedings of the American Control Conference

SP - 1476

EP - 1481

BT - 2019 American Control Conference, ACC 2019

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2019 American Control Conference, ACC 2019

Y2 - 10 July 2019 through 12 July 2019

ER -

ID: 46583267
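
The abstract above describes a Nesterov-type accelerated stochastic gradient applied to a minimum point that drifts over time. The sketch below is a minimal illustration of that idea only, not the authors' algorithm or its theoretical guarantees: the quadratic loss, the deterministic drift model, the noise scale, and the step-size/momentum choices (alpha, beta) are all illustrative assumptions.

# Minimal sketch (illustrative, not the paper's exact method): Nesterov-style
# accelerated stochastic gradient steps tracking a drifting quadratic minimum.
import numpy as np

rng = np.random.default_rng(0)
dim, steps = 2, 500
L = 1.0                       # assumed gradient Lipschitz constant of the quadratic
alpha, beta = 1.0 / L, 0.9    # step size and momentum (illustrative choices)

x = np.zeros(dim)             # current estimate of the minimum point
y = np.zeros(dim)             # extrapolated (momentum) point
errors = []

for t in range(steps):
    theta = np.array([0.01 * t, np.sin(0.01 * t)])        # drifting minimum point (assumed drift)
    # noisy gradient of the quadratic 0.5 * ||y - theta||^2, evaluated at the extrapolated point
    grad = (y - theta) + 0.1 * rng.standard_normal(dim)
    x_next = y - alpha * grad                              # gradient step from the extrapolated point
    y = x_next + beta * (x_next - x)                       # Nesterov extrapolation
    x = x_next
    errors.append(np.linalg.norm(x - theta))               # tracking error at step t

print(f"mean tracking error over last 100 steps: {np.mean(errors[-100:]):.3f}")

Under a deterministic drift like the one assumed here, the accelerated update keeps the estimate closer to the moving minimum than a plain stochastic gradient step would with the same step size; under a purely random-walk drift the abstract reports that the methods behave similarly.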