Research output: Publications in books, reports, collections, conference proceedings › Conference contribution › Peer-reviewed
Stochastic fast gradient for tracking. / Kosaty, Dmitry; Vakhitov, Alexander; Granichin, Oleg; Yuchi, Ming.
2019 American Control Conference, ACC 2019. Institute of Electrical and Electronics Engineers Inc., 2019. pp. 1476-1481, 8815070 (Proceedings of the American Control Conference; Vol. 2019-July).
TY - GEN
T1 - Stochastic fast gradient for tracking
AU - Kosaty, Dmitry
AU - Vakhitov, Alexander
AU - Granichin, Oleg
AU - Yuchi, Ming
PY - 2019/7/1
Y1 - 2019/7/1
N2 - In recent applications, first-order optimization methods are often applied in the non-stationary setting, where the minimum point drifts in time, addressing a so-called parameter-tracking, or non-stationary optimization (NSO), problem. In this paper, we propose a new method for NSO derived from Nesterov's Fast Gradient. We derive theoretical bounds on the expected estimation error. We illustrate our results with simulations showing that the proposed method gives more accurate estimates of the minimum points than the unmodified Fast Gradient or Stochastic Gradient in the case of deterministic drift, while under a purely random-walk drift all methods behave similarly. The proposed method can be used to train convolutional neural networks to obtain super-resolution of digital surface models.
AB - In recent applications, first-order optimization methods are often applied in the non-stationary setting, where the minimum point drifts in time, addressing a so-called parameter-tracking, or non-stationary optimization (NSO), problem. In this paper, we propose a new method for NSO derived from Nesterov's Fast Gradient. We derive theoretical bounds on the expected estimation error. We illustrate our results with simulations showing that the proposed method gives more accurate estimates of the minimum points than the unmodified Fast Gradient or Stochastic Gradient in the case of deterministic drift, while under a purely random-walk drift all methods behave similarly. The proposed method can be used to train convolutional neural networks to obtain super-resolution of digital surface models.
UR - http://www.scopus.com/inward/record.url?scp=85072273133&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85072273133
SN - 9781538679265
T3 - Proceedings of the American Control Conference
SP - 1476
EP - 1481
BT - 2019 American Control Conference, ACC 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 American Control Conference, ACC 2019
Y2 - 10 July 2019 through 12 July 2019
ER -
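The abstract describes tracking a drifting minimum with a Nesterov-style momentum method driven by noisy gradients. The sketch below is a minimal illustration of that general idea, not the paper's algorithm or its parameter choices: it tracks the deterministically drifting minimizer of a quadratic f_t(x) = ½‖x − θ_t‖² using a constant-step Nesterov update with noisy gradients. The step size `alpha`, momentum `beta`, noise level `sigma`, and drift trajectory are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x, theta, sigma=0.1):
    # Gradient of f(x) = 0.5*||x - theta||^2, corrupted by observation noise.
    return (x - theta) + sigma * rng.standard_normal(x.shape)

# Hypothetical constants (not taken from the paper):
alpha, beta = 0.1, 0.5   # constant step size and momentum coefficient
T, d = 500, 2            # horizon and dimension

x = np.zeros(d)          # current estimate
y = x.copy()             # Nesterov "lookahead" point
errors = []
for t in range(T):
    # Deterministically drifting minimum point theta_t.
    theta = np.array([0.01 * t, np.sin(0.01 * t)])
    g = noisy_grad(y, theta)
    x_new = y - alpha * g            # gradient step from the lookahead point
    y = x_new + beta * (x_new - x)   # momentum extrapolation
    x = x_new
    errors.append(np.linalg.norm(x - theta))

print(f"mean tracking error over last 100 steps: {np.mean(errors[-100:]):.3f}")
```

With a constant (non-vanishing) step size the estimate keeps following the drift instead of converging to a fixed point, which is the essential difference between tracking and stationary optimization; the steady-state error reflects both the drift rate and the gradient noise.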