Abstract

The general scheme of stochastic global optimization methods can be represented as follows. In the region D of extremum search for the function f(X), N points X_j (j = 1, ..., N) are chosen randomly or quasi-randomly and N values f(X_j) are calculated. Of the N points, m points are stored, where the f values are the largest (smallest). The set of these m points is called the zero generation. After this, the iterative Markov algorithm is executed. If the k-th generation of m_k points is determined, the method specifies how to obtain the (k+1)-th generation of m_{k+1} points. The methods mentioned ensure that the sequence of generations converges with probability 1 to the global extremum point. Our report discusses one of the methods of this kind, proposed by the authors in 1977. The idea of the proposed method is to construct a normal density on the basis of the k-th generation points. The points of the next generation are sampled from this normal distribution. The number of points decreases as k grows. At the final stages it is advisable to use the gradient method. Random extremum search with a covariance matrix (search with “memory”) is convenient for solving problems of charged beam dynamics optimization. Such problems are dedicated to minimization of a quality functional over the control parameters.
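The generation scheme described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' 1977 implementation: the function name, the geometric shrink factor for the sample size, and the small diagonal regularization of the covariance are assumptions made for the sketch.

```python
import numpy as np

def memory_random_search(f, bounds, n_init=200, m=40, n_gens=30, shrink=0.9, seed=0):
    """Random search with 'memory' (sketch): each generation fits a normal
    density to the m best points and samples the next generation from it."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    # Generation 0: N points chosen uniformly at random in the region D.
    pts = rng.uniform(lo, hi, size=(n_init, lo.size))
    n = n_init
    for _ in range(n_gens):
        vals = np.apply_along_axis(f, 1, pts)
        best = pts[np.argsort(vals)[:m]]           # keep m points with smallest f
        mean = best.mean(axis=0)
        # Covariance of the best points; tiny jitter keeps it positive definite
        # (an assumption of this sketch, not part of the described method).
        cov = np.cov(best, rowvar=False) + 1e-9 * np.eye(lo.size)
        n = max(int(n * shrink), m)                # number of points decreases with k
        pts = rng.multivariate_normal(mean, cov, size=n)
        pts = np.clip(pts, lo, hi)                 # stay inside the search region D
    vals = np.apply_along_axis(f, 1, pts)
    return pts[np.argmin(vals)]
```

As the abstract notes, on the final stages one would hand the best point found here to a gradient method for local refinement.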
Original language: English
Title of host publication: 10th International Workshop on Simulation and Statistics
Subtitle of host publication: Workshop booklet
Place of publication: Salzburg
Publisher: Universitat Salzburg
Pages: 89
Publication status: Published - Sep 2019
Event: 10th International Workshop on Simulation and Statistics, Salzburg
Duration: 2 Sep 2019 – 6 Sep 2019

Conference

Conference: 10th International Workshop on Simulation and Statistics
Country: Austria
City: Salzburg
Period: 2/09/19 – 6/09/19


Scopus subject areas

  • Mathematics (all)

Cite this

Vladimirova, L. V., & Ermakov, S. M. (2019). Random Search Method with a “Memory” for Global Extremum of a Function. In 10th International Workshop on Simulation and Statistics: Workshop booklet (pp. 89). Salzburg: Universitat Salzburg.
Vladimirova, Liudmila Vasilevna ; Ermakov, Sergey Michaylovich. / Random Search Method with a “Memory” for Global Extremum of a Function. 10th International Workshop on Simulation and Statistics: Workshop booklet. Salzburg : Universitat Salzburg, 2019. pp. 89
@inproceedings{dbaf0903c5124f6d8644724c452291a3,
title = "Random Search Method with a “Memory” for Global Extremum of a Function",
abstract = "The general scheme of stochastic global optimization methods can be represented as follows. In the region D of extremum search for the function f(X), N points X_j (j = 1, ..., N) are chosen randomly or quasi-randomly and N values f(X_j) are calculated. Of the N points, m points are stored, where the f values are the largest (smallest). The set of these m points is called the zero generation. After this, the iterative Markov algorithm is executed. If the k-th generation of m_k points is determined, the method specifies how to obtain the (k+1)-th generation of m_{k+1} points. The methods mentioned ensure that the sequence of generations converges with probability 1 to the global extremum point. Our report discusses one of the methods of this kind, proposed by the authors in 1977. The idea of the proposed method is to construct a normal density on the basis of the k-th generation points. The points of the next generation are sampled from this normal distribution. The number of points decreases as k grows. At the final stages it is advisable to use the gradient method. Random extremum search with a covariance matrix (search with “memory”) is convenient for solving problems of charged beam dynamics optimization. Such problems are dedicated to minimization of a quality functional over the control parameters.",
author = "Vladimirova, {Liudmila Vasilevna} and Ermakov, {Sergey Michaylovich}",
year = "2019",
month = "9",
language = "English",
pages = "89",
booktitle = "10th International Workshop on Simulation and Statistics",
publisher = "Universitat Salzburg",
address = "Austria",

}

Vladimirova, LV & Ermakov, SM 2019, Random Search Method with a “Memory” for Global Extremum of a Function. in 10th International Workshop on Simulation and Statistics: Workshop booklet. Universitat Salzburg, Salzburg, pp. 89, Salzburg, 2/09/19.

Random Search Method with a “Memory” for Global Extremum of a Function. / Vladimirova, Liudmila Vasilevna; Ermakov, Sergey Michaylovich.

10th International Workshop on Simulation and Statistics: Workshop booklet. Salzburg : Universitat Salzburg, 2019. p. 89.


TY - GEN

T1 - Random Search Method with a “Memory” for Global Extremum of a Function

AU - Vladimirova, Liudmila Vasilevna

AU - Ermakov, Sergey Michaylovich

PY - 2019/9

Y1 - 2019/9

N2 - The general scheme of stochastic global optimization methods can be represented as follows. In the region D of extremum search for the function f(X), N points X_j (j = 1, ..., N) are chosen randomly or quasi-randomly and N values f(X_j) are calculated. Of the N points, m points are stored, where the f values are the largest (smallest). The set of these m points is called the zero generation. After this, the iterative Markov algorithm is executed. If the k-th generation of m_k points is determined, the method specifies how to obtain the (k+1)-th generation of m_{k+1} points. The methods mentioned ensure that the sequence of generations converges with probability 1 to the global extremum point. Our report discusses one of the methods of this kind, proposed by the authors in 1977. The idea of the proposed method is to construct a normal density on the basis of the k-th generation points. The points of the next generation are sampled from this normal distribution. The number of points decreases as k grows. At the final stages it is advisable to use the gradient method. Random extremum search with a covariance matrix (search with “memory”) is convenient for solving problems of charged beam dynamics optimization. Such problems are dedicated to minimization of a quality functional over the control parameters.

AB - The general scheme of stochastic global optimization methods can be represented as follows. In the region D of extremum search for the function f(X), N points X_j (j = 1, ..., N) are chosen randomly or quasi-randomly and N values f(X_j) are calculated. Of the N points, m points are stored, where the f values are the largest (smallest). The set of these m points is called the zero generation. After this, the iterative Markov algorithm is executed. If the k-th generation of m_k points is determined, the method specifies how to obtain the (k+1)-th generation of m_{k+1} points. The methods mentioned ensure that the sequence of generations converges with probability 1 to the global extremum point. Our report discusses one of the methods of this kind, proposed by the authors in 1977. The idea of the proposed method is to construct a normal density on the basis of the k-th generation points. The points of the next generation are sampled from this normal distribution. The number of points decreases as k grows. At the final stages it is advisable to use the gradient method. Random extremum search with a covariance matrix (search with “memory”) is convenient for solving problems of charged beam dynamics optimization. Such problems are dedicated to minimization of a quality functional over the control parameters.

M3 - Conference contribution

SP - 89

BT - 10th International Workshop on Simulation and Statistics

PB - Universitat Salzburg

CY - Salzburg

ER -

Vladimirova LV, Ermakov SM. Random Search Method with a “Memory” for Global Extremum of a Function. In 10th International Workshop on Simulation and Statistics: Workshop booklet. Salzburg: Universitat Salzburg. 2019. p. 89