### Abstract

| Original language | English |
|---|---|
| Title of host publication | 10th International Workshop on Simulation and Statistics |
| Subtitle of host publication | Workshop booklet |
| Place of Publication | Salzburg |
| Publisher | Universität Salzburg |
| Pages | 89 |
| Publication status | Published - Sep 2019 |
| Event | 10th International Workshop on Simulation and Statistics - Salzburg. Duration: 2 Sep 2019 → 6 Sep 2019 |

### Conference

| Conference | 10th International Workshop on Simulation and Statistics |
|---|---|
| Country | Austria |
| City | Salzburg |
| Period | 2/09/19 → 6/09/19 |


### Scopus subject areas

- Mathematics (all)

### Cite this

*10th International Workshop on Simulation and Statistics: Workshop booklet* (pp. 89). Salzburg: Universität Salzburg.


*10th International Workshop on Simulation and Statistics: Workshop booklet.* Universität Salzburg, Salzburg, pp. 89, Salzburg, 2/09/19.

**Random Search Method with a “Memory” for Global Extremum of a Function.** / Vladimirova, Liudmila Vasilevna; Ermakov, Sergey Michaylovich.

Research output

TY - GEN

T1 - Random Search Method with a “Memory” for Global Extremum of a Function

AU - Vladimirova, Liudmila Vasilevna

AU - Ermakov, Sergey Michaylovich

PY - 2019/9

Y1 - 2019/9

N2 - The general scheme of stochastic global optimization methods can be represented as follows. In the region D of extremum search for the function f(X), N points X_j (j = 1, ..., N) are chosen randomly or quasi-randomly and N values f(X_j) are calculated. Of the N points, m points are stored, namely those where the f values are the largest (smallest). The set of these m points is called the zero generation. After this, an iterative Markov algorithm is executed: given the k-th generation of m_k points, the method specifies how to obtain the (k+1)-th generation of m_{k+1} points. The methods mentioned ensure that the sequence of generations converges with probability 1 to the global extremum point. Our report discusses one method of this kind, proposed by the authors in 1977. The idea of the proposed method is to construct a normal density on the basis of the k-th generation points; the points of the next generation are sampled from this normal distribution. The number of points decreases as k grows. At the final stages it is advisable to use the gradient method. Random extremum search with a covariance matrix (search with “memory”) is convenient for solving problems of charged beam dynamics optimization. Such problems are dedicated to minimization of a quality functional over control parameters.

AB - The general scheme of stochastic global optimization methods can be represented as follows. In the region D of extremum search for the function f(X), N points X_j (j = 1, ..., N) are chosen randomly or quasi-randomly and N values f(X_j) are calculated. Of the N points, m points are stored, namely those where the f values are the largest (smallest). The set of these m points is called the zero generation. After this, an iterative Markov algorithm is executed: given the k-th generation of m_k points, the method specifies how to obtain the (k+1)-th generation of m_{k+1} points. The methods mentioned ensure that the sequence of generations converges with probability 1 to the global extremum point. Our report discusses one method of this kind, proposed by the authors in 1977. The idea of the proposed method is to construct a normal density on the basis of the k-th generation points; the points of the next generation are sampled from this normal distribution. The number of points decreases as k grows. At the final stages it is advisable to use the gradient method. Random extremum search with a covariance matrix (search with “memory”) is convenient for solving problems of charged beam dynamics optimization. Such problems are dedicated to minimization of a quality functional over control parameters.

M3 - Conference contribution

SP - 89

BT - 10th International Workshop on Simulation and Statistics

PB - Universität Salzburg

CY - Salzburg

ER -
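The generation scheme described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' 1977 implementation: the population sizes, shrink factor, covariance jitter, and test function are all assumptions chosen for the sketch, and the final gradient-method refinement mentioned in the abstract is omitted.

```python
import numpy as np

def random_search_with_memory(f, bounds, n0=200, m=20, generations=12,
                              shrink=0.85, rng=None):
    """Minimize f over a box by the generation scheme from the abstract.

    bounds: sequence of (low, high) pairs, one per coordinate.
    Hyperparameters (n0, m, generations, shrink) are illustrative choices.
    """
    rng = np.random.default_rng(rng)
    bounds = np.asarray(bounds, dtype=float)
    d = bounds.shape[0]

    # Zero generation: N points chosen uniformly at random in the box;
    # keep the m points with the smallest f values.
    pts = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n0, d))
    vals = np.apply_along_axis(f, 1, pts)
    best = pts[np.argsort(vals)[:m]]

    n = n0
    for _ in range(generations):
        # "Memory": fit a normal density to the retained points
        # (their sample mean and covariance matrix).
        mu = best.mean(axis=0)
        cov = np.cov(best, rowvar=False) + 1e-9 * np.eye(d)

        # Sample the next generation from that normal distribution,
        # clipped back into the box; the sample size shrinks with k.
        n = max(m + 1, int(n * shrink))
        pts = rng.multivariate_normal(mu, cov, size=n)
        pts = np.clip(pts, bounds[:, 0], bounds[:, 1])
        vals = np.apply_along_axis(f, 1, pts)
        best = pts[np.argsort(vals)[:m]]

    vals = np.apply_along_axis(f, 1, best)
    i = np.argmin(vals)
    return best[i], vals[i]
```

For a simple quadratic bowl, e.g. `f(p) = (p[0] - 1)**2 + (p[1] + 2)**2` on the box `[-5, 5]^2`, the retained generations concentrate around the minimizer `(1, -2)` as the fitted covariance contracts.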