In this paper, we present algorithms for the minimization of d.c. functions (differences of two convex functions) on the whole space $$\mathbb{R}^n$$. Many nonconvex optimization problems can be described using such functions. D.c. functions arise in various applications, especially in optimization, but characterizing them is not trivial, since these functions are in general neither differentiable nor convex. The class of d.c. functions is contained in the class of quasidifferentiable functions. The proposed algorithms are based on known necessary optimality conditions and on d.c. duality. Convergence to $$\inf$$-stationary points is established under fairly general natural assumptions.
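As a generic illustration of the d.c. setting (a minimal sketch of a standard DCA-style iteration, not the specific algorithms of the paper): to minimize $$f = g - h$$ with $$g, h$$ convex, each step linearizes $$h$$ at the current iterate and minimizes the resulting convex surrogate. The example below takes the assumed test function $$f(x) = x^4 - 2x^2$$ with the decomposition $$g(x) = x^4$$, $$h(x) = 2x^2$$, for which the surrogate minimization has a closed form.

```python
# DCA-style sketch for f = g - h with g(x) = x**4, h(x) = 2*x**2,
# i.e. f(x) = x**4 - 2*x**2 (stationary points at x = -1, 0, 1).
# Each step minimizes the convex surrogate g(x) - h'(x_k) * x;
# stationarity gives 4*x**3 = 4*x_k, i.e. x_{k+1} = cbrt(x_k).

def dca(x0, iters=50):
    x = x0
    for _ in range(iters):
        # closed-form solution of the convex subproblem (real cube root)
        x = x ** (1.0 / 3.0) if x >= 0 else -((-x) ** (1.0 / 3.0))
    return x

x_star = dca(0.5)  # iterates approach the stationary point x = 1
```

Starting from any nonzero point, the iterates converge monotonically to one of the stationary points $$x = \pm 1$$, illustrating the kind of $$\inf$$-stationarity guarantee discussed in the abstract.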

Original language: English
Title of host publication: Computational Science and Its Applications – ICCSA 2019
Subtitle of host publication: 19th International Conference, Saint Petersburg, Russia, July 1–4, 2019, Proceedings, Part IV
Editors: Sanjay Misra, et al.
Publisher: Springer Nature
Pages: 667–677
ISBN (Print): 9783030243043
DOIs
State: Published - 2019
Event: 19th International Conference on Computational Science and Its Applications, ICCSA 2019 - Saint Petersburg, Russian Federation
Duration: 1 Jul 2019 – 4 Jul 2019
Conference number: 19

Publication series

Name: LNCS
Volume: 11622
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 19th International Conference on Computational Science and Its Applications, ICCSA 2019
Abbreviated title: ICCSA 2019
Country/Territory: Russian Federation
City: Saint Petersburg
Period: 1/07/19 – 4/07/19

    Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

    Research areas

  • Convex function
  • Difference of convex functions (d.c. functions)
  • Quasidifferentiable functions
  • Subdifferential

ID: 43545567