Pedro Pérez-Aros (Universidad de O'Higgins, Rancagua)

In this presentation, we explore a new class of challenging nonsmooth optimization problems whose objective is the difference of two generally nonconvex functions. We introduce a novel Newton-type algorithm designed to solve such problems. The algorithm leverages advanced tools from variational analysis and is built on the coderivative-generated second-order subdifferential (generalized Hessian). We discuss the algorithm's well-posedness, establishing its viability under broad conditions, and analyze its convergence rates under assumptions such as the Kurdyka–Łojasiewicz condition. The talk ends with numerical experiments that illustrate the performance of our development.
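To give a rough flavor of the kind of iteration involved, the following is a minimal, hedged sketch (not the speaker's actual algorithm): a damped Newton-type method on a one-dimensional difference-of-functions objective phi = g - h, where both parts are chosen smooth so that an ordinary second derivative stands in for the coderivative-generated generalized Hessian of the talk. All function choices and parameters below are illustrative assumptions.

```python
# Hedged sketch, NOT the presented method: a regularized, damped
# Newton-type step for minimizing phi(x) = g(x) - h(x) in 1D,
# with smooth g, h standing in for the nonsmooth setting of the talk.

def g(x):  return 0.25 * x**4        # first component (convex, illustrative)
def h(x):  return 0.5 * x**2         # subtracted component (convex, illustrative)

def phi(x):   return g(x) - h(x)
def dphi(x):  return x**3 - x        # phi'(x)
def d2phi(x): return 3 * x**2 - 1    # phi''(x), proxy for the generalized Hessian

def newton_dc(x, tol=1e-10, max_iter=100):
    """Damped Newton iteration with Hessian regularization and
    Armijo backtracking, illustrating well-posedness safeguards."""
    for _ in range(max_iter):
        grad = dphi(x)
        if abs(grad) < tol:
            break
        # Regularize when the (generalized) Hessian is not positive
        hess = max(d2phi(x), 1e-3)
        step = grad / hess
        # Backtracking line search keeps each step a descent step
        t = 1.0
        while phi(x - t * step) > phi(x) - 1e-4 * t * grad * step:
            t *= 0.5
        x -= t * step
    return x

x_star = newton_dc(2.0)
print(x_star)  # approaches the stationary point x = 1
```

Here phi(x) = x^4/4 - x^2/2 has stationary points at 0 and ±1; started from x = 2, the descent iteration settles at x = 1. The regularization and line search mirror, in a toy form, the safeguards that make Newton-type schemes well-posed beyond the convex case.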