Recent Developments in Inverse Problems - Abstract
Kindermann, Stefan
We consider Tikhonov regularization in Banach spaces with a linear operator $A$, a convex regularization term $J$, and a convex fidelity term $\varphi$,
\[ T(x) := \varphi(A x - y^\delta) + \alpha J(x). \]
Minimizers $x_{\alpha,\delta}$ of this functional are regularized solutions, and we study rates in the Bregman distance for the convergence of $x_{\alpha,\delta}$ to a true solution $x^\dagger$. For convergence rates, it is necessary that $x^\dagger$ satisfies some smoothness condition. In a general context, such conditions are nowadays formulated in terms of variational inequalities. For example, convergence rates have been proved if the following condition (“variational inequality”) holds with some model function $\Phi$ and some $\beta > 0$,
\[ \beta B(x^\dagger; x) \leq J(x) - J(x^\dagger) + \Phi(A(x^\dagger - x)), \]
for all $x$ sufficiently close to $x^\dagger$, where $B$ is the Bregman distance.
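For reference, we recall the standard definition of the Bregman distance used here (the abstract does not state it explicitly; the subgradient $\xi$ is assumed fixed): for convex $J$ and $\xi \in \partial J(x^\dagger)$,
\[ B_\xi(x^\dagger; x) := J(x) - J(x^\dagger) - \langle \xi, x - x^\dagger \rangle. \]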
The aim of this talk is to report results on convergence rates under a weaker condition, namely assuming that a variational inequality with $\beta = 0$ holds. In this case we can prove the same convergence rates as those established in the literature.
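Spelled out for concreteness (a direct rearrangement of the inequality above, not an additional assumption): with $\beta = 0$ the variational inequality reduces to
\[ J(x^\dagger) \leq J(x) + \Phi(A(x^\dagger - x)) \]
for all $x$ sufficiently close to $x^\dagger$, so no Bregman-distance term is required on the left-hand side.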