Recent Developments in Inverse Problems - Abstract

Grasmair, Markus

Multiscale nonparametric regression

In this talk, we will study the application of the multiresolution norm to statistical inverse problems and, in particular, to the problem of nonparametric regression, and we will derive convergence rates with respect to the $L^2$-norm for a variational scheme based on the discrepancy principle. In contrast to deterministic inverse problems, where one usually has no knowledge about the noise apart from its size, which is assumed to be close to zero, we base our method on the assumption of independent and identically distributed noise, without imposing any restriction on its size. In such a setting, the convergence rates (or error estimates) one obtains do not depend on the noise level, but rather on the number of measurements. We will derive convergence rates by using a modification of the technique of approximate source conditions. In the particular case of one-dimensional nonparametric regression, the distance function needed in the formulation of these approximate source conditions can be estimated using approximation properties of $B$-splines, which then allows us to formulate rates in terms of the Besov (or Sobolev) regularity of the true solution. The rates obtained in this manner turn out to be optimal in a large variety of settings, even if the smoothness class of the true solution is not known a priori.
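As a rough illustration of the multiresolution norm referred to above, the following sketch evaluates a common variant of it on a residual vector: the maximum over a system of dyadic intervals $I$ of $|\sum_{i \in I} r_i| / \sqrt{|I|}$. The function name and the choice of a dyadic interval system are our own assumptions for illustration; the precise interval system used in the talk may differ.

```python
import numpy as np

def multiresolution_norm(r):
    """Multiresolution statistic of a residual vector r:
    max over dyadic intervals I of |sum_{i in I} r_i| / sqrt(|I|).
    (Illustrative variant; the talk's exact definition may differ.)"""
    n = len(r)
    # Prefix sums, so that each interval sum is a difference of two entries.
    csum = np.concatenate(([0.0], np.cumsum(r)))
    best = 0.0
    length = 1
    while length <= n:
        # Disjoint dyadic blocks of the current length.
        for start in range(0, n - length + 1, length):
            s = abs(csum[start + length] - csum[start]) / np.sqrt(length)
            best = max(best, s)
        length *= 2
    return best

# For i.i.d. noise this statistic stays small; a systematic residual on
# some interval makes it large, which is what a discrepancy-principle
# scheme can exploit to detect a poor fit at some scale.
print(multiresolution_norm(np.ones(4)))  # constant residual: 4/sqrt(4) = 2.0
```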