Research Group "Stochastic Algorithms and Nonparametric Statistics"

Research Seminar "Mathematical Statistics" SS 2021

14.04.2021 N.N.

21.04.2021 N.N.

28.04.2021 N.N.

05.05.2021 N.N.

12.05.2021 N.N.

19.05.2021 Hannes Leeb (University of Vienna)
A (tight) upper bound for the length of confidence intervals with conditional coverage
We show that two popular selective inference procedures, namely data carving (Fithian et al., 2017) and selection with a randomized response (Tian et al., 2018b), when combined with the polyhedral method (Lee et al., 2016), result in confidence intervals whose length is bounded. This contrasts with results for confidence intervals based on the polyhedral method alone, whose expected length is typically infinite (Kivaranovic and Leeb, 2020). Moreover, we show that these two procedures always dominate corresponding sample-splitting methods in terms of interval length.
26.05.2021 Hans-Georg Müller (UC Davis)
exceptionally at 9 a.m.! Functional models for time-varying random objects
In recent years, samples of random objects and time-varying object data such as time-varying distributions or networks that are not in a vector space have become increasingly prevalent. Such data can be viewed as elements of a general metric space that lacks local or global linear structure. Common approaches that have been used with great success for the analysis of functional data, such as functional principal component analysis, are therefore not applicable. The concept of metric covariance makes it possible to define a metric auto-covariance function for a sample of random curves that take values in a general metric space, and this function can be shown to be non-negative definite when the squared semi-metric of the underlying space is of negative type. The eigenfunctions of the linear operator with the auto-covariance function as kernel can then be used as building blocks for an object functional principal component analysis, which includes real-valued Fréchet scores and metric-space-valued object functional principal components. Sample-based estimates of these quantities are shown to be asymptotically consistent and are illustrated with various data. (Joint work with Paromita Dubey, Stanford University.)
02.06.2021 N.N.

09.06.2021 N.N.

16.06.2021 Irène Gijbels (KU Leuven)
exceptionally at 12:30 p.m.! Extremiles and extremile regression
Quantiles and expectiles of a distribution are found to be useful descriptors of its tail in the same way as the median and mean are related to its central behavior. In this talk we discuss an alternative class to expectiles, called extremiles. The new class is motivated via several angles, which reveals its specific merits and strengths. Extremiles suggest better capability of fitting both location and spread in data points and provide an appropriate theory that better displays the interesting features of long-tailed distributions. We briefly discuss estimation of extremiles. A large part of the talk will be on regression extremiles, which thus define a least squares analogue of regression quantiles. We discuss estimation of conditional extremiles, in which we rely on local linear (least squares) check-function minimization. An asymptotic normality result for the estimators is established. Attention also goes to extending extremile regression far into the tails of heavy-tailed distributions. For this purpose extrapolated estimators are constructed and their asymptotic theory is developed. Applications to real data illustrate how extremiles and related tools can be used in practice.
23.06.2021 N.N.

30.06.2021 N.N.

07.07.2021 Victor Panaretos (EPFL Lausanne)
14.07.2021 N.N.

last reviewed: June 2, 2021 by Christine Schneider