Publications

Articles in Refereed Journals

  • L. Liang, Z. Xu, K.-Ch. Toh, J.-J. Zhu, An inexact Halpern iteration with application to distributionally robust optimization, Journal of Optimization Theory and Applications, 206 (2025), pp. 58/1--58/41, DOI 10.1007/s10957-025-02744-y .

  • M. Liero, A. Mielke, O. Tse, J.-J. Zhu, Evolution of Gaussians in the Hellinger--Kantorovich--Boltzmann gradient flow, Communications on Pure and Applied Analysis, (2025), pp. 1--33 (published online on 01.10.2025), DOI 10.3934/cpaa.2025105 .
    Abstract
    This study leverages the basic insight that the gradient-flow equation associated with the relative Boltzmann entropy, in relation to a Gaussian reference measure within the Hellinger--Kantorovich (HK) geometry, preserves the class of Gaussian measures. This invariance serves as the foundation for constructing a reduced gradient structure on the parameter space characterizing Gaussian densities. We derive explicit ordinary differential equations that govern the evolution of mean, covariance, and mass under the HK--Boltzmann gradient flow. The reduced structure retains the additive form of the HK metric, facilitating a comprehensive analysis of the dynamics involved. We explore the geodesic convexity of the reduced system, revealing that global convexity is confined to the pure transport scenario, while a variant of sublevel semi-convexity is observed in the general case. Furthermore, we demonstrate exponential convergence to equilibrium through Polyak--Łojasiewicz-type inequalities, applicable both globally and on sublevel sets. By monitoring the evolution of covariance eigenvalues, we refine the decay rates associated with convergence. Additionally, we extend our analysis to non-Gaussian targets exhibiting strong log-Lambda-concavity, corroborating our theoretical results with numerical experiments that encompass a Gaussian-target gradient flow and a Bayesian logistic regression application.
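The exponential convergence via Polyak--Łojasiewicz-type (PL) inequalities mentioned in this abstract (and in the HK preprint below) follows a standard mechanism. A generic finite-dimensional sketch of that mechanism — not the paper's measure-valued setting, and with a generic functional F and PL constant mu assumed for illustration — is:

```latex
% PL inequality with constant \mu > 0 for a smooth F with minimum value F^*:
\frac{1}{2}\,\|\nabla F(x)\|^2 \;\ge\; \mu\,\bigl(F(x) - F^*\bigr).
% Along the gradient flow \dot{x}(t) = -\nabla F(x(t)):
\frac{d}{dt}\bigl(F(x(t)) - F^*\bigr)
  \;=\; -\,\|\nabla F(x(t))\|^2
  \;\le\; -\,2\mu\,\bigl(F(x(t)) - F^*\bigr),
% hence, by Gronwall's lemma, the value gap decays exponentially:
F(x(t)) - F^* \;\le\; e^{-2\mu t}\,\bigl(F(x(0)) - F^*\bigr).
```

The papers' contribution lies in establishing such inequalities (and their sublevel-set variants) in the Hellinger--Kantorovich geometry, where this finite-dimensional picture does not apply directly.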

Preprints, Reports, Technical Reports

  • A. Mielke, J.-J. Zhu, Hellinger--Kantorovich gradient flows: Global exponential decay of entropy functionals, Preprint no. 3176, WIAS, Berlin, 2025, DOI 10.20347/WIAS.PREPRINT.3176 .
Abstract
We investigate a family of gradient flows of positive and probability measures, focusing on the Hellinger--Kantorovich (HK) geometry, which unifies the transport mechanism of Otto--Wasserstein and the birth--death mechanism of Hellinger (or Fisher--Rao). A central contribution is a complete characterization of the global exponential decay behavior of entropy functionals under Otto--Wasserstein and Hellinger-type gradient flows. In particular, for the more challenging analysis of HK gradient flows on positive measures---where the typical log-Sobolev arguments fail---we develop a specialized shape-mass decomposition that enables new analysis results. Our approach also leverages Polyak--Łojasiewicz-type functional inequalities and a careful extension of classical dissipation estimates. These findings provide a unified and complete theoretical framework for gradient flows and underpin applications in computational algorithms for statistical inference, optimization, and machine learning.

  • Ch. Bayer, L. Pelizzari, J.-J. Zhu, Pricing American options under rough volatility using deep-signatures and signature-kernels, Preprint no. 3172, WIAS, Berlin, 2025, DOI 10.20347/WIAS.PREPRINT.3172 .
Abstract
We extend the signature-based primal and dual solutions to the optimal stopping problem recently introduced in [Bayer et al.: Primal and dual optimal stopping with signatures, to appear in Finance & Stochastics 2025], by integrating deep-signature and signature-kernel learning methodologies. These approaches are designed for non-Markovian frameworks, in particular enabling the pricing of American options under rough volatility. We demonstrate and compare the performance within the popular rough Heston and rough Bergomi models.

Talks, Posters

  • J.-J. Zhu, Contrasting and combining Wasserstein and Fisher--Rao gradient flows for relative entropy minimization, 17th ICSP Conference on Stochastic Programming, July 28 - August 1, 2025, École des Ponts, IP Paris, France, July 28, 2025.

  • J.-J. Zhu, Gradient flows for sampling: Two new machine learning applications, Particles, Flows, and Maps: Modern Approaches for Sampling Complex Distributions, November 3 - 6, 2025, Bernoulli Center (EPFL), Lausanne, Switzerland, November 5, 2025.

  • J.-J. Zhu, Gradient flows in the (marginal) entropy-regularized transport geometry for sampling and inference, Workshop on Bayesian Analysis and Artificial Intelligence, September 10 - 12, 2025, Peking University, Center for Statistical Science, China, September 11, 2025.

  • J.-J. Zhu, Hellinger and Fisher--Rao: From gradient flows to distributional robustness, Mathematical Analysis of Adversarial Machine Learning, August 17 - 22, 2025, Casa Matemática Oaxaca, Oaxaca, Mexico, August 18, 2025.

  • J.-J. Zhu, Inclusive KL minimization: A Wasserstein--Fisher--Rao gradient flow perspective, AABI 2025 - 7th Symposium on Advances in Approximate Bayesian Inference, Singapore, April 29, 2025.

  • J.-J. Zhu, Minimizing relative entropy with Hellinger--Kantorovich (a.k.a. Wasserstein--Fisher--Rao) gradient flows, Understanding Generalization in Deep Learning, February 19 - 21, 2025, Technische Universität München, Department of Mathematics, Burghausen, February 19, 2025.

  • J.-J. Zhu, Minimizing relative entropy with Hellinger--Kantorovich (a.k.a. Wasserstein--Fisher--Rao) gradient flows, March 13 - 14, 2025, Ludwig Maximilian University of Munich, Department of Mathematics, March 13, 2025.

  • J.-J. Zhu, Two new machine learning applications of gradient flows for sampling, Gradient Flows face-to-face, September 16 - 19, 2025, Universidad de Granada, Instituto de Matemáticas, Spain, September 18, 2025.

External Preprints

  • E. Gladin, A. Kroshnin, J.-J. Zhu, P. Dvurechensky, Improved stochastic optimization of LogSumExp, Preprint no. arXiv:2509.24894, Cornell University, 2025, DOI 10.48550/arXiv.2509.24894 .