Input/output

My name is Jonas Latz; welcome to my webpage!

Currently, I am a Research Associate in the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge. I work with Carola-Bibiane Schönlieb in the Cambridge Image Analysis group and on the EPSRC-funded project PET++.

The research I am involved in is at the interface of numerical analysis, computational science, probability theory, and statistics. It focuses on methods that can be used to blend mathematical models with observational data.

Keywords.
Uncertainty quantification, Bayesian inference, inverse problems, well-posedness, measures and integration, Markov chain Monte Carlo, particle filters, real-world data, low-rank approximation, multilevel methods, hierarchical models, (stochastic) partial differential equations.

This webpage contains information about my research output in academic journals and at academic meetings. Moreover, I have summarised my teaching experience and my personal background. Input of any kind is highly appreciated; please contact me.

     

News

Adaptive Multilevel Sparse Leja Approximations for Bayesian Inverse Problems

Ionut-Gabriel Farcas (Oden Institute, UT Austin), Elisabeth Ullmann (TUM), Tobias Neckel (TUM), Hans-Joachim Bungartz (TUM), and I have published an article in the SIAM Journal on Scientific Computing (SISC), part A. We propose an efficient, deterministic strategy for finding sparse grid approximations to posterior measures in Bayesian inverse problems. Deterministic strategies tend to be inefficient if the posterior measure is highly concentrated: locating the area of concentration can require infeasibly many model evaluations. Our algorithm uses computationally cheap, coarse model discretisations (a ‘multilevel’ approach) to locate this area of concentration at low cost. We combine this idea with adaptive sparse grids constructed from weighted Leja points and address problems such as the separation of posterior measures. Finally, we validate the approach in numerical experiments, considering, e.g., an elliptic inverse problem in a 3D domain. Click here for the journal publication or click here for the preprint version.
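
As a side note for readers who have not come across Leja points before: in one dimension they are built greedily, each new node maximising a weight function times the product of distances to the nodes already chosen. Below is a minimal sketch of that greedy rule; the Gaussian weight and the candidate grid are illustrative assumptions, not the implementation from the paper. Adaptive sparse grids then combine such one-dimensional sequences across dimensions.

```python
import numpy as np

def weighted_leja(n_points, weight, candidates):
    """Greedy construction of a weighted Leja sequence on a candidate grid.

    Each new point maximises weight(x) * prod_k |x - x_k| over the candidates.
    """
    points = [candidates[np.argmax(weight(candidates))]]  # start at the weight's maximiser
    for _ in range(n_points - 1):
        objective = weight(candidates)
        for xk in points:
            objective = objective * np.abs(candidates - xk)
        points.append(candidates[np.argmax(objective)])
    return np.array(points)

# Example: seven Leja points for a standard Gaussian weight on [-5, 5].
grid = np.linspace(-5.0, 5.0, 2001)
print(np.round(weighted_leja(7, lambda x: np.exp(-0.5 * x**2), grid), 3))
```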

Shallow covariance kernels

My collaborators Daniel Kressner (EPFL), Stefano Massei (EPFL), Elisabeth Ullmann (TUM), and I have just deposited a new manuscript on the arXiv. In “Certified and fast computations with shallow covariance kernels”, we propose a new algorithm for low-rank approximations of parameter-dependent covariance kernels. We show that the algorithm is certified in terms of the Wasserstein-2 distance between the Gaussian processes given the exact and the approximate covariance kernels, respectively. Moreover, we discuss the robustness of the algorithm and linearisation techniques for covariance kernels.
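
For context, the Wasserstein-2 distance between two Gaussian measures has a well-known closed form in terms of their means and covariances. The sketch below evaluates it for finite-dimensional covariance matrices and compares an exponential kernel with a plain rank-5 SVD truncation; the kernel, the rank, and the truncation are illustrative assumptions and not the algorithm of the manuscript.

```python
import numpy as np
from scipy.linalg import sqrtm

def wasserstein2_gaussian(m1, c1, m2, c2):
    """Wasserstein-2 distance between N(m1, c1) and N(m2, c2), closed form."""
    c1_half = sqrtm(c1)
    cross = sqrtm(c1_half @ c2 @ c1_half)
    w2_sq = np.sum((m1 - m2) ** 2) + np.trace(c1 + c2) - 2.0 * np.real(np.trace(cross))
    return np.sqrt(max(float(w2_sq), 0.0))

# Example: exponential covariance kernel on [0, 1] vs. a plain rank-5 SVD truncation.
x = np.linspace(0.0, 1.0, 50)
exact = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)
u, s, vt = np.linalg.svd(exact)
low_rank = (u[:, :5] * s[:5]) @ vt[:5, :]
zero = np.zeros(50)
print(wasserstein2_gaussian(zero, exact, zero, low_rank))
```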

Moved to Cambridge

After graduating from my PhD programme last December, I have now moved to the University of Cambridge for a postdoc. Here, I am part of Carola-Bibiane Schönlieb’s Cambridge Image Analysis group and will be working within the EPSRC project PET++ on new methods for image reconstruction in positron emission tomography (PET).

Minisymposium on Bayesian well-posedness at SIAM UQ20

My colleague Björn Sprungk (Göttingen) and I submitted a minisymposium to the SIAM Conference on Uncertainty Quantification, which will take place at the end of March 2020 in Munich, Germany. Our minisymposium has now been approved by the conference’s scientific committee. Its title is “Would Hadamard have used Bayes’ rule? - On robustness and brittleness of statistical inversion”. We are looking forward to four very interesting talks, which will cover: stability and instability of Bayesian inference subject to perturbations in data, model, or prior; consistency in the large-data limit and convergence of Gaussian process surrogates/emulators; and the influence of perturbations on MCMC algorithms. Link to the minisymposium.

Reliability analysis with MLS²MC

Fabian Wagner, Iason Papaioannou, Elisabeth Ullmann, and I have deposited a new manuscript on the arXiv: “Multilevel Sequential Importance Sampling for Rare Event Estimation”. In this manuscript, we essentially adapt the Multilevel Sequential² Monte Carlo framework to the estimation of rare events rather than Bayesian inversion; see (2018, J. Comput. Phys. 368, p. 154-178). We then show in numerical experiments that this approach is sound and that it can outperform single-level strategies. Moreover, the approach resolves nestedness issues that can occur in Subset Simulation algorithms. Finally, I want to mention that we consider MCMC algorithms based on adaptive von Mises-Fisher-Nakagami proposals, which appear to be especially useful in the mutation step of Sequential Monte Carlo methods. For more information, please have a look at the article.
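
To give a flavour of the rare-event setting (without reproducing the paper’s multilevel sequential importance sampling scheme), here is a crude subset-simulation-style estimator of a small failure probability P(g(X) <= 0) for standard normal X. The limit-state function, the level probability p0, and the random-walk proposal are made-up illustrative choices, not taken from the manuscript.

```python
import numpy as np

def limit_state(x):
    # Hypothetical limit-state function: the rare event is g(x) <= 0.
    return 3.5 - np.sum(x) / np.sqrt(x.size)

def subset_simulation(dim=2, n=2000, p0=0.1, seed=0):
    """Crude subset-simulation-style estimate of P(g(X) <= 0) for X ~ N(0, I).

    Intermediate levels are p0-quantiles of g; conditional samples are refreshed
    with a single random-walk Metropolis step restricted to the current level.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n, dim))
    g = np.apply_along_axis(limit_state, 1, x)
    prob = 1.0
    for _ in range(20):                       # cap on the number of levels
        thresh = np.quantile(g, p0)
        if thresh <= 0.0:                     # final level reached
            break
        prob *= p0
        seeds = x[g <= thresh]                # samples already inside the next level
        xs, gs = [], []
        while len(xs) < n:
            xi = seeds[rng.integers(len(seeds))]
            gi = limit_state(xi)
            cand = xi + 0.5 * rng.standard_normal(dim)
            # Metropolis acceptance for the N(0, I) target, restricted to {g <= thresh}.
            if rng.random() < np.exp(0.5 * (xi @ xi - cand @ cand)):
                g_cand = limit_state(cand)
                if g_cand <= thresh:
                    xi, gi = cand, g_cand
            xs.append(xi)
            gs.append(gi)
        x, g = np.array(xs), np.array(gs)
    return prob * np.mean(g <= 0.0)

print(subset_simulation())  # exact value is 1 - Phi(3.5), roughly 2.3e-4
```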

Update of the well-posedness paper

As of tonight, a new version of the manuscript “On the well-posedness of Bayesian inverse problems” is available on the arXiv. Aside from a slight change in the paper’s focus, I have added some new results concerning stability in the Wasserstein distance and considered the case of infinite-dimensional data spaces with additive Gaussian noise. The latter complements the results we already had for finite-dimensional noise, where we could show well-posedness independently of the prior and the forward model. Additionally, the infinite-dimensional case gives a new connection between this manuscript and the recent article by Christian Kahle, Kei Fong Lam, Elisabeth Ullmann, and me; see (2019, SIAM/ASA J. Uncertain. Quantif. 7(2), p. 526-552).
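
Well-posedness here means, in particular, that the posterior depends continuously on the data. A toy illustration on a one-dimensional grid: perturb the data slightly and the (discretised) posterior moves only slightly in, e.g., the Wasserstein-1 distance. The forward map, noise level, and prior below are illustrative assumptions, not taken from the manuscript.

```python
import numpy as np

def posterior_on_grid(y, u_grid, forward, sigma=0.1):
    """Discretised posterior of a scalar parameter u given data y = forward(u) + noise.

    Prior: standard normal on u; noise: centred Gaussian with standard deviation sigma.
    """
    log_post = -0.5 * ((y - forward(u_grid)) / sigma) ** 2 - 0.5 * u_grid**2
    density = np.exp(log_post - log_post.max())
    return density / np.trapz(density, u_grid)

def wasserstein1(p, q, u_grid):
    """Wasserstein-1 distance between two densities on a 1D grid, via their CDFs."""
    du = u_grid[1] - u_grid[0]
    return np.trapz(np.abs(np.cumsum(p) - np.cumsum(q)) * du, u_grid)

# Shrink the data perturbation and watch the posterior distance shrink with it.
forward = lambda u: np.sin(3.0 * u) + u         # hypothetical nonlinear forward map
u_grid = np.linspace(-4.0, 4.0, 4001)
reference = posterior_on_grid(0.7, u_grid, forward)
for delta in [1e-1, 1e-2, 1e-3]:
    perturbed = posterior_on_grid(0.7 + delta, u_grid, forward)
    print(delta, wasserstein1(reference, perturbed, u_grid))
```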

New webpage online

I have a new academic webpage and you have found it! Please feel free to take a stroll around the site.