BCAM-IMUVA Summer School on Uncertainty Quantification for Applied Problems

July 4-7, 2016, Bilbao (Basque Country, Spain)


  • Scalable Bayesian Inference with Hamiltonian Monte Carlo and Stan       Slides 1   Slides 2   Slides 3   Slides 4
    Dr. Michael Betancourt, University of Warwick (United Kingdom)

    Abstract: Bayesian inference is a powerful framework for first building and then fitting statistical models, but successfully applying this approach in practice is far from trivial. In these lectures I will present the theoretical foundations of Bayesian inference, the challenges that arise in its implementation, and practical solutions to these problems. The material will cover both techniques for building complex models and computational methods for fitting those models, especially Markov chain Monte Carlo and Hamiltonian Monte Carlo. Finally I will introduce Stan, a computational library that complements probabilistic programming with state-of-the-art algorithms to facilitate the use of Bayesian inference in difficult statistical analyses.
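    As a toy illustration of the Hamiltonian Monte Carlo machinery covered in these lectures (a hand-written sketch, not Stan itself), the code below samples a one-dimensional standard normal with a leapfrog integrator and a Metropolis accept/reject step; the step size, path length and seed are illustrative choices of ours.

```python
import math
import random

# Minimal Hamiltonian Monte Carlo sketch: sample a 1-D standard normal,
# whose negative log-density is U(q) = q^2 / 2.

def grad_U(q):
    return q  # dU/dq for U(q) = q^2 / 2

def hmc_step(q, eps=0.1, n_leapfrog=20):
    p = random.gauss(0.0, 1.0)            # resample momentum
    q_new, p_new = q, p
    # Leapfrog integration of Hamilton's equations
    p_new -= 0.5 * eps * grad_U(q_new)
    for _ in range(n_leapfrog - 1):
        q_new += eps * p_new
        p_new -= eps * grad_U(q_new)
    q_new += eps * p_new
    p_new -= 0.5 * eps * grad_U(q_new)
    # Metropolis correction using the total energy H = U + p^2 / 2
    h_old = 0.5 * q * q + 0.5 * p * p
    h_new = 0.5 * q_new * q_new + 0.5 * p_new * p_new
    if random.random() < math.exp(min(0.0, h_old - h_new)):
        return q_new
    return q

random.seed(0)
q, samples = 0.0, []
for _ in range(5000):
    q = hmc_step(q)
    samples.append(q)

mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(round(mean, 2), round(var, 2))  # should be close to 0 and 1
```

    In Stan one would instead declare the model in its probabilistic programming language and let its adaptive HMC variant choose the step size and path length automatically.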

  • Algorithms for UQ for differential equations       Slides
    Prof. Max Gunzburger, Florida State University (USA)

    Abstract: We review numerical methods for the approximation of statistical quantities of interest that depend on the solution of partial differential equations having parameterized random inputs, including inputs that are approximations of random fields. We focus on methodologies for building stochastic surrogates or stochastic quadrature rules for the outputs of interest. We review several sampling techniques, starting, of course, with Monte Carlo approaches and ending with stochastic collocation methods. We also review approximations based on globally supported polynomial bases such as polynomial chaos methods.
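    The contrast between plain Monte Carlo sampling and a stochastic quadrature (collocation) rule can be seen on a toy problem; the quadratic output Q and the standard-normal input below are our illustrative stand-ins for a PDE output depending on a random input.

```python
import math
import random

def Q(y):
    return y * y  # quantity of interest; exact mean under N(0,1) is 1

# Monte Carlo: average Q over random draws of the input
random.seed(1)
n = 10_000
mc = sum(Q(random.gauss(0.0, 1.0)) for _ in range(n)) / n

# Stochastic collocation: 3-point probabilists' Gauss-Hermite rule,
# exact for polynomial integrands up to degree 5
nodes = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]
sc = sum(w * Q(x) for w, x in zip(weights, nodes))

print(mc)  # approx 1, with O(n^{-1/2}) sampling error
print(sc)  # exactly 1.0 for this polynomial output
```

    The trade-off sketched here is the one reviewed in the lectures: Monte Carlo converges slowly but is insensitive to the input dimension, whereas quadrature-based surrogates converge fast for smooth outputs and low-to-moderate dimension.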

  • Variational Assimilation and Uncertainty Quantification       Slides 1   Slides 2
    Prof. Olivier Talagrand, Laboratoire de Météorologie Dynamique, École Normale Supérieure, Paris (France) and Mohamed Jardak, Meteorological Office, Exeter (United Kingdom)

    Abstract: Assimilation of observations originated from the need to define initial conditions for numerical weather forecasts. Two different sources of information have to be combined for that purpose: on the one hand the observations, and on the other the physical laws that govern the evolution of the atmosphere, available in the form of a discretized numerical model. Uncertainty is present in both pieces of information, and a consistent approach is to treat assimilation as a problem in Bayesian estimation, viz., to determine the probability distribution for the state of the observed system, conditioned on everything that is known. Bayesian estimation can be solved exactly when the data are linearly related to the unknowns and are affected by additive Gaussian errors. This has led to two classes of assimilation algorithms, which perform exact Bayesian estimation in the linear and Gaussian case and have been heuristically extended to moderately nonlinear and non-Gaussian situations. One is Variational Assimilation, which minimizes a scalar function measuring the misfit over time between the data and the estimated values. The other is the Ensemble Kalman Filter, which evolves an ensemble of points in state space over time.
    Variational assimilation can be implemented in an ensemble (and therefore explicitly probabilistic) form: perturb the data according to their own error probability distribution, and perform a standard, deterministic, variational assimilation for each set of perturbed data. In the linear and Gaussian case, this yields an ensemble of independent realizations of the (Gaussian) Bayesian probability distribution. That approach, called EnsVAR, has been implemented on two small-dimension nonlinear chaotic systems (the Lorenz 96 system and the Kuramoto-Sivashinsky equation). As far as the Bayesian character of the obtained ensembles can be objectively evaluated, it is as good as in the exactly linear and Gaussian case.
    Other aspects of interest for Uncertainty Quantification in the context of observation and prediction of geophysical flows will also be discussed: ensemble prediction, objective validation of systems that quantitatively estimate uncertainty, and observability of the atmosphere or the ocean.
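    A scalar sketch of the EnsVAR idea (the setup and numbers are ours, not from the lectures): perturb the background and the observation according to their error statistics, run one deterministic variational analysis per perturbed data set, and check that the resulting ensemble reproduces the exact Gaussian Bayesian posterior.

```python
import random

B, R = 2.0, 1.0          # background and observation error variances
xb, y = 0.0, 3.0         # background state and observation (H = identity)

def var_analysis(xb_i, y_i):
    # Minimizer of J(x) = (x - xb_i)^2 / (2B) + (y_i - x)^2 / (2R)
    return (xb_i / B + y_i / R) / (1.0 / B + 1.0 / R)

random.seed(2)
ens = [var_analysis(xb + random.gauss(0.0, B ** 0.5),
                    y + random.gauss(0.0, R ** 0.5))
       for _ in range(20_000)]

mean = sum(ens) / len(ens)
var = sum((e - mean) ** 2 for e in ens) / len(ens)

# Exact Gaussian posterior, for comparison
post_mean = (xb / B + y / R) / (1.0 / B + 1.0 / R)   # = 2.0
post_var = 1.0 / (1.0 / B + 1.0 / R)                 # = 2/3
print(mean, post_mean)
print(var, post_var)
```

    In this linear, Gaussian, scalar case the minimizer is available in closed form; in realistic applications each member requires an iterative minimization of the 4D-Var cost function.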

  • Probabilistic and ensemble approaches to Data Assimilation       Slides 1   Slides 2
    Prof. Peter Jan van Leeuwen, University of Reading (United Kingdom)

    Abstract: Data assimilation is the science of combining large observational data sets with high-dimensional numerical models of geophysical systems. It is Bayesian inference on very high-dimensional geophysical systems. The best-known use of data assimilation is numerical weather prediction, but it is used in several other fields of the geosciences and beyond, e.g. ocean forecasting, climate forecasting, land-surface forecasting, biology, ecology, neuroscience, you name it.
    Due to the high dimension we cannot solve Bayes' theorem in full, and have to rely on approximations. A standard solution is to use Monte Carlo methods, but because the dimension of the application is so large we can only afford a small number, O(10-100), of model runs, so we only have 10 to 100 Monte Carlo samples. This is why, e.g., Metropolis-Hastings-like algorithms are infeasible.
    A common approximation is to assume that the pdfs are close to Gaussian, leading to so-called Ensemble Kalman Filters. The ideas behind these will be discussed, as well as practical implementation issues like localisation and inflation. While these methods are now used routinely in the geosciences, the Gaussian approximation is often too limiting, and fully nonlinear methods should be used.
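    A minimal scalar sketch of one Ensemble Kalman Filter analysis step, in its perturbed-observation form; localisation and inflation are omitted, and all numbers are illustrative choices of ours.

```python
import random

random.seed(3)
R = 1.0                     # observation error variance
y = 2.0                     # observation, with H = identity
ens = [random.gauss(0.0, 1.0) for _ in range(500)]    # forecast ensemble

m = sum(ens) / len(ens)
Pf = sum((e - m) ** 2 for e in ens) / (len(ens) - 1)  # forecast variance
K = Pf / (Pf + R)                                     # Kalman gain

# Update each member against an independently perturbed observation
ana = [e + K * (y + random.gauss(0.0, R ** 0.5) - e) for e in ens]

ma = sum(ana) / len(ana)
Pa = sum((a - ma) ** 2 for a in ana) / (len(ana) - 1)
print(ma)   # pulled from the prior mean (~0) toward the observation (2)
print(Pa)   # reduced spread, roughly Pf * R / (Pf + R)
```

    The Gaussian approximation enters through the gain K, which uses only the ensemble mean and covariance; localisation and inflation modify the sample covariance to compensate for the small ensemble size.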
    We will discuss a recent breakthrough in so-called Particle Filters that makes them extremely efficient even in very high dimensional systems, thus curing the curse of dimensionality.
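    For contrast with the Kalman-based methods, here is the standard bootstrap particle filter on a scalar toy problem, showing the weighting-and-resampling machinery. This basic scheme is NOT the high-dimensional method alluded to above; it is precisely the scheme whose weight degeneracy in high dimensions the newer filters are designed to avoid. All numbers are illustrative.

```python
import math
import random

random.seed(4)
R = 0.5                    # observation error variance
x_true = 1.0
y = x_true + random.gauss(0.0, R ** 0.5)    # one synthetic observation

# Prior particles
parts = [random.gauss(0.0, 1.0) for _ in range(2000)]

# Weight by the Gaussian observation likelihood
w = [math.exp(-0.5 * (y - p) ** 2 / R) for p in parts]
s = sum(w)
w = [wi / s for wi in w]

# Multinomial resampling: duplicate high-weight particles
parts = random.choices(parts, weights=w, k=len(parts))

post_mean = sum(parts) / len(parts)
print(post_mean)  # between the prior mean (0) and the observation y
```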

Organized by         Universidad de Valladolid     IMUVA