MATH654: Computationally Intensive Methods

Credits: 10

Tutor: Pete Neal

Outline: This module will cover:

  • Expectation-Maximisation (EM algorithm).
  • Discrete Markov chains, reversibility, ergodicity and stationary distributions.
  • Markov chains with a continuous state-space.
  • Bayesian inference and conjugacy; conditional conjugacy and Gibbs sampling.
  • Hierarchical models.
  • Data augmentation (auxiliary variables).
  • The Metropolis-Hastings algorithm. Convergence and efficiency issues.
  • Monte Carlo estimation.
  • Independence sampler, random walk Metropolis and Metropolis-within-Gibbs.
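As a flavour of the random walk Metropolis algorithm listed above, here is a minimal sketch in Python (the module itself uses R; the function name, step size and standard-normal target are illustrative assumptions, not course material):

```python
import numpy as np

def random_walk_metropolis(log_target, x0, n_samples, step=1.0, seed=0):
    """Random walk Metropolis: propose x' = x + step * N(0, 1) and accept
    with probability min(1, pi(x') / pi(x)).  Because the Gaussian proposal
    is symmetric, the Hastings ratio reduces to the ratio of target densities."""
    rng = np.random.default_rng(seed)
    x = x0
    out = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()
        # Accept/reject on the log scale for numerical stability.
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
        out[i] = x  # on rejection the chain stays where it is
    return out

# Illustrative target: a standard normal, via its log density up to a constant.
draws = random_walk_metropolis(lambda x: -0.5 * x ** 2, x0=0.0,
                               n_samples=20000, step=2.4)
```

The step size governs the efficiency trade-off covered in the module: too small and the chain moves slowly; too large and most proposals are rejected.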

Objectives: The first week of the course introduces students to the Expectation-Maximisation (EM) algorithm, an iterative algorithm for obtaining the maximum likelihood estimate of parameters in problems with intractable likelihoods. The remainder of the course introduces the use of Markov chain Monte Carlo (MCMC) methods as a powerful technique for performing Bayesian inference on complex stochastic models. The course will introduce students to the Metropolis-Hastings algorithm and in particular to special cases such as the Gibbs sampler, independence sampler and random walk Metropolis. The introduction of the methods will be closely integrated with Bayesian modelling techniques such as hierarchical modelling, random effects and mixture modelling. A common theme running through both the discussion of the EM and MCMC algorithms will be data augmentation.
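To illustrate the data augmentation theme mentioned above, the following Python sketch runs EM on a two-component Gaussian mixture with known unit variances, treating the unobserved component labels as the augmented data (the function, starting values and simulated data are illustrative assumptions, not module material):

```python
import numpy as np

def em_two_gaussians(x, n_iter=200):
    """EM for a mixture (1 - w) * N(mu1, 1) + w * N(mu2, 1).

    The E-step computes each point's responsibility for component 2
    given the current parameters; the M-step re-estimates the weight
    and means by maximising the expected complete-data log-likelihood."""
    w, mu1, mu2 = 0.5, np.min(x), np.max(x)  # crude starting values
    for _ in range(n_iter):
        # E-step: posterior probability that each point came from component 2.
        d1 = (1 - w) * np.exp(-0.5 * (x - mu1) ** 2)
        d2 = w * np.exp(-0.5 * (x - mu2) ** 2)
        r = d2 / (d1 + d2)
        # M-step: responsibility-weighted updates.
        w = r.mean()
        mu1 = np.sum((1 - r) * x) / np.sum(1 - r)
        mu2 = np.sum(r * x) / np.sum(r)
    return w, mu1, mu2

# Simulated data: 300 points near 0 and 200 points near 4.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(4, 1, 200)])
w, mu1, mu2 = em_two_gaussians(x)
```

Each iteration cannot decrease the observed-data likelihood, which is the key property of the EM algorithm developed in the first week.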

Learning Outcomes: On successful completion of this module students will be able to:

  • understand the principles of the EM algorithm;
  • construct and implement the EM algorithm for a range of statistical problems;
  • perform a range of computationally intensive methods for statistical inference using the statistical package R.

Core texts:

  • D. Gamerman and H. Lopes (2006). Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference. Chapman and Hall, Second Edition.
  • W.R. Gilks, S. Richardson and D. Spiegelhalter (1996). Markov Chain Monte Carlo in Practice. Chapman and Hall.
  • P. Hoff (2009). A First Course in Bayesian Statistical Methods. Springer.
  • W. Bolstad (2010). Understanding Computational Bayesian Statistics. Wiley.

Assessment: Assessment will be through a combination of a summer exam (80%) and an end-of-module test (20%).

Contact hours: There will be a mixture of lectures, tutorials and computer workshops totalling approximately 25 contact hours. In addition, private study will make up the majority of the learning hours.
