- UCAS course code: G104
- UCAS institution code: M20
Master of Mathematics (MMath)
MMath Mathematics
- Typical A-level offer: A*AA including specific subjects
- Typical contextual A-level offer: A*AB including specific subjects
- Refugee/care-experienced offer: A*BB including specific subjects
- Typical International Baccalaureate offer: 37 points overall with 7,6,6 at HL, including specific requirements
Course unit details:
Bayesian Statistics
Unit code | MATH48221 |
---|---|
Credit rating | 15 |
Unit level | Level 4 |
Teaching period(s) | Semester 1 |
Available as a free choice unit? | No |
Overview
The unit aims to introduce students to the fundamentals of Bayesian inference and the computational techniques used to apply it in data analysis and model evaluation.
Pre/co-requisites
Unit title | Unit code | Requirement type | Description |
---|---|---|---|
Probability and Statistics 2 | MATH27720 | Pre-Requisite | Compulsory |
Aims
The unit aims to:
Introduce students to the fundamentals of Bayesian inference and the computational techniques used to apply it in data analysis and model evaluation.
Learning outcomes
On the successful completion of the course, students will be able to:
- Derive posterior distributions for exact Bayesian inference and make inferences/predictions based on these posteriors;
- Apply various computational algorithms to obtain samples from complex posterior distributions and for parameter estimation;
- Describe the various algorithms in words and implement key algorithms in statistical software;
- Make informed choices among available algorithms for practical data analysis;
- Solve statistical modelling and inference problems within the Bayesian paradigm.
Syllabus
Part A – Foundations of Bayesian Inference
- Bayesian inference concepts: single and multiple parameter prior and posterior distributions; conjugacy and non-conjugacy; Bayesian estimators; credible intervals (a brief conjugate-posterior sketch follows this list).
- Model checking & model comparison: Posterior predictive distribution; Bayesian forecasting; model comparison based on predictive performance; model comparison criteria such as Bayes factors, BIC and DIC; Bayesian Decision Theory; Laplace's approximation.
Part B - Computational Bayesian Statistics
- Gibbs Sampler: data augmentation; burn-in; convergence.
- Metropolis-Hastings algorithm: independent sampler; random walk Metropolis; scaling; multi-modality (a random-walk Metropolis sketch follows this list).
- MCMC Issues: Monte Carlo Error (batch means/window estimates for MCSE); reparameterization; hybrid algorithms; convergence diagnostics for single/multiple chains.
- Hamiltonian Monte Carlo.
- Approximate Bayesian Inference.
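As a companion to Part B, the following is a minimal, hypothetical sketch of random-walk Metropolis, one of the algorithms named above. It is not course material: the target log-posterior, step size, burn-in length, and acceptance-rate guidance are placeholder choices for illustration only.

```python
# Illustrative sketch only (not official course material): a random-walk
# Metropolis sampler for a generic (unnormalised) log-posterior.
import numpy as np

def log_post(theta):
    # Hypothetical target: standard normal log-density, up to a constant
    return -0.5 * theta ** 2

def random_walk_metropolis(log_post, n_iter=10_000, step=1.0,
                           init=0.0, burn_in=1_000, seed=0):
    rng = np.random.default_rng(seed)
    theta = init
    samples = np.empty(n_iter)
    accepted = 0
    for i in range(n_iter):
        proposal = theta + step * rng.normal()             # symmetric random-walk proposal
        log_alpha = log_post(proposal) - log_post(theta)   # acceptance ratio (proposal terms cancel)
        if np.log(rng.uniform()) < log_alpha:               # accept/reject step
            theta = proposal
            accepted += 1
        samples[i] = theta
    return samples[burn_in:], accepted / n_iter             # discard burn-in; report acceptance rate

draws, acc_rate = random_walk_metropolis(log_post)
print(f"Acceptance rate: {acc_rate:.2f}")   # tune `step` (scaling) towards a moderate acceptance rate
print(f"Posterior mean estimate: {draws.mean():.3f}")
```

The acceptance rate printed at the end connects to the "scaling" topic above: the proposal step size controls how often moves are accepted and how quickly the chain explores the posterior.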
Teaching and learning methods
Teaching comprises two hours of lectures and one examples/computer class per week. Teaching materials will be made available online for reference and review.
Assessment methods
Method | Weight |
---|---|
Other | 30% |
Written exam | 70% |
Coursework: one assignment on computational aspects of Bayesian statistics, weighted 30%
Exam: 3-hour written examination, weighted 70%
Recommended reading
Christensen, R., Johnson, W., Branscum, A., & Hanson, T. E. (2010). Bayesian ideas and data analysis: an introduction for scientists and statisticians. CRC Press.
Heard, N. (2021). An introduction to Bayesian inference, methods and computation. Cham: Springer.
McElreath, R. (2020). Statistical rethinking: A Bayesian course with examples in R and Stan (2nd edn). Chapman and Hall/CRC.
Wang, X., Yue, Y. R., & Faraway, J. J. (2018). Bayesian regression modeling with INLA. Chapman and Hall/CRC.
Study hours
Scheduled activity hours | |
---|---|
Lectures | 22 |
Tutorials | 11 |
Independent study hours | |
---|---|
Independent study | 117 |
Teaching staff
Staff member | Role |
---|---|
Taban Baghfalaki | Unit coordinator |