## Imperial College London (CO-496)

The aim of the course is to provide students with the mathematical background and skills needed to understand, design, and implement modern statistical machine learning methods and inference mechanisms. The course illustrates how these mathematical tools underpin core techniques such as Principal Component Analysis (PCA), Bayesian regression, and Support Vector Machines. The course is co-taught by Stefanos Zafeiriou and Marc Deisenroth.

### Syllabus

- Bayesian Linear Regression
  - Vector calculus (e.g., partial derivatives, chain rule, Jacobian)
  - Basic probability distributions (e.g., multivariate Gaussian)
  - Bayes' theorem
  - Conjugate priors
  - Gradient descent
  - Model selection
  - Cross validation
  - Maximum likelihood estimation
  - MAP estimation
  - Bayesian integration
  - Graphical model notation
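To illustrate the kind of material covered in this part of the course, here is a minimal NumPy sketch of Bayesian linear regression with a conjugate Gaussian prior (all data, weights, and hyperparameter values below are illustrative assumptions, not course code): the posterior covariance is S_N = (alpha*I + Phi^T Phi / sigma2)^(-1) and the posterior mean is m_N = S_N Phi^T y / sigma2.

```python
import numpy as np

# Synthetic data: y = Phi @ w + Gaussian noise (illustrative setup)
rng = np.random.default_rng(0)
N = 20
x = rng.uniform(-1, 1, N)
true_w = np.array([0.5, -1.0])           # assumed ground-truth weights
Phi = np.column_stack([np.ones(N), x])   # design matrix: bias + linear feature
sigma2 = 0.05                            # assumed observation-noise variance
alpha = 1.0                              # assumed prior precision on the weights
y = Phi @ true_w + rng.normal(0, np.sqrt(sigma2), N)

# Conjugate Gaussian posterior over the weights
S_N = np.linalg.inv(alpha * np.eye(2) + (1 / sigma2) * Phi.T @ Phi)  # posterior covariance
m_N = (1 / sigma2) * S_N @ Phi.T @ y                                  # posterior mean
print(m_N)  # posterior mean; should lie close to true_w
```

With enough data the likelihood dominates the prior, so the posterior mean approaches the maximum-likelihood solution.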

- Probabilistic PCA
  - Eigenvalues
  - Determinants
  - Basis change
  - Singular value decomposition
  - Gram–Schmidt orthonormalization
  - Rotations
  - Projections
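As a taste of how these linear-algebra topics combine, here is a small sketch of PCA computed via the singular value decomposition (the synthetic data set is an assumption for illustration): center the data, take the SVD, and project onto the leading right-singular vector.

```python
import numpy as np

# Synthetic, strongly correlated 2-D data (illustrative)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 1)) @ np.array([[1.0, 1.0]]) + 0.1 * rng.normal(size=(200, 2))

Xc = X - X.mean(axis=0)                       # center the data
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)               # variance fraction per component
Z = Xc @ Vt[0]                                # projection onto first principal axis
print(explained)  # first component captures almost all the variance
```

The rows of `Vt` are the principal axes; because the data lie near the line x2 = x1, the first component explains nearly all of the variance.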

- Support Vector Machines
  - Constrained optimization
  - Lagrange multipliers
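The SVM material builds on equality-constrained optimization via Lagrange multipliers. A minimal sketch (the quadratic objective and constraint below are an illustrative assumption, not an SVM itself): minimize x1² + x2² subject to x1 + x2 = 1 by solving the linear KKT system obtained from the stationarity of the Lagrangian.

```python
import numpy as np

# Minimize 0.5 x^T Q x subject to a^T x = b, with Q = 2I (i.e., x1^2 + x2^2).
# Lagrangian: L(x, lam) = 0.5 x^T Q x + lam * (a^T x - b)
# Stationarity (Q x + lam a = 0) plus the constraint gives a linear system.
Q = 2 * np.eye(2)
a = np.array([1.0, 1.0])
b = 1.0
KKT = np.block([[Q, a[:, None]], [a[None, :], np.zeros((1, 1))]])
sol = np.linalg.solve(KKT, np.concatenate([np.zeros(2), [b]]))
x, lam = sol[:2], sol[2]
print(x)  # -> [0.5 0.5]
```

The same structure, with inequality constraints and one multiplier per data point, underlies the SVM dual problem.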

### Lectures

- Mondays, 14:00 - 16:00
- Fridays, 11:00 - 13:00

### Course Support Leaders

- Eimear O’Sullivan
- Kenneth Co

### Teaching Assistants

TBD

### Resources

- M. P. Deisenroth, A. A. Faisal, C. S. Ong: Mathematics for Machine Learning, Cambridge University Press, 2020

### Essential Preparation

- Basic concepts in linear algebra: Chapters 2–4 of Mathematics for Machine Learning (see Resources)
- C. M. Bishop: Pattern Recognition and Machine Learning, Springer, 2006: Chapters 1-3, Appendix B, C