COMS30035 - Machine Learning
News
Unit Information
This unit acquaints students with machine learning algorithms that are important in many modern data science and computer science applications. We cover topics such as kernel machines, probabilistic inference, neural networks, HMMs and ensemble models.
Staff
James Cussens, Xiyue Zhang, Wei-Hong Li, Xiang Li
Teaching Assistants
Siddhant Bansal, Omar Emara, Kal Roberts, Jonathan Erskine, Enrique Crespo Fernandez, Zhiyuan Xu
Unit Materials
All lectures will be given by James Cussens, apart from:
- L05: Introduction to Neural Networks (Xiyue Zhang)
- L06: Training Neural Networks (Wei-Hong Li)
- L07: Regression and classification trees (Xiang Li)
| Week | 1st lecture (Monday, Queen's 1.4 Pugsley; Week 1 at 1500-1600, all other weeks at 0900-1000) | 2nd lecture (Week 1: Tuesday, Chem LT3, 1200-1300; all other weeks: Monday, Queen's 1.4 Pugsley, 1500-1600) | Tuesday lab (1500-1800, MVB 2.11) | Thursday drop-in (1700-1800, Queen's 1.06) |
| --- | --- | --- | --- | --- |
| 1 (w/c 22/09/25) | L01: Unit organisation, L02: Machine Learning concepts | L03: Unit topic overview, L04: Linear regression, linear discriminant and logistic regression | Lab01: Introduction to numpy and scikit-learn | Drop-in |
| 2 (w/c 29/09/25) | L05: Introduction to Neural Networks | L06: Training Neural Networks | Lab02: Neural networks | Drop-in |
| 3 (w/c 06/10/25) | L07: Regression and classification trees | L08: Kernels and Support Vector Machines | Lab03: Trees and SVMs | Drop-in |
| 4 (w/c 13/10/25) | L09: Probabilistic Graphical Models | L10: Markov Chain Monte Carlo | Lab04: Probabilistic Graphical Models | Drop-in |
| 5 (w/c 20/10/25) | L11: k-means and mixtures of Gaussians | L12: The EM algorithm | Lab05: Mixture models, K-means and Expectation Maximisation | Drop-in |
| 6 (w/c 27/10/25) | No lecture | No lecture | No lab | No drop-in |
| 7 (w/c 03/11/25) | L13: Sequential data (HMMs) | L14: Sequential data (LDS) | Lab06: Hidden Markov Models | Drop-in |
| 8 (w/c 10/11/25) | L15: Ensemble methods | Spare lecture | Lab07: Decision Trees and Ensemble Methods | Drop-in |
| 9 (w/c 17/11/25) | No lecture | No lecture | Coursework support session (1500-1700) | No drop-in |
| 10 (w/c 24/11/25) | No lecture | No lecture | Coursework support session (1500-1700) | No drop-in |
| 11 (w/c 01/12/25) | No lecture | No lecture | Coursework support session (1500-1700) | No drop-in |
| 12 (w/c 08/12/25) | No lecture | Revision | No lab | No drop-in |
Assessment Details
- If you are doing this unit as a MAJOR (assessment unit COMS30083) then you take a mid-term in-class test on Wednesday 29 Oct, 1300-1400, in MVB 1.07, MVB 1.08 and QB 1.80 (worth 30% of the unit) and also complete coursework (worth 70% of the unit). The coursework tasks are released at 1300 on Thursday 13 Nov 2025 and must be submitted by 1300 on Thursday 4 Dec 2025. Feedback and marks for the coursework will be released on 6 Jan 2026.
- If you are doing this unit as a MINOR (assessment unit COMS30081) then you are assessed by the ML questions in the "Topics in Computer Science" closed-book exam in December (worth 100% of the unit).
Lab Work
The labs are formative assessments which we strongly encourage you to complete; they are designed to help your understanding of ML methods.
Using the machines in MVB 2.11
If you are using UoB machines to do the lab exercises (as opposed to using your own machine) you need to do the following.
- Make sure the machine you are using is running Linux (reboot if necessary).
- Open up a Terminal window so you have access to the Linux command line.
- Enter the following at the command line:
module load anaconda/3-2025
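After loading the module you can check that the packages used in the labs are available. The script below is a minimal sanity check, assuming the Anaconda module provides numpy and scikit-learn (the exact versions on the lab machines may differ); save it as, for example, check_env.py and run it with python check_env.py.

# check_env.py -- confirm numpy and scikit-learn are importable in the lab environment
import numpy as np
import sklearn

print("numpy version:", np.__version__)
print("scikit-learn version:", sklearn.__version__)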
Using your own machine
If you want to do lab exercises on your own machine then you should install Anaconda on it. If you run into installation problems then feel free to ask the Teaching Staff on the unit for help, but we can't guarantee to solve them.
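To confirm that a local install is ready for the labs, a short script along the following lines should run without errors. This is only an illustrative sketch of the kind of scikit-learn usage the labs build on, not an excerpt from the lab material.

# Fit and score a simple classifier as an installation check.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a small built-in dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Train a logistic regression classifier and report held-out accuracy.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))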
Text books
- Bishop, C. M., Pattern Recognition and Machine Learning (2006). This is one of the best ML textbooks and will be our main textbook. The book is freely available online.
- Murphy, K., Probabilistic Machine Learning: An Introduction (2022) and Murphy, K., Probabilistic Machine Learning: Advanced Topics (2023). We will also use this book series for some topics. These are more recent textbooks and provide particularly good coverage of probabilistic methods. The books are freely available online.
GitHub
All technical resources (including the labs) will be posted on the COMS30035 GitHub organisation. If you find any issues, please raise an issue in the respective repository.