5SSD0 Course Homepage, 2025-2026 edition

Bayesian Machine Learning and Information Processing (at TU Eindhoven)

Course goals

This course provides an introduction to Bayesian machine learning and information processing systems. The Bayesian approach affords a unified and consistent treatment of many useful information processing systems.

Course summary

This course covers the fundamentals of a Bayesian (i.e., probabilistic) approach to machine learning and information processing systems. The Bayesian approach provides a unified, consistent framework for many model-based machine learning techniques.

Initially, we focus on Linear Gaussian systems and will discuss many useful models and applications, including common regression and classification methods, Gaussian mixture models, hidden Markov models, and Kalman filters. We will discuss essential algorithms for parameter estimation in these models, including the Variational Bayes method.
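
As a first taste of parameter estimation in a linear Gaussian model, here is a minimal Julia sketch (with hypothetical data, not taken from the course materials) of the conjugate Bayesian update for the mean of a Gaussian likelihood with known observation variance:

```julia
# Minimal sketch (hypothetical data): conjugate Bayesian update for the mean μ
# of a Gaussian likelihood with known observation variance σ².
x = [1.2, 0.8, 1.5, 1.1]        # observations
σ² = 0.5                        # known observation variance
m₀, v₀ = 0.0, 10.0              # Gaussian prior on μ: mean m₀, variance v₀

n = length(x)
v = 1 / (1/v₀ + n/σ²)           # posterior variance of μ
m = v * (m₀/v₀ + sum(x)/σ²)     # posterior mean of μ
# The posterior p(μ | x) is again Gaussian, with mean m and variance v.
```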

The Bayesian method also provides tools for comparing the performance of different information processing systems by means of estimating the Bayesian evidence for each model. We will discuss several methods for approximating Bayesian evidence.
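
As an illustration (a hypothetical sketch, not taken from the course materials): for coin-flip data, the evidence of a Beta-Bernoulli model is available in closed form and can be compared directly against a model with a fixed success probability:

```julia
# Sketch: comparing two models of coin-flip data via their log-evidence.
using SpecialFunctions: logbeta      # log of the Beta function B(·,·)

n1, n0 = 9, 1                        # hypothetical counts of heads and tails
a, b = 1.0, 1.0                      # Beta(a, b) prior for model M1

# M1: Bernoulli likelihood with a Beta(a, b) prior on the success probability
logZ1 = logbeta(a + n1, b + n0) - logbeta(a, b)

# M2: Bernoulli likelihood with the success probability fixed at 0.5
logZ2 = (n1 + n0) * log(0.5)

log_bayes_factor = logZ1 - logZ2     # > 0 means the evidence favours M1
```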

Next, we will discuss intelligent agents that learn purposeful behavior through interactions with their environment. These agents are used in applications such as self-driving cars and the interactive design of virtual and augmented realities.

In this course, we also relate synthetic Bayesian intelligent agents to natural intelligent agents such as the brain. You will be challenged to code Bayesian machine learning algorithms yourself and to apply them to practical information processing problems.

News and Announcements

  • (12-Nov-2025) Please sign up for Piazza (our Q&A platform) via the signup link. As much as possible, we will also post new announcements on the Piazza site.

Instructors

Materials

All course materials are available in the table below. If necessary, you can download the lecture notes in PDF format here:

We recommend that you read the lecture notes in your browser to take advantage of the interactive materials that we prepared for this course, based on Pluto.jl.
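
If you prefer to run the notebooks on your own machine instead, a minimal Pluto session looks like this (assuming a working Julia installation):

```julia
# One-time install of Pluto, then start it for each session.
using Pkg
Pkg.add("Pluto")

import Pluto
Pluto.run()   # opens the Pluto main menu in your browser;
              # from there you can open a downloaded lecture notebook (.jl file)
```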

Books

The following (freely downloadable) book is optional but very useful for additional reading:

Software

Please follow the software installation instructions. If you encounter any problems, please get in touch with us in class or on Piazza.

You can access all lecture materials online through the links below:

| Date | lesson | lecture notes | assignments | video recordings (2023/24) |
| --- | --- | --- | --- | --- |
| 12-Nov-2025 (Wed) | ⚪️ B0: Course Syllabus, ⚪️ B1: Machine Learning Overview | B0, B1 | | B0, B1 |
| 14-Nov-2025 (Fri) | ⚪️ B2: Probability Theory Review | B2 | | B2.1, B2.2 |
| 19-Nov-2025 (Wed) | ⚪️ B3: Bayesian Machine Learning | B3 | | B3.1, B3.2 |
| 21-Nov-2025 (Fri) | ⚪️ B4: Factor Graphs and the Sum-Product Algorithm | B4 | | B4.1, B4.2 |
| 26-Nov-2025 (Wed) | 🟢 Introduction to Julia | W0 | | |
| 28-Nov-2025 (Fri) | 🔴 Pick-up Julia programming assignment A0 | | A0 | |
| 28-Nov-2025 (Fri) | ⚪️ B5: Continuous Data and the Gaussian Distribution | B5 | | B5.1, B5.2 |
| 03-Dec-2025 (Wed) | ⚪️ B6: Discrete Data and the Multinomial Distribution | B6 | | B6 |
| 05-Dec-2025 (Fri) | 🟢 Probabilistic Programming 1 - Bayesian inference with conjugate models | W1 | | W1.1, W1.2 |
| 05-Dec-2025 | 🔴 Submission deadline assignment A0 | | submit | |
| 05-Dec-2025 | 🔴 Pick-up probabilistic programming assignment A1 | | A1 | |
| 10-Dec-2025 (Wed) | ⚪️ B7: Regression | B7 | | B7.1, B7.2 |
| 12-Dec-2025 (Fri) | ⚪️ B8: Generative Classification, ⚪️ B9: Discriminative Classification | B8, B9 | | B8, B9 |
| 17-Dec-2025 (Wed) | 🟢 Probabilistic Programming 2 - Bayesian regression & classification | W2 | | W2.1, W2.2 |
| 19-Dec-2025 (Fri) | ⚪️ B10: Latent Variable Models and Variational Bayes | B10 | | B10.1, B10.2 |
| 19-Dec-2025 | 🔴 Submission deadline assignment A1 | | submit | |
| | 🔵 break | | | |
| 07-Jan-2026 (Wed) | 🟢 Probabilistic Programming 3 - Variational Bayesian inference | W3 | | W3.1, W3.2 |
| 09-Jan-2026 (Fri) | ⚪️ B11: Dynamic Models | B11 | | B11 |
| 09-Jan-2026 | 🔴 Pick-up probabilistic programming assignment A2 | | A2 | |
| 14-Jan-2026 (Wed) | ⚪️ B12: Intelligent Agents and Active Inference | B12, slides | | B12.1, B12.2 |
| 16-Jan-2026 (Fri) | 🟢 Probabilistic Programming 4 - Bayesian filters & smoothers | W4 | | W4.1, W4.2 |
| 23-Jan-2026 (Fri) | 🔴 Submission deadline assignment A2 | | submit | |
| 29-Jan-2026 (Thu) | 🔵 written examination (13:30-16:30) | | | |
| - | 🔴 Pick-up resit programming assignment | | download | |
| - | 🔴 Submission deadline resit assignment | | submit | |
| - | 🔵 resit written examination (18:00-21:00) | | | |

Exams & Assignments

Exam Rules

  • You cannot bring a formula sheet to the exam, nor use a phone or calculator. This Formula Sheet will be provided in the preamble of the exam. You may also use the formula sheet when working on the exercises.

Exam Preparation

  • The written exam is a multiple-choice exam, just like the practice exams provided below. This year, there will be no probabilistic programming questions in the written exam.

  • In addition to the materials in the above table, we provide two representative practice written exams:

Programming Assignments

  • Programming assignments can be downloaded and submitted through the links in the above table.

Grading

  • The final grade is composed of the results of assignments A1 (10%), A2 (10%), and the final written exam (80%), and is rounded to the nearest integer (see the sketch below).
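
The grading formula amounts to a weighted sum. A minimal Julia sketch, with hypothetical scores, just to illustrate the weighting:

```julia
# Hypothetical scores on a 0-10 scale, illustrating the grading formula above.
a1, a2, exam = 8.0, 7.0, 6.5
final_grade = round(Int, 0.10 * a1 + 0.10 * a2 + 0.80 * exam)   # weighted sum, rounded to the nearest integer
```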

Behind the scenes

For instructors:

Important

The Pluto notebooks in this repository (.jl files) are automatically rendered on our website. You can view them online at https://bmlip.github.io/course/, and copy URLs from this index to use in the course schedule.

Status for live interactivity (PlutoSliderServer): Better Stack Badge

How to modify lecture materials

Take a look at https://github.com/bmlip/course/tree/main/developer%20instructions for more information aimed at the course lecturers and website admins.