Introduction to Information Theory

Program: Bachelor Mathematics, Informatics, AI, Physics & Astronomy (WI, WInf, IN, KI, NS)
Lecturer: Michael Walter
Teaching assistant: Freek Witteveen
Schedule: Sept-Oct 2019

Course description

Reading your email on the metro may not strike you as particularly memorable. Isn’t it remarkable, though, that – despite relying on a noisy cellphone signal underground – none of your messages arrive corrupted? In this course you will learn the mathematics of how this is achieved.

This course will give an introduction to information theory – the mathematical theory of information. Ever since its inception in the late 1940s, information theory has had a profound impact on society. It underpins important technological developments, from reliable memories to mobile phone standards, and its versatile mathematical toolbox has found use in computer science, machine learning, physics, and even pure mathematics.

Starting from probability theory, we will discuss how to mathematically model information sources and communication channels, how to optimally compress and transmit information, and how to design error-correcting codes that allow us to reliably communicate over noisy communication channels. We will also see how techniques used in information theory can be applied more generally to make predictions from noisy data.
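As a concrete preview of the central quantity behind compression, here is a minimal sketch (not part of the official course material) of the Shannon entropy of a discrete source, written in Python since that is the language used for example solutions. The entropy measures, in bits per symbol, how compressible a source is.

```python
import math

def entropy(probs):
    """Shannon entropy (in bits) of a discrete probability distribution.

    Terms with probability 0 contribute nothing, by the convention
    0 * log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin carries much less information per toss,
# so long sequences of tosses can be compressed well below 1 bit/toss.
print(entropy([0.9, 0.1]))   # about 0.469
```

The source coding theorem (topic 3 below) makes this precise: the entropy is the optimal compression rate for an i.i.d. source.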


We will use rigorous definitions and proofs, hence some “mathematical maturity” will be assumed. Familiarity with basic probability theory will be expected (but we will remind you of the most important facts). In addition, some of the homework problems will require programming. You can use the programming language of your choice; examples and solutions will be given in Python.

Lecture Notes and Literature

Lecture notes will be provided on a weekly basis. The main reference is David J. C. MacKay’s textbook “Information Theory, Inference, and Learning Algorithms”, Cambridge (2003), which is available online.

Learning Objectives

At the end of the course, you will be able to:

  - mathematically model information sources and communication channels,
  - analyze how information can be optimally compressed and transmitted,
  - design error-correcting codes for reliable communication over noisy channels,
  - apply information-theoretic techniques to make predictions from noisy data.

Tentative Syllabus

  1. Introduction to Information Theory
  2. Probability Theory Refresher
  3. Entropy and the Source Coding Theorem
  4. Symbol Codes
  5. Stream Codes
  6. Dependent Random Variables and Joint Entropies
  7. Communication over a Noisy Channel
  8. The Noisy Channel Coding Theorem
  9. Error-Correcting Codes and Real Channels
  10. Interlude on Inference
  11. Message Passing and Decoding
  12. Further Topics in Coding Theory or Multi-User Information Theory (as time permits)
  13. Outlook and Quantum Information Teaser (not part of exam)
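To illustrate the flavor of topics 7–9, here is a minimal sketch of the repetition code R3, the standard warm-up example for communication over a noisy channel (cf. MacKay, Ch. 1). Each bit is sent three times over a binary symmetric channel that flips bits with probability f, and the receiver decodes by majority vote. The channel simulation and parameters below are illustrative, not taken from the course material.

```python
import random

def encode(bits, n=3):
    """Repetition code: send each bit n times."""
    return [b for b in bits for _ in range(n)]

def bsc(bits, f, rng):
    """Binary symmetric channel: flip each bit independently with prob. f."""
    return [b ^ (rng.random() < f) for b in bits]

def decode(received, n=3):
    """Majority-vote decoding of blocks of n repeated bits."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

rng = random.Random(0)           # fixed seed for reproducibility
msg = [rng.randint(0, 1) for _ in range(1000)]
noisy = bsc(encode(msg), f=0.1, rng=rng)
decoded = decode(noisy)
errors = sum(m != d for m, d in zip(msg, decoded))

# A block is decoded wrongly only if 2 or 3 of its bits flip,
# so the error rate drops from f = 0.1 to about 3f^2 = 0.03.
print(errors / len(msg))
```

The catch, which the noisy channel coding theorem resolves, is that this reliability costs a factor of 3 in rate; Shannon showed that far better trade-offs are achievable.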


Format

Lectures, exercise classes, and self-study.


Assessment

Regular homework and a final written exam. Check back for further details soon.