Introduction to Information Theory
Program: Bachelor Mathematics, Informatics, AI, Physics & Astronomy (WI, WInf, IN, KI, NS)
Lecturer: Michael Walter
Teaching assistant: Freek Witteveen
Schedule: Sept-Oct 2019
Reading your email on the metro may not strike you as particularly memorable. Isn’t it remarkable, though, that – despite relying on a noisy cellphone signal underground – none of your messages arrive corrupted? In this course you will learn the mathematics of how this is achieved.
This course will give an introduction to information theory – the mathematical theory of information. Ever since its inception in the late 1940s, information theory has had a profound impact on society. It underpins important technological developments, from reliable memories to mobile phone standards, and its versatile mathematical toolbox has found use in computer science, machine learning, physics, and even pure mathematics.
Starting from probability theory, we will discuss how to mathematically model information sources and communication channels, how to optimally compress and transmit information, and how to design error-correcting codes that allow us to reliably communicate over noisy communication channels. We will also see how techniques used in information theory can be applied more generally to make predictions from noisy data.
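To give a flavor of the error-correction theme, here is a minimal sketch (in Python, using illustrative function names that are not part of the course material) of the simplest error-correcting code – the repetition code – sent over a binary symmetric channel that flips each bit with some probability:

```python
import random

def bsc(bits, p, rng):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def encode_repetition(bits, n=3):
    """Repetition code: repeat each bit n times."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(received, n=3):
    """Decode by majority vote over each block of n received bits."""
    return [int(sum(received[i:i + n]) > n // 2) for i in range(0, len(received), n)]

rng = random.Random(0)
message = [1, 0, 1, 1, 0, 0, 1, 0]
sent = encode_repetition(message)
received = bsc(sent, p=0.1, rng=rng)   # noisy transmission
decoded = decode_repetition(received)  # usually recovers the message
```

Majority voting corrects any single flipped bit per block of three, so the decoding error probability per bit drops from p to roughly 3p² – reliability bought at the cost of a threefold lower rate. Much of the course is about doing far better than this trade-off.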
We will use rigorous definitions and proofs, hence some “mathematical maturity” will be assumed. Familiarity with basic probability theory will be expected (but we will remind you of the most important facts). In addition, some of the homework problems will require programming. You can use the programming language of your choice; examples and solutions will be given in Python.
Lecture Notes and Literature
Lecture notes will be offered on a weekly basis. The main reference is David J. C. MacKay’s textbook “Information Theory, Inference, and Learning Algorithms”, Cambridge University Press (2003), which is available online.
Learning Outcomes
At the end of the course, you will be able to:
- mathematically model information sources and information channels using probability theory
- compute basic properties such as entropy, conditional entropy, mutual information
- define typical sets and estimate their size and probability
- prove and apply Shannon’s source coding theorem
- prove and apply Shannon’s noisy channel coding theorem
- design and analyze concrete symbol codes and stream codes for realistic sources
- design and analyze concrete error-correcting codes for realistic channels
- apply message passing algorithms to decoding problems
- apply information theory techniques to solve simple inference problems
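The basic quantities listed above can be computed directly from their definitions. The following sketch (with illustrative helper names, not course code) computes entropy and mutual information in bits from a joint distribution:

```python
import math

def entropy(p):
    """Shannon entropy H in bits of a probability distribution p (list of probabilities)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y),
    where joint[x][y] is the joint distribution of X and Y."""
    px = [sum(row) for row in joint]             # marginal of X
    py = [sum(col) for col in zip(*joint)]       # marginal of Y
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

print(entropy([0.5, 0.5]))                            # fair coin → 1.0 bit
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # Y = X → 1.0 bit
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # independent → 0.0 bits
```

Note that mutual information is zero exactly when X and Y are independent, and equals H(X) when Y determines X – two facts the course will prove.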
Course Outline
- Introduction to Information Theory
- Probability Theory Refresher
- Entropy and the Source Coding Theorem
- Symbol Codes
- Stream Codes
- Dependent Random Variables and Joint Entropies
- Communication over a Noisy Channel
- The Noisy Channel Coding Theorem
- Error-Correcting Codes and Real Channels
- Interlude on Inference
- Message Passing and Decoding
- Further Topics in Coding Theory or Multi-User Information Theory (as time permits)
- Outlook and Quantum Information Teaser (not part of exam)
Teaching Format
Lectures, exercise classes, self-study.
Assessment
Regular homework and a final written exam. Check back for further details soon.