Information Theory 2017
Master of Logic
Teaching assistants: Yfke Dulek
Alvaro Piedrafita
News:
6 Feb 2017: first update of page
Sign up!
- Create an account on this Moodle site, or log in if you already have an account.
- Enrol yourself in the Information Theory course.
- Leave a welcome message in the course forum.
- Read the details about the organisation of the course.
- Attend the first lecture on Tuesday, 30 October 2017, 9:00 at Science Park C0.05, and the subsequent exercise session. Please bring along your laptop or smartphone with access to the Moodle site.
Content of the course
Information theory was developed by Claude E. Shannon in the 1940s to investigate the fundamental limits on signal-processing operations such as compressing data and reliably storing and communicating data. These tasks have turned out to be fundamental for all of computer science.
In this course, we quickly review the basics of probability theory and introduce concepts such as (conditional) Shannon entropy, mutual information, and entropy diagrams. Then, we prove Shannon's theorems about data compression and channel coding. An interesting connection with graph theory is made in the setting of zero-error information theory. We also cover some aspects of information-theoretic security, such as perfectly secure encryption.
Requirements
Students are required to know the (theory) contents of the course Basic Probability: Theory in the Master of Logic (no programming will be required for this course). Study the script and the theory homework exercises.
Intended Learning Outcomes
At the end of the course, you will be able to solve problems of the following kinds (each group of outcomes below is followed by a small illustrative sketch):
(Probability)
- compute the probability of an event using the most common discrete probability distributions (Bernoulli, binomial, and geometric);
- compute inverse probabilities using Bayes' rule;
- compute the means and variances of commonly used probability distributions;
- compute the means and variances of sums or products of random variables with known distributions;
- bound the probability of an extreme event using inequalities such as the Markov bound, Chebyshev's inequality, or Hoeffding's inequality.
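To make the last point concrete, here is a small sketch (illustrative only, not part of the course materials; all numbers are hypothetical) comparing the exact tail probability of a binomial random variable with the Markov and Chebyshev bounds:

```python
# Compare the exact probability of an extreme event with the generic
# Markov and Chebyshev bounds, for X ~ Binomial(n=100, p=0.5).
from math import comb

n, p = 100, 0.5
mean = n * p              # E[X] = np = 50
var = n * p * (1 - p)     # Var[X] = np(1-p) = 25

t = 75  # the extreme event: X >= 75

# Exact tail probability: sum the binomial point masses from t to n.
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t, n + 1))

markov = mean / t                    # P[X >= t] <= E[X] / t
chebyshev = var / (t - mean) ** 2    # P[|X - E[X]| >= a] <= Var[X] / a^2

print(f"exact:     {exact:.2e}")     # tiny, on the order of 1e-07
print(f"Markov:    {markov:.3f}")    # 0.667 -- valid but very loose
print(f"Chebyshev: {chebyshev:.3f}") # 0.040 -- tighter, since it uses the variance
```

Both bounds hold for any distribution with the given mean (and variance), which is exactly why they are so much looser than the exact tail.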
(Entropy and related concepts)
- compute the entropy of a random variable;
- compute the mutual information between two random variables;
- use entropy diagrams to reason about the relative size of the entropies, conditional entropies, and mutual information of two or three random variables;
- use Jensen's inequality to bound the mean of a random variable defined in terms of a convex or concave function of another random variable.
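As an illustration of these entropy computations, the following sketch (with a hypothetical joint distribution; not course code) computes H(X,Y) and I(X;Y) directly from the definitions:

```python
# Shannon entropy and mutual information from a small joint distribution.
from math import log2

# Hypothetical joint distribution P(X, Y) over {0,1} x {0,1}.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def entropy(dist):
    """H = -sum_x p(x) log2 p(x), skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Marginal distributions P(X) and P(Y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0) + p
    p_y[y] = p_y.get(y, 0) + p

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy)
print(f"H(X)   = {entropy(p_x):.3f} bits")   # 1.000
print(f"H(X,Y) = {entropy(p_xy):.3f} bits")  # about 1.722
print(f"I(X;Y) = {mi:.3f} bits")             # about 0.278
```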
(Data compression)
- construct a d-ary Huffman code for a random variable;
- use Kraft's inequality to check whether a prefix-free code can be constructed to fit certain codeword lengths;
- bound the possible rate of lossless compression of output from a given source using Shannon's source coding theorem;
- define a typical set and reason about its size, probability, and elements;
- compute the Shannon-Fano-Elias codeword for a sample from a stochastic process;
- compute the entropy rate of a Markov process.
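For example, a binary (d = 2) Huffman code can be built greedily by repeatedly merging the two least likely symbols; the sketch below (illustrative only, with a hypothetical dyadic distribution) does this and then checks Kraft's inequality for the resulting codeword lengths:

```python
# Build a binary Huffman code and verify Kraft's inequality.
import heapq

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # hypothetical source

# Each heap entry is (probability, tie-breaker, partial codebook).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
counter = len(heap)
while len(heap) > 1:
    p0, _, c0 = heapq.heappop(heap)   # least likely subtree
    p1, _, c1 = heapq.heappop(heap)   # second least likely subtree
    # Merging two subtrees prepends one bit to every codeword in each.
    merged = {s: "0" + w for s, w in c0.items()}
    merged.update({s: "1" + w for s, w in c1.items()})
    heapq.heappush(heap, (p0 + p1, counter, merged))
    counter += 1
code = heap[0][2]
print(code)  # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}

# Kraft's inequality: a prefix-free code with lengths l_1, ..., l_m
# exists iff sum_i 2^(-l_i) <= 1.
kraft = sum(2 ** -len(w) for w in code.values())
print(f"Kraft sum = {kraft}")  # 1.0: the code tree is complete

# Expected codeword length; here it equals H(X) = 1.75 bits because
# the distribution is dyadic (all probabilities are powers of 1/2).
avg_len = sum(probs[s] * len(w) for s, w in code.items())
print(f"expected length = {avg_len} bits/symbol")
```

The exact codewords depend on how ties are broken, but the codeword lengths, and hence the expected length, are optimal either way.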
(Noisy-channel coding)
- construct a probability model of a communication channel given a verbal description;
- compute the channel capacity of a channel;
- use Shannon's channel coding theorem to bound the achievable rate of reliable communication over a channel;
- use Bayes' rule to decode corrupted messages sent using an error-correcting code;
- evaluate the rate and reliability of such codes;
- define the jointly typical sets of a source and channel, and use such sets to decode outputs from the channel;
- draw the confusability graph of a given channel, and describe the channel depicted by a given confusability graph;
- compute the independence number and the zero-error capacity of a confusability graph.
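As a concrete instance of a capacity computation, the binary symmetric channel with crossover probability f has the closed-form capacity C = 1 - H2(f), where H2 is the binary entropy function. A minimal sketch:

```python
# Capacity of the binary symmetric channel: C = 1 - H2(f).
from math import log2

def h2(f):
    """Binary entropy: H2(f) = -f log2(f) - (1-f) log2(1-f)."""
    if f in (0.0, 1.0):
        return 0.0
    return -f * log2(f) - (1 - f) * log2(1 - f)

def bsc_capacity(f):
    return 1 - h2(f)

print(bsc_capacity(0.0))   # 1.0: a noiseless channel carries one bit per use
print(bsc_capacity(0.11))  # about 0.5: half a bit per channel use
print(bsc_capacity(0.5))   # 0.0: the output is independent of the input
```

By Shannon's channel coding theorem, any rate below this capacity is achievable with arbitrarily small error probability, and no higher rate is.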
Course website
Updated information about the course can be found at https://staff.science.uva.nl/c.schaffner/courses/inftheory/2016/
Study Material
The material will be presented in blackboard lectures. The following are good references:
- Lecture notes by Yfke Dulek and Christian Schaffner
- [CF] Ronald Cramer, Serge Fehr: The Mathematical Theory of Information, and Applications, lecture notes, Version 2.0
- [CT] Thomas M. Cover, Joy A. Thomas. Elements of information theory, 2nd Edition. New York: Wiley-Interscience, 2006. ISBN 0-471-24195-4.
- [MacKay] David J. C. MacKay. Information Theory, Inference, and Learning Algorithms. Cambridge: Cambridge University Press, 2003. ISBN 0-521-64298-1.
Lectures and Exercise sessions
Please check Datanose for the definitive times and locations.
Homework, exam, and grading
This is a 6 ECTS course, which comes to roughly 20 hours of work per week.
There will be homework exercises every week, to be handed in one week later. The answers should be in English. Feel free to use LaTeX (here is a template to get you started), but readable handwritten solutions are fine, too. Cooperation while solving the exercises is allowed and encouraged, but everyone has to hand in their own solution set in their own words.
There will be a final written exam. The exam is open-book, meaning that you can bring the study material [CF], [CT], [MacKay] mentioned above, as well as any notes you have made, but no electronic devices are allowed. Here are the written exams from Spring 2014 and Fall 2015.
The final grade for the course is composed as follows: 10% from the average grade on the online multiple-choice concept questions (asked after every lecture), 40% from the average homework grade (ignoring the worst grade), and 50% from the grade obtained on the final exam.
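For example, with hypothetical averages of 8.0 on the concept questions, 7.5 on the homework, and 7.0 on the exam, the final grade would be 0.1 · 8.0 + 0.4 · 7.5 + 0.5 · 7.0 = 0.8 + 3.0 + 3.5 = 7.3.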
Life after "Information Theory"
If you got hooked on the world of entropies, you have several options after the course to pursue the topics of information theory and cryptography:
- Talk to Christian about the possibilities of doing a semester project or master project in information theory or cryptography. He can also put you in touch with other people at the ILLC, at CWI, or elsewhere in the world working on different aspects of information theory.
- Follow Ronald de Wolf's course about quantum computing in the national MasterMath curriculum, taught at the University of Amsterdam, starting Spring 2017.
- Follow Leen Torenvliet's course about Kolmogorov complexity at the University of Amsterdam, starting Spring 2017.
- Follow various online classes, such as the edX course on quantum cryptography by Stephanie Wehner and Thomas Vidick, Dan Boneh's Crypto and Crypto II courses, Jon Katz's cryptography class, Umesh Vazirani's course about quantum computing, or a more advanced MIT course on quantum information theory by Isaac Chuang.