Information Theory 2014

News:

30 Nov 2014: Philip added some more topics to the list of topics for the final presentations.
26 Nov 2014: The final exam will consist of student presentations. Check the list of topics here.
27 Oct 2014: Good reasons to put away your mobile phone during class
See here for the Spring 2014 edition of this course.

Content of the course

Information theory was developed by Claude E. Shannon in the late 1940s to investigate the fundamental limits of signal-processing operations such as data compression, and of reliably storing and communicating data. These tasks have turned out to be fundamental for all of computer science.

In this course, we quickly review the basics of (discrete) probability theory and introduce concepts such as (conditional) Shannon entropy, mutual information, and Rényi entropy. Then, we prove Shannon's theorems about data compression and channel coding. Along the way, we also cover some aspects of information-theoretic security and show applications of information theory to the area of machine learning.
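As a taste of the definitions to come, the two central quantities mentioned above can be computed in a few lines of Python. This is a self-contained sketch we add for illustration, not part of the course materials; the function names are our own:

```python
import math
from collections import Counter

def shannon_entropy(probs):
    """H(X) = -sum_x p(x) log2 p(x), with the convention 0 * log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )
    for a joint distribution given as a dict {(x, y): probability}."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# A fair coin carries exactly 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))

# Independent bits share no information; identical fair bits share 1 full bit.
indep = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
copy  = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(indep))     # 0.0
print(mutual_information(copy))      # 1.0
```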

Intended Learning Outcomes

At the end of the course, you will be able to

Course website

Updated information about the course can be found at https://staff.science.uva.nl/c.schaffner/courses/inftheory/2014/

Study Material

The material will be presented in blackboard lectures. The following are good references:

Lectures and Exercise sessions (2 x 45min each)

Please check Datanose for the definitive times and locations.

Homework, exam, and grading

This is a 6 ECTS course, which comes to roughly 20 hours of work per week.

There will be homework exercises every week, to be handed in one week later. The answers should be in English. Feel free to use LaTeX; here is a template to get you started, but readable handwritten solutions are fine, too. Cooperation while solving the exercises is allowed and encouraged, but everyone has to hand in their own solution set in their own words.

The final exam will consist of student presentations about slightly more advanced topics connected to the course. The detailed procedure and list of topics can be found here.

The final grade for the course is composed of the average homework grade (with the worst grade dropped) and the grade obtained at the final exam, each weighted 1/2.

Course schedule Fall 2014

Date Content Homework
Mon, 27 Oct

Overview of the course, Probability Theory

Section 2.1 of [CF]

Slides #1

Blackboard Photo 1   Photo 2   Photo 3

Wed, 29 Oct

Shannon Entropy, Jensen's inequality, Properties of Shannon entropy

Sections 3.1-3.3 of [CF]

Slides #2

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6

Exercises #1
Mon, 3 Nov

Chain Rule, Mutual Information, Entropy Diagrams, Markov Chains

Sections 3.3, 3.4 of [CF], Sections 2.5, 2.8 of [CT]

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6

Wed, 5 Nov

Data-Processing Inequality, Sufficient Statistics, Perfectly Secure Encryption: One-time pad

Sections 2.8-2.9 of [CT], Section 4 of [CF]

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5

Exercises #2
Mon, 10 Nov

Perfectly Secure Encryption: Shannon's theorem. Fano's inequality. Data compression / Source coding

Section 4 of [CF], Sections 2.10 and 3.1 of [CT]

Insecurity of Key Reuse in OTP

Entropy of Alice in Wonderland  Hex Editor with statistics 

Blackboard Photo 1   Photo 2   Photo 3   Photo 4
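The key-reuse insecurity linked above can be demonstrated in a few lines of Python. This is our own illustrative sketch (the messages and helper name are made up): encrypting two messages with the same pad makes the key cancel out, so an eavesdropper learns the XOR of the two plaintexts.

```python
import os

def otp(message: bytes, key: bytes) -> bytes:
    """One-time pad: XOR each message byte with a key byte.
    Perfectly secure only if the key is uniform, as long as the message,
    and used exactly once."""
    assert len(key) == len(message)
    return bytes(m ^ k for m, k in zip(message, key))

m1 = b"attack at dawn"
m2 = b"retreat now!!!"
key = os.urandom(len(m1))   # fresh uniformly random key

c1 = otp(m1, key)
c2 = otp(m2, key)           # reusing the same key: insecure!

# The pad cancels: c1 XOR c2 = m1 XOR m2, leaking plaintext structure.
xor_ciphertexts = bytes(a ^ b for a, b in zip(c1, c2))
xor_messages = bytes(a ^ b for a, b in zip(m1, m2))
assert xor_ciphertexts == xor_messages
```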

Wed, 12 Nov

Data Compression: Asymptotic Equipartition Property, Typical set, Source-coding Theorem, high-probability set

Section 3.2+3.3 of [CT]

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5

Slides #6

Homework #3
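The asymptotic equipartition property behind the source-coding theorem can also be observed numerically. The following Python sketch (our own illustration, not course material) draws i.i.d. bits from a biased coin and shows that the normalized surprisal -(1/n) log2 p(x^n) approaches the entropy H(X) as n grows:

```python
import math
import random

random.seed(0)
p = 0.2   # P(X = 1) for a biased coin
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))   # about 0.722 bits

def empirical_rate(n):
    """Sample x^n i.i.d. and return -(1/n) log2 p(x^n)."""
    x = [1 if random.random() < p else 0 for _ in range(n)]
    log_p = sum(math.log2(p) if xi else math.log2(1 - p) for xi in x)
    return -log_p / n

# By the law of large numbers, the rate converges to H: this is the AEP.
for n in (10, 100, 10000):
    print(n, empirical_rate(n))
```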
Mon, 17 Nov

Data Compression: symbol codes, properties, source-coding theorem reloaded, Kraft's inequality

Section 5 of [CF], Chapter 5 of [CT], Chapter 5 of [MacKay]

Blackboard Photo 1   Photo 2   Photo 3   Photo 4

Slides #7
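Kraft's inequality is easy to check numerically. Here is a small Python sketch (an illustration we add here, not part of the course materials): a prefix-free D-ary code with the given codeword lengths exists if and only if the Kraft sum is at most 1.

```python
def kraft_sum(lengths, D=2):
    """Kraft sum: sum of D^(-l) over the codeword lengths l."""
    return sum(D ** (-l) for l in lengths)

# Lengths of a valid binary prefix code, e.g. {0, 10, 110, 111}:
print(kraft_sum([1, 2, 3, 3]))   # 1.0 -> achievable (and complete)

# Lengths no binary prefix code can achieve:
print(kraft_sum([1, 1, 2]))      # 1.25 -> impossible
```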

Wed, 19 Nov

Huffman codes and their optimality, Game of 20 Questions, Arithmetic Codes

Section 5.4+5.5 of [CF], Section 6.2 of [MacKay]

Online Game of 20 questions

Slides from Mathias Madsen's course on information theory

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6

Homework #4
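Huffman's algorithm itself fits in a few lines. The following Python sketch (our own illustration; the helper name huffman_code is made up) builds an optimal binary prefix code by repeatedly merging the two least-frequent subtrees with a min-heap:

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build an optimal binary prefix code for a dict {symbol: frequency}."""
    # Heap entries: (total weight, unique tiebreaker, {symbol: codeword so far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)   # two lightest subtrees...
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + code for s, code in c1.items()}
        merged.update({s: "1" + code for s, code in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))   # ...are merged
        tiebreak += 1
    return heap[0][2]

code = huffman_code(Counter("abracadabra"))   # a:5 b:2 r:2 c:1 d:1
# The most frequent symbol gets the shortest codeword.
print(code)
```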
Mon, 24 Nov

Advantages of Arithmetic Codes, Noisy-Channel Coding, Basic Definitions, Graph Theory

Pages 4+5 of this survey paper

Section 7.5 of [CT]

Dasher   Petersen graph

Blackboard Photo 1   Photo 2   Photo 3   Photo 4  

Wed, 26 Nov

Zero-error channel coding

Frucht graph

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6

Homework #5
Mon, 1 Dec

Noisy-channel coding: capacity, set-up and proof of converse

Sections 7.1, 7.4, 7.5, 7.9, 7.12 of [CT]

Blackboard Photo 1   Photo 2   Photo 3   Photo 4  
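For the concrete case of the binary symmetric channel, the capacity formula C = 1 - H2(p) can be evaluated directly. A small Python sketch (our illustration, not course material):

```python
import math

def binary_entropy(p):
    """H2(p) in bits, with the convention H2(0) = H2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0: noiseless channel
print(bsc_capacity(0.5))    # 0.0: output independent of input
print(bsc_capacity(0.11))   # about 0.5: half a bit per channel use
```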

Wed, 3 Dec

Shannon's theorem about noisy-channel coding

Sections 7.6, 7.7 of [CT], Chapter 10 of [MacKay]

Blackboard Photo 1   Photo 2   Photo 3   Photo 4   Photo 5   Photo 6

Slides #12

Homework #6
Mon, 8 Dec

Source-channel Separation, Error-correcting codes

Section 7.13 of [CT], Chapter 1 of [MacKay]

Blackboard Photo 1   Photo 2   Photo 3   Photo 4  

Slides #13 Slides #13b
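Chapter 1 of [MacKay] opens with the repetition code. The following Python sketch (our own simulation, with illustrative parameters) shows how majority decoding of the 3-fold repetition code lowers the bit-error rate of a binary symmetric channel, at the cost of a rate of only 1/3:

```python
import random

random.seed(1)
p = 0.1   # crossover probability of the simulated binary symmetric channel

def encode(bit):
    return [bit] * 3                     # repetition code R3

def channel(bits):
    return [b ^ (random.random() < p) for b in bits]   # flip each bit w.p. p

def decode(bits):
    return 1 if sum(bits) >= 2 else 0    # majority vote

# A decoding error needs >= 2 of 3 flips: 3 p^2 (1-p) + p^3 = 0.028 < p = 0.1.
trials = 100000
errors = sum(decode(channel(encode(0))) != 0 for _ in range(trials))
print(errors / trials)
```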

Wed, 10 Dec

Philip Schulz about Information Theory in Machine Translation

Homework #7
Wed, 17 Dec, 10:00-12:30
10:00 - 10:30  Almudena: Uniqueness of the uncertainty measure (Slides, PDF)
10:30 - 11:00  Fangzhou: Kolmogorov Complexity I (Slides, PDF)
11:15 - 11:45  Jesus: Kolmogorov Complexity II (Slides, PDF)
11:45 - 12:15  Giulio: Gambling (Slides, PDF)

Location: CWI, ground floor of the new wing, room L0.17

Thu, 18 Dec, 13:00-15:00
13:00 - 13:30  Arianna: Wald's Inequality (Slides, PDF)
13:30 - 14:00  Philip: Expectation-based syntactic comprehension (Slides, PDF)
14:15 - 14:45  Davide: Code Breaking (Slides, PDF)

Life after "Information Theory"

If you got hooked on the world of entropies, you have several options for pursuing the topics of information theory and cryptography after the course: