Course by Prof. Robert Bamler at the University of Tübingen.

- **Mondays 16:15-17:45** and **Tuesdays 12:15-13:45** on Zoom.
- **First lecture:** Monday, 19 April; **after that, lectures will be on Tuesdays** (see the detailed tentative schedule below).
- **6 ECTS**, with the grade based on a group project (you may skip the group project if you don't need the ECTS).
- To encourage interactivity, neither the lectures nor the tutorials will be recorded. However, I'm providing supplementary video material that repeats important concepts discussed in the lectures in a dedicated YouTube playlist.

Details of the following schedule are still subject to change. Course materials (videos, lecture notes, problem sets, code) will be made available here once the course starts.

**0** (Lecture: 19 April 2021)
Find out what you will learn in this course: from the information-theoretic foundations of communication to concrete practical implementations of deep-learning-based compressors.

**1** (Lecture: 20 April, Tutorial: 26 April)
How can we compress a sequence of symbols, and what do we have to know about the data source?

**2** (Lecture: 27 April, Tutorial: 3 May)
What is the theoretical bound for lossless compression? We'll encounter a term called “entropy”.
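As a small taste of the entropy bound teased above, here is a minimal sketch (in Python; an illustration, not part of the course materials) showing that for a toy source with power-of-two symbol probabilities, the expected length of an optimal symbol code exactly meets the entropy:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the lower bound on the expected number of
    bits per symbol for any lossless code on an i.i.d. source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A toy 4-symbol source with dyadic (power-of-two) probabilities.
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)

# An optimal prefix code assigns each symbol a codeword of length -log2(p).
code_lengths = [1, 2, 3, 3]
expected_length = sum(p * l for p, l in zip(probs, code_lengths))

print(H)                # 1.75 bits/symbol
print(expected_length)  # 1.75 bits/symbol: the code meets the bound exactly
```

For non-dyadic probabilities the bound is no longer met exactly by any symbol code, which is one motivation for the stream codes covered later in the course.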

**3 · Generative Models** (Lecture: 4 May, Tutorial: 10 May)
How can we model more complex data sources with deep generative models?

**4** (Lecture: 11 May, Tutorial: 17 May)
How can we model the arrival time of a late bus? And why can we define only a “differential” entropy of this random variable?
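The “late bus” teaser can be made concrete with one standard modeling choice (picked here for illustration, not necessarily the one used in the lecture): treat the delay as an exponential random variable. Its differential entropy can even turn negative, one way to see why it is not an absolute information content:

```python
import math

def exp_differential_entropy(rate):
    """Differential entropy (in bits) of an Exponential(rate) distribution:
    h(X) = log2(e / rate). Unlike discrete entropy, it can be negative."""
    return math.log2(math.e / rate)

# An unreliable bus (mean delay 10 min, rate 0.1/min): h is about 4.76 bits.
print(exp_differential_entropy(0.1))

# A very punctual bus (mean delay 0.1 min, rate 10/min): h is about -1.88
# "bits", which makes no sense as an amount of information; differential
# entropy is only meaningful relative to a discretization or a reference.
print(exp_differential_entropy(10.0))
```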

**5 · Bayesian Inference** (Lecture: 18 May, Tutorial: 31 May)
How can we quantify parameter uncertainty in machine learning models?

**6** (Lecture: 1 June, Tutorial: 7 June)
We've already learned how to create an optimal symbol code, but can we do better than symbol codes? Yes, if we're willing to think in fractional bits.
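A quick sketch of why fractional bits matter, using a strongly biased binary source (an illustrative example, not course material): any symbol code must spend at least one whole bit per symbol, while a stream code such as arithmetic coding encodes the whole sequence jointly and can approach the entropy:

```python
import math

p = 0.99  # probability of the frequent symbol in a binary i.i.d. source

# The information content is far below one bit per symbol...
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(f"entropy: {H:.4f} bits/symbol")  # roughly 0.08 bits/symbol

# ...but a symbol code cannot assign a codeword shorter than 1 bit:
n = 1000
symbol_code_bits = n * 1.0

# A stream code pays about -log2(P(sequence)) bits in total, i.e., it can
# spend a *fraction* of a bit on each highly probable symbol:
stream_code_bits = n * H  # expected bits for a typical length-n sequence

print(f"symbol code: at least {symbol_code_bits:.0f} bits for {n} symbols")
print(f"stream code: about {stream_code_bits:.0f} bits for {n} symbols")
```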

**7** (Lecture: 8 June, Tutorial: 14 June)
Learn how to “short sell” bits. No, not on the stock exchange. This simple trick recently led to a fast new stream code for complex models.

**8 · Coding Theorem** (Lecture: 15 June, Tutorial: 21 June)
How fast can we reliably transmit data over an unreliable channel? This gem of information theory will also prepare us for lossy compression.

**9 · Rate-Distortion Theory** (Lecture: 22 June, Tutorial: 28 June)
How do we quantify lossy compression performance, what's its theoretical bound, and does the channel matter?
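For a first quantitative feel of rate-distortion theory, the rate-distortion function of a Gaussian source under mean-squared-error distortion has a simple closed form (a standard textbook result, sketched here purely for illustration):

```python
import math

def gaussian_rate(variance, distortion):
    """Rate-distortion function of an i.i.d. Gaussian source under
    mean-squared error: R(D) = max(0, 0.5 * log2(variance / D)).
    No lossy code can reach distortion D with fewer bits per sample."""
    if distortion >= variance:
        return 0.0  # distortion budget met by sending nothing at all
    return 0.5 * math.log2(variance / distortion)

# Halving the allowed MSE costs exactly half a bit per sample:
print(gaussian_rate(1.0, 0.25))   # 1.0 bit/sample
print(gaussian_rate(1.0, 0.125))  # 1.5 bits/sample
```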

**10 · Practical Lossy Compression** (Lecture: 29 June)
How do existing classical and machine-learning-based lossy codecs work?

**11 · Neural Compression** (Lecture: 6 July)
What are the latest trends and open research questions?

**12 · Guest Lectures** (13 & 19 July)
Two pioneers of neural compression.

**13** (Lecture: 20 July)
Presenting your group project next week and finalizing the written report should be a joy, not a pain! These tips might help you.

**14 · Workshop** (26 & 27 July)
The stage is yours! Let's celebrate your achievements in the group projects with a round of presentations and demos, and with plenty of time for your questions and your feedback.