Data Compression With Deep Probabilistic Models

Course by Prof. Robert Bamler at the University of Tuebingen.

Tentative Schedule & Course Materials

Details of the following schedule are still subject to change. Course materials (videos, lecture notes, problem sets, code) will be made available here once the course starts.

Session 0 | Lecture: 19 April 2021
Preview

Find out what you will learn in this course: from the information-theoretic foundations of communication to concrete practical implementations of deep-learning-based compressors.

Session 1 | Lecture: 20 April | Tutorial: 26 April
Lossless Compression I: Symbol Codes

How can we compress a sequence of symbols and what do we have to know about the data source?
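
If you'd like a small preview of what a symbol code looks like, here is a minimal sketch in Python; the prefix-free codebook below is made up for illustration and is not taken from the lecture.

```python
# Minimal sketch of a symbol code with a made-up, prefix-free codebook
# (no codeword is a prefix of another), so decoding is unambiguous.
codebook = {"a": "0", "b": "10", "c": "110", "d": "111"}

def encode(message):
    """Concatenate the codewords of all symbols in the message."""
    return "".join(codebook[symbol] for symbol in message)

def decode(bits):
    """Greedy left-to-right decoding works because the code is prefix-free."""
    inverse = {code: symbol for symbol, code in codebook.items()}
    decoded, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in inverse:
            decoded.append(inverse[buffer])
            buffer = ""
    return "".join(decoded)

message = "abacd"
assert decode(encode(message)) == message
print(encode(message))  # "0100110111" -- frequent symbols get the short codewords
```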

Session 2 | Lecture: 27 April | Tutorial: 3 May
Theoretical Bounds for Lossless Compression

What is the theoretical bound for lossless compression? We'll encounter a term called “entropy”.
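
As a small preview (an illustrative sketch with made-up probabilities, not part of the course materials): the entropy of a source with known symbol probabilities takes only a few lines to compute, and it lower-bounds the expected number of bits per symbol of any lossless code.

```python
import math

# Made-up symbol probabilities of a memoryless source, for illustration only.
probabilities = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Shannon entropy H = -sum_x p(x) * log2 p(x), measured in bits per symbol.
entropy = -sum(p * math.log2(p) for p in probabilities.values())
print(entropy)  # 1.75 bits per symbol for these probabilities
```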

Session 3 | Lecture: 4 May | Tutorial: 10 May
A Primer on Deep Generative Models

How can we model more complex data sources with deep generative models?
In tutorial: form a team for your group project & discuss suggested topics.

Session 4 | Lecture: 11 May | Tutorial: 17 May
Probability Theory II: Continuous Random Variables

How can we model the arrival time of a late bus? And why can we define only a “differential” entropy of this random variable?
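
To make the teaser concrete with one hypothetical model (not necessarily the one used in the lecture): if the delay of the bus were exponentially distributed, its differential entropy has a simple closed form.

```python
import math
from scipy.stats import expon

# Hypothetical model: the bus delay, in minutes, is Exponential with mean 5.
mean_delay = 5.0

# Differential entropy of an Exponential(scale = mean) variable: h = 1 + ln(mean), in nats.
# Unlike discrete entropy it can be negative, and it changes if we merely rescale the
# units (minutes vs. seconds) -- which is why it is only a "differential" entropy.
h_closed_form = 1.0 + math.log(mean_delay)
h_scipy = expon(scale=mean_delay).entropy()
print(h_closed_form, h_scipy)  # both are about 2.609 nats
```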

Session 5 | Lecture: 18 May | Tutorial: 31 May
A Primer on Scalable Bayesian Inference

How can we quantify parameter uncertainty in machine learning models?
Deadline for finalizing team members and topic of your group project.
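
For the simplest possible illustration of parameter uncertainty (a toy conjugate model with made-up data, far simpler than the scalable methods covered in the lecture): the posterior over the bias of a coin after a few observed flips.

```python
from scipy.stats import beta

# Made-up data: 7 heads and 3 tails observed for a coin with unknown bias.
heads, tails = 7, 3

# With a uniform Beta(1, 1) prior, the posterior over the bias is Beta(1 + heads, 1 + tails).
posterior = beta(1 + heads, 1 + tails)
print(posterior.mean(), posterior.std())  # about 0.667 +/- 0.131
print(posterior.interval(0.9))            # 90% credible interval for the bias
```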

Session 6 | Lecture: 1 June | Tutorial: 7 June
Lossless Compression II: Stream Codes

We've already learned how to create an optimal symbol code, but can we do better than symbol codes? Yes, if we're willing to think in fractional bits.
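
Here is a quick back-of-the-envelope calculation with a made-up source to show what "fractional bits" means: a very likely symbol carries much less than one bit of information, which no symbol code can exploit but a stream code can.

```python
import math

# Made-up source: symbol "x" occurs with probability 0.95, anything else with 0.05.
p = 0.95

# Information content of one occurrence of "x": far less than one bit.
print(-math.log2(p))  # about 0.074 bits

# A symbol code spends at least 1 whole bit per symbol, i.e., >= 1000 bits for
# 1000 symbols. A stream code can amortize fractional bits across the sequence
# and approach the entropy, which is only about 286 bits here.
entropy = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(1000 * entropy)  # about 286 bits
```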

Session 7 | Lecture: 8 June | Tutorial: 14 June
Lossless Compression III: Modern Stream Codes

Learn how to “short sell” bits. No, not on the stock exchange. This simple trick recently led to a fast new stream code for complex models.
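
As a taste of a modern stream code, here is a toy coder in the spirit of asymmetric numeral systems (ANS). It is a simplified sketch with a made-up two-symbol alphabet, keeps its whole state in one big Python integer rather than streaming out fixed-size words, and is not the implementation used in the course.

```python
# Toy ANS-style coder; frequencies are made up and the state is a single big integer.
freqs = {"a": 3, "b": 1}    # symbol frequencies, summing to M
cumul = {"a": 0, "b": 3}    # cumulative frequencies
M = sum(freqs.values())

def encode(symbols):
    state = 1
    for s in reversed(symbols):        # encode in reverse so decoding runs forward
        f, c = freqs[s], cumul[s]
        state = (state // f) * M + c + (state % f)
    return state

def decode(state, length):
    decoded = []
    for _ in range(length):
        r = state % M                  # identifies the symbol via its frequency slot
        s = next(sym for sym in freqs if cumul[sym] <= r < cumul[sym] + freqs[sym])
        f, c = freqs[s], cumul[s]
        state = f * (state // M) + r - c
        decoded.append(s)
    return "".join(decoded)

message = "aababaaa"
assert decode(encode(message), len(message)) == message
```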

Session 8 | Lecture: 15 June | Tutorial: 21 June
The (Noisy-)Channel Coding Theorem

How fast can we reliably transmit data over an unreliable channel? This gem of information theory will also prepare us for lossy compression.
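
As a preview with made-up numbers (an illustrative sketch, not taken from the lecture notes): the capacity of the binary symmetric channel, the textbook example of an unreliable channel, is easy to compute.

```python
import math

def binary_entropy(p):
    """Entropy of a Bernoulli(p) variable, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Made-up channel: each transmitted bit is flipped with probability 0.1.
flip_probability = 0.1

# Capacity C = 1 - H_b(flip_probability) bits per channel use: below this rate,
# reliable transmission is possible with good codes; above it, it is not.
print(1.0 - binary_entropy(flip_probability))  # about 0.531
```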

Session 9 | Lecture: 22 June | Tutorial: 28 June
Lossy Compression I: Rate-Distortion Theory

How do we quantify lossy compression performance, what's its theoretical bound, and does the channel matter?
Deadline for status report on project.
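
One classic worked example (a sketch with made-up numbers): the rate-distortion function of a Gaussian source under squared-error distortion has a closed form.

```python
import math

# Made-up Gaussian source with variance 1.0; D is the allowed mean squared error.
variance = 1.0

def rate_distortion(distortion):
    """R(D) = 0.5 * log2(variance / D) bits per sample for D < variance, else 0."""
    if distortion >= variance:
        return 0.0
    return 0.5 * math.log2(variance / distortion)

for d in (0.5, 0.25, 0.1):
    print(d, rate_distortion(d))  # 0.5, 1.0, and about 1.66 bits per sample
# Each halving of the allowed distortion costs an extra 0.5 bits per sample.
```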

Session 10 | Lecture: 29 June
Lossy Compression II: Practical Lossy Compression

How do existing classical and machine-learning-based lossy codecs work?
Instead of a tutorial: individual appointments to discuss your group projects.
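
To hint at the common recipe behind classical lossy codecs (transform, quantize, entropy-code), here is a minimal sketch with a made-up signal and step size; the entropy-coding stage is omitted, and this is of course not a full codec.

```python
import numpy as np
from scipy.fft import dct, idct

# Made-up smooth signal and quantization step size, for illustration only.
signal = np.array([3.0, 3.1, 3.0, 2.9, 3.0, 3.2, 3.1, 3.0])
step = 0.5

coefficients = dct(signal, norm="ortho")        # decorrelating transform
quantized = np.round(coefficients / step)       # the lossy step: quantization
reconstruction = idct(quantized * step, norm="ortho")

print(quantized)                                # mostly zeros, so cheap to entropy-code
print(np.abs(signal - reconstruction).max())    # small reconstruction error
```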

Session 11 | Lecture: 6 July
Recent Advances in Neural Compression

What are the latest trends and open research questions?
Instead of a tutorial: continuing with the group appointments.

Session 12 | Guest Lectures: 13 & 19 July
Guest Stars

Two pioneers of neural compression:
Tue, 13 July: Dr. Christopher Schroers from DisneyResearch|Studios in Zurich
Mon, 19 July: Prof. Dr. Stephan Mandt from the University of California at Irvine

Session 13 | Lecture: 20 July
Tips for Giving Presentations and for Scientific Writing

Presenting your group project next week and finalizing the written report should be a joy, not a pain! These tips might help you.

Session 14 | Workshop: 26 & 27 July
Group Project Presentations

The stage is yours! Let's celebrate your achievements in the group projects with a round of presentations and demos, and with plenty of time for your questions and your feedback.