Research Seminar "Machine Learning Theory"

This is the research seminar by Ulrike's group.

When and where

Each Thursday, 14:00 - 15:00, seminar room on the 3rd floor, MvL6.


Format

Most sessions take the form of a reading group: everybody reads the assigned paper before the meeting, and we then discuss it together. Occasionally we also have talks by guests or by members of the group.


Who can attend

Everybody who is interested in machine learning theory: students, PhD students, and researchers of the University of Tübingen. We do not mind people dropping in and out depending on whether they find a given session interesting.

Upcoming meetings

  • 19.5.2022 No meeting (tentative; NeurIPS deadline)
  • 26.5.2022 No meeting (public holiday)
  • 2.6.2022 Talk by Damien Garreau (France)
    Title: How to scale hyperparameters for quickshift image segmentation
    Abstract: Quickshift is a popular algorithm for image segmentation, used as a preprocessing step in many applications. Unfortunately, it is quite challenging to understand the hyperparameters’ influence on the number and shape of superpixels produced by the method. In this paper, we study theoretically a slightly modified version of the quickshift algorithm, with a particular emphasis on homogeneous image patches with i.i.d. pixel noise and sharp boundaries between such patches. Leveraging this analysis, we derive a simple heuristic to scale quickshift hyperparameters with respect to the image size, which we check empirically.
    Link to the paper
  • 9.6.2022 No meeting (Whitsun break)
  • 16.6.2022 No meeting (Public holiday)
  • 23.6.2022 tba
  • 30.6.2022 tba
  • 7.7.2022 tba
  • 14.7.2022 tba
  • 21.7.2022 tba
  • 28.7.2022 tba
  • No meetings in August/September

Past meetings

Listed here.

Suggested papers for future meetings

Feel free to make suggestions!
If you do, please (i) select short conference papers rather than 40-page journal papers; (ii) add your name to the suggestion (this does not mean you have to present the paper, but it lets us know who suggested it); and (iii) provide a link, not just a title.
  • The Curse Revisited: When are Distances Informative for the Ground Truth in Noisy High-Dimensional Data? (Moritz) (pdf)
  • Loss as the Inconsistency of a Probabilistic Dependency Graph: Choose your Model, not your Loss Function (Moritz) (pdf)