This website has moved! We're now here: Theoretical Computer Science at the University of Tartu

Our lunches are intended to be low-stress, high-fun gatherings. Bring your salad or sandwich!

Mon 14-16

Every 1st Monday of the month: Joint Theoretical Computer Science Pizza Seminar (all of TCS)
All other Mondays: Algorithms & Theory Lunch

Wed, Thu, Fri 13-14

Lunch & Learn: 1 person explains something they learned recently; n-1 people eat & learn

Below are announcements for upcoming lunches and TCS Joint Seminars.

June 9: MISC lunch -- SND continued

posted Jun 10, 2017, 1:22 AM by Dirk Oliver Theis

Bahman continued his presentation of the Survivable Network Design approximation algorithm. We got stuck on some questions about vertex solutions to the LP... :-\

June 8: A&T Seminar presentation

posted Jun 10, 2017, 1:20 AM by Dirk Oliver Theis

A straggler talk in the Algorithms & Theory seminar: Janno talked about proximal and augmented Lagrangian algorithms for machine learning problems with loss and regularization terms in the objective function.

June 7: APX Lunch on Survivable Network Design

posted Jun 10, 2017, 1:17 AM by Dirk Oliver Theis

Bahman started presenting, in all the dirty details, the iterated LP-rounding approximation algorithm for survivable network design.
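
For context, the result behind the talk (Jain's iterated LP-rounding, stated here from memory, not from the talk notes) can be summarized as:

```latex
% LP relaxation of survivable network design, with cut requirements
% r(S) (weakly supermodular; for SND, r(S) is the largest connectivity
% requirement r_{ij} over pairs i,j separated by the cut S):
\[
  \min \sum_{e \in E} c_e x_e
  \quad \text{s.t.} \quad
  \sum_{e \in \delta(S)} x_e \ge r(S) \;\;\forall\, S \subseteq V,
  \qquad 0 \le x_e \le 1.
\]
% Jain's key lemma: every vertex (basic feasible) solution x of this
% LP has some edge e with x_e >= 1/2. Rounding all such edges up to 1
% and iterating on the residual problem gives a 2-approximation.
```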

June 6: Foundations of Machine Learning

posted Jun 10, 2017, 1:15 AM by Dirk Oliver Theis   [ updated Jun 10, 2017, 1:16 AM ]

After the exam stress subsided, we resumed our usual schedule: over the summer, Tuesdays will be dedicated to the theory behind Machine Learning.

Fri May 12: Discrepancy upper bound

posted May 12, 2017, 6:31 AM by Dirk Oliver Theis   [ updated Jun 10, 2017, 1:16 AM ]

We worked through the proof that every hypergraph with n vertices and m >= n edges has a 2-coloring with discrepancy O( sqrt( n log(m/n) ) ). Highlights of the proof included the use of entropy and Kleitman's theorem.
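
For reference, the bound discussed is Spencer's theorem in its general form (stated here with my own notation):

```latex
% For a hypergraph H = (V, E) with |V| = n and |E| = m >= n, there is
% a 2-coloring \chi : V \to \{-1, +1\} with
\[
  \operatorname{disc}(H)
  \;=\; \max_{e \in E} \Bigl|\, \sum_{v \in e} \chi(v) \,\Bigr|
  \;\le\; K \sqrt{\, n \log(2m/n) \,}
\]
% for an absolute constant K. For m = n this specializes to the
% famous "six standard deviations suffice" bound, disc(H) <= 6 sqrt(n).
```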

Thu May 11: Sensitivity analysis of optimization problems

posted May 12, 2017, 6:28 AM by Dirk Oliver Theis

Abdullah explained how the Lagrange multipliers form a subgradient (or, under differentiability, the gradient) of the optimal value of a parameterized continuous optimization problem.
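
A sketch of the statement in its standard convex-analysis form (notation mine, not from the talk):

```latex
% Perturbed problem: p(u) = \inf_x \{ f(x) : g_i(x) \le u_i,\ i = 1,\dots,m \},
% with f, g_i convex, so the value function p is convex in u.
% If strong duality holds at u = 0 with optimal multipliers \lambda^*, then
\[
  -\lambda^* \in \partial p(0),
\]
% i.e. -\lambda^* is a subgradient of the value function; when p is
% differentiable at 0 this becomes
\[
  \frac{\partial p}{\partial u_i}(0) = -\lambda_i^*.
\]
```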

Wed May 10: Nash equilibria and sampling

posted May 12, 2017, 6:26 AM by Dirk Oliver Theis

Bahman continued his discussion of neural networks: how a generator-discriminator pair samples from a given distribution.

Mon May 8: Joint TCS Pizza Seminar

posted May 12, 2017, 6:22 AM by Dirk Oliver Theis

Dominique presented a problem related to quantum query complexity of random function inversions.

Fri May 5: Problems related to edge coverings

posted May 12, 2017, 6:18 AM by Dirk Oliver Theis

Visiting PhD student Mahdi gave a survey of his research in edge coverings and vertex coverings: complexity, algorithms, combinatorics. He also discussed some of the problems he hopes to attack while in Tartu.

Thu May 4: Lagrange duality

posted May 12, 2017, 6:17 AM by Dirk Oliver Theis

Abdullah explained Lagrange duality for optimization problems: the Lagrange dual function, the dual problem, the weak duality theorem, strong duality, ...
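
The basic objects, written out for reference (standard definitions, notation mine):

```latex
% Primal: p^* = \inf_x f(x) subject to g_i(x) <= 0, i = 1,...,m.
% Lagrangian:
\[
  L(x, \lambda) = f(x) + \sum_{i=1}^{m} \lambda_i\, g_i(x),
  \qquad \lambda \ge 0.
\]
% Lagrange dual function and dual problem:
\[
  q(\lambda) = \inf_x L(x, \lambda),
  \qquad
  d^* = \sup_{\lambda \ge 0} q(\lambda).
\]
% Weak duality: d^* <= p^* always. Strong duality (d^* = p^*) holds,
% e.g., for convex problems satisfying Slater's condition.
```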
