Lunches

Our lunches are intended to be low-stress, high-fun gatherings.  Bring your salad or sandwich!

Mon 14-16

Every 1st Monday of the month: Joint Theoretical Computer Science Pizza Seminar (all of TCS)
All other Mondays: Algorithms & Theory Lunch

Wed, Thu, Fri 13-14

Lunch & Learn: 1 person explains something they learned recently; n-1 people eat & learn


Below are announcements and recaps of recent lunches and TCS Joint Seminars.

Fri May 12: Discrepancy upper bound

posted May 12, 2017, 6:31 AM by Dirk Oliver Theis

We worked through the proof that every hypergraph with n vertices and m >= n edges has a 2-coloring with discrepancy O( sqrt( n log(m/n) ) ). Highlights of the proof included the use of entropy and Kleitman's theorem.
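For tiny instances, bounds like this can be sanity-checked by brute force. Here is a minimal Python sketch (my own illustration; the hypergraph is a made-up example, and the entropy argument itself is non-constructive):

    # Brute-force the minimum discrepancy over all 2-colorings chi: V -> {-1,+1}.
    from itertools import product

    def discrepancy(edges, coloring):
        # max over edges of |sum of the +/-1 colors inside the edge|
        return max(abs(sum(coloring[v] for v in e)) for e in edges)

    def min_discrepancy(n, edges):
        # exhaustive search; feasible only for very small n
        return min(discrepancy(edges, chi) for chi in product((-1, 1), repeat=n))

    # Example: the complete 3-uniform hypergraph on 4 vertices.
    edges = [(0, 1, 2), (0, 1, 3), (0, 2, 3), (1, 2, 3)]
    print(min_discrepancy(4, edges))  # prints 1: a 3-element edge always has odd color sum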

Thu May 11: Sensitivity analysis of optimization problems

posted May 12, 2017, 6:28 AM by Dirk Oliver Theis

Abdullah explained how the Lagrange multipliers form a subgradient, or the gradient where it exists, of the optimal-value function of a parameterized continuous optimization problem.
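For concreteness, a one-dimensional worked example (my own, not from the talk): for

    \min_x \; x^2 \quad \text{subject to} \quad b - x \le 0, \qquad b > 0,

the optimum is x^*(b) = b with multiplier \lambda^*(b) = 2b, the optimal value is p^*(b) = b^2, and indeed

    \frac{d\,p^*}{db}(b) = 2b = \lambda^*(b),

so the multiplier measures the sensitivity of the optimal value to the constraint parameter.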

Wed May 10: Nash equilibria and sampling

posted May 12, 2017, 6:26 AM by Dirk Oliver Theis

Bahman continued his discussion of neural networks: how a generator-discriminator pair learns to sample from a given distribution, with training cast as a two-player game whose solutions are the Nash equilibria.
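The underlying game is the standard GAN objective of Goodfellow et al. (the talk may have used a variant):

    \min_G \max_D \;\; \mathbb{E}_{x \sim p_{\text{data}}}\bigl[\log D(x)\bigr]
    \;+\; \mathbb{E}_{z \sim p_z}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr],

and at a Nash equilibrium of this game the generator's samples are indistinguishable, to the discriminator, from the data distribution.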

Mon May 8: Joint TCS Pizza Seminar

posted May 12, 2017, 6:22 AM by Dirk Oliver Theis

Dominique presented a problem related to the quantum query complexity of inverting random functions.

Fri May 5: Problems related to edge coverings

posted May 12, 2017, 6:18 AM by Dirk Oliver Theis

Visiting PhD student Mahdi gave a survey of his research on edge coverings and vertex coverings: complexity, algorithms, and combinatorics. He also discussed some of the problems he hopes to attack while in Tartu.
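As a small taste of the combinatorics of coverings, here is a brute-force check of Gallai's identity |minimum edge cover| + |maximum matching| = |V| (valid for graphs without isolated vertices); the example graph is my own, not from the talk:

    # Verify Gallai's identity on a 5-cycle with one chord (tiny, so brute force is fine).
    from itertools import combinations

    V = range(5)
    E = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]

    def is_matching(S):
        return len({v for e in S for v in e}) == 2 * len(S)   # all endpoints distinct

    def is_edge_cover(S):
        return {v for e in S for v in e} == set(V)            # every vertex touched

    subsets = [c for r in range(len(E) + 1) for c in combinations(E, r)]
    max_matching = max(len(S) for S in subsets if is_matching(S))
    min_cover = min(len(S) for S in subsets if is_edge_cover(S))
    print(max_matching, min_cover, max_matching + min_cover == len(V))  # 2 3 True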

Thu May 4: Lagrange duality

posted May 12, 2017, 6:17 AM by Dirk Oliver Theis

Abdullah explained Lagrange duality for optimization problems: the Lagrange dual function, the Lagrange dual problem, the weak duality theorem, strong duality, and more.
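In symbols (standard textbook definitions, e.g. Boyd & Vandenberghe; the talk's notation may differ): for the problem of minimizing f_0(x) subject to f_i(x) \le 0 and h_j(x) = 0, with optimal value p^*,

    L(x, \lambda, \nu) = f_0(x) + \sum_i \lambda_i f_i(x) + \sum_j \nu_j h_j(x),
    \qquad
    g(\lambda, \nu) = \inf_x L(x, \lambda, \nu).

Weak duality says g(\lambda, \nu) \le p^* whenever \lambda \ge 0; strong duality is the case where the best dual bound equals p^* (e.g., under Slater's condition for convex problems).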

Wed May 3: Training neural networks

posted May 12, 2017, 6:15 AM by Dirk Oliver Theis

Bahman explained how to train feed-forward neural networks by gradient descent, using backpropagation.
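A minimal numpy sketch of the idea (a toy one-hidden-layer network on made-up data; the sizes, target, and learning rate are arbitrary choices, not from the talk):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(64, 2))          # 64 samples, 2 features
    y = X[:, :1] * X[:, 1:2]              # toy target: product of the two features

    W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
    W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
    lr = 0.1

    for step in range(500):
        # forward pass
        h = np.tanh(X @ W1 + b1)          # hidden activations
        pred = h @ W2 + b2                # network output
        loss = np.mean((pred - y) ** 2)

        # backward pass: chain rule, layer by layer (backpropagation)
        d_pred = 2 * (pred - y) / len(X)
        dW2 = h.T @ d_pred;  db2 = d_pred.sum(0)
        d_h = (d_pred @ W2.T) * (1 - h ** 2)   # tanh'(u) = 1 - tanh(u)^2
        dW1 = X.T @ d_h;     db1 = d_h.sum(0)

        # gradient-descent step
        W1 -= lr * dW1;  b1 -= lr * db1
        W2 -= lr * dW2;  b2 -= lr * db2

    print(f"final training loss: {loss:.4f}")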

June 15: Maximal m-free digraphs

posted Jun 15, 2015, 5:50 AM by Dirk Oliver Theis

Today, Janno Siim gave his seminar presentation on maximal m-free digraphs. He presented, for example, the proof of the fact that every maximal m-free digraph has an m-king.

June 8: Zero-nonzero patterns of low-degree polynomials

posted Jun 7, 2015, 4:01 AM by Dirk Oliver Theis

On Monday, DOT will present a theorem of Rónyai, Babai, and Ganapathy which gives an upper bound on the number of zero/nonzero patterns of a sequence of n polynomials in m variables of degree at most d. Perhaps the most interesting fact is that the bound does not depend on n, but only on m, d, and the number of non-zeros in the pattern.
We will then discuss an application, due to Leslie Hogben and coauthors, to random matrix theory: the minimum rank of a matrix with a prescribed, random, zero-nonzero pattern.
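For reference, the shape of the bound, stated from memory (the exact form in the paper may differ): if f_1, ..., f_n are polynomials in m variables of degree at most d, then the number of zero-nonzero patterns with at most k nonzero entries is at most

    \binom{kd + m}{m},

which depends only on m, d, and k. The idea behind the n-independence is that a pattern with at most k nonzeros is witnessed by the product of its nonzero polynomials, a single polynomial of degree at most kd.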

May 25: Probabilistic recurrences II

posted May 31, 2015, 9:48 PM by Dirk Oliver Theis

This week, Abdullah continued the proofs of Karp's tail bounds on probabilistic recurrence relations.
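For reference, the flavor of these bounds (Karp's first theorem, stated from memory, with the hypotheses abbreviated): if T(x) = a(x) + T(H(x)), where H(x) is random with 0 \le H(x) \le x and \mathbb{E}[H(x)] \le m(x), and u is the minimal solution of u(x) = a(x) + u(m(x)), then for every positive integer t

    \Pr\bigl[\, T(x) > u(x) + t \, a(x) \,\bigr] \;\le\; \left( \frac{m(x)}{x} \right)^{t}.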
