Simons Collaboration on Algorithms & Geometry 2019 Annual Meeting

Talks

Nisheeth Vishnoi

Physics-inspired Algorithms: Hamiltonian Monte Carlo for Sampling 
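For readers unfamiliar with the method named in the title, here is a minimal Hamiltonian Monte Carlo sketch in Python. It is a generic textbook version, with a leapfrog integrator and a standard-Gaussian target chosen purely for illustration, and is not the specific algorithms or guarantees discussed in the talk:

```python
import numpy as np

def leapfrog(q, p, grad_U, step, n_steps):
    """Simulate Hamiltonian dynamics with the leapfrog integrator."""
    p = p - 0.5 * step * grad_U(q)          # initial half step for momentum
    for _ in range(n_steps - 1):
        q = q + step * p                     # full step for position
        p = p - step * grad_U(q)             # full step for momentum
    q = q + step * p
    p = p - 0.5 * step * grad_U(q)           # final half step for momentum
    return q, p

def hmc(U, grad_U, q0, n_samples, step=0.1, n_steps=20, seed=0):
    """Sample from exp(-U(q)) using Hamiltonian Monte Carlo."""
    rng = np.random.default_rng(seed)
    q = np.array(q0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(q.shape)                 # fresh Gaussian momentum
        q_new, p_new = leapfrog(q, p, grad_U, step, n_steps)
        # Metropolis correction for the discretization error of leapfrog.
        h_old = U(q) + 0.5 * p @ p
        h_new = U(q_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < h_old - h_new:
            q = q_new
        samples.append(q.copy())
    return np.array(samples)

# Example target: standard Gaussian in 2 dimensions, U(q) = ||q||^2 / 2.
U = lambda q: 0.5 * q @ q
grad_U = lambda q: q
draws = hmc(U, grad_U, q0=np.zeros(2), n_samples=2000)
print(draws.mean(axis=0), draws.var(axis=0))   # roughly [0, 0] and [1, 1]
```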

 

Dor Minzer
based on joint works with Peter Keevash, Noam Lifshitz and Eoin Long

New forms of hypercontractivity

The classical hypercontractive inequality for the Boolean hypercube lies at the core of many results in analysis of Boolean functions. Though extensions of the inequality to different domains (e.g. the biased hypercube) are known, they are often quantitatively weak, making them hard to apply.

In this survey talk, we will discuss new forms of the hypercontractive inequality. These new forms are quantitatively stronger for the class of “global functions”, which are functions in which no small set of variables can change the value of the function with significant probability. We will also discuss some applications to the study of sharp thresholds, noise sensitivity on the biased hypercube, and extremal combinatorics.
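For context, the classical hypercontractive (Bonami–Beckner) inequality referred to above can be stated as follows; this is the standard formulation, not material specific to the talk:

```latex
% Noise operator: (T_\rho f)(x) = \mathbb{E}[f(y)], where each coordinate y_i
% equals x_i with probability (1+\rho)/2 and -x_i otherwise, independently.
\|T_\rho f\|_q \;\le\; \|f\|_p
\qquad \text{for all } f:\{-1,1\}^n \to \mathbb{R},\quad
1 \le p \le q,\quad 0 \le \rho \le \sqrt{\tfrac{p-1}{q-1}}.
```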

 

Nike Sun
joint work with Jian Ding

Capacity lower bound for the Ising perceptron
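For context, a standard formulation of the model named in the title (the usual setup from the literature, not quoted from the talk): given i.i.d. standard Gaussian vectors $g_1, \dots, g_M \in \mathbb{R}^n$ and a margin $\kappa$, the Ising perceptron asks for a sign vector satisfying every linear constraint,

```latex
S_n(\kappa) \;=\; \Bigl\{\, \sigma \in \{-1,+1\}^n \;:\;
\tfrac{1}{\sqrt{n}} \langle g_a, \sigma \rangle \ge \kappa
\ \text{ for all } 1 \le a \le M \,\Bigr\},
```

and the capacity is the largest ratio $\alpha = M/n$ at which $S_n(\kappa)$ remains nonempty with high probability as $n \to \infty$; the talk concerns a lower bound on this quantity.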

 

Ilya Razenshteyn
joint work with S. Mahabadi, K. Makarychev, Y. Makarychev

Extension theorems, dimension reduction, and clustering

Prof. Razenshteyn will talk about two recent results concerning extension theorems, dimension reduction, and clustering.

First, we show several new bi-Lipschitz extension theorems and use them to obtain two dimension reduction statements that strengthen the classic Johnson–Lindenstrauss theorem; these, in turn, imply better coresets for clustering. Second, we show almost tight dimension reduction for clustering, introducing and using along the way a robust version of the classic Kirszbraun Lipschitz extension theorem.
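As a baseline for the dimension reduction statements above, here is a minimal Python sketch of the classical Johnson–Lindenstrauss random projection; it illustrates only the classical guarantee that the talk's results strengthen, not the authors' new constructions, and the data and target dimension below are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: n points in a high-dimensional space.
n, d, eps = 200, 5_000, 0.25
X = rng.standard_normal((n, d))

# Classical JL: project onto k = O(log n / eps^2) dimensions with a scaled
# Gaussian matrix; pairwise distances are preserved up to 1 +/- eps w.h.p.
k = int(np.ceil(8 * np.log(n) / eps**2))
G = rng.standard_normal((d, k)) / np.sqrt(k)
Y = X @ G

def pairwise_sq_dists(Z):
    sq = (Z * Z).sum(axis=1)
    return sq[:, None] + sq[None, :] - 2 * Z @ Z.T

orig, proj = pairwise_sq_dists(X), pairwise_sq_dists(Y)
mask = ~np.eye(n, dtype=bool)
ratios = proj[mask] / orig[mask]
print(f"k = {k}, observed distortion of squared distances: "
      f"[{ratios.min():.3f}, {ratios.max():.3f}]")
```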

 

Nikhil Srivastava
joint work with Jess Banks, Archit Kulkarni, Satyaki Mukherjee

Quantitative Diagonalizability

A diagonalizable matrix has linearly independent eigenvectors. Since the set of nondiagonalizable matrices has measure zero, every matrix is the limit of diagonalizable matrices. We prove a quantitative version of this fact: every n x n complex matrix is within distance delta of a matrix whose eigenvectors have condition number poly(n)/delta, confirming a conjecture of E. B. Davies. The proof is based on regularizing the pseudospectrum with a complex Gaussian perturbation.
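A small numerical illustration of the statement (a sketch of the phenomenon only, not the proof technique; the Jordan block and the value of delta below are arbitrary choices): perturbing a non-diagonalizable matrix by delta times a normalized complex Gaussian matrix typically produces a nearby matrix whose eigenvector matrix is well conditioned.

```python
import numpy as np

rng = np.random.default_rng(0)
n, delta = 50, 1e-3

# A Jordan block: non-diagonalizable, so its (numerical) eigenvector matrix
# is essentially singular and its condition number blows up.
A = np.diag(np.ones(n - 1), k=1)

# Perturb by delta times a normalized complex Gaussian (Ginibre) matrix.
G = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) / np.sqrt(2 * n)
B = A + delta * G

def eigenvector_condition(M):
    """Condition number of the matrix of right eigenvectors returned by eig."""
    _, V = np.linalg.eig(M)
    return np.linalg.cond(V)

print(f"kappa_V(A) ~ {eigenvector_condition(A):.3e}   (non-diagonalizable)")
print(f"kappa_V(B) ~ {eigenvector_condition(B):.3e}   (after the perturbation)")
print(f"||A - B||  = {np.linalg.norm(delta * G, 2):.3e}")
```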

 

Ran Raz
Based on joint works with Sumegha Garg, Gillat Kol, and Avishay Tal

Learning Fast Requires Good Memory: Time-Space Tradeoff Lower Bounds for Learning

For further information on Prof. Raz’s Lecture, see its event page.

 
