Flatiron Institute Seminar Series: Lawrence Saul

Title: A curious correspondence between sparse and low-rank matrices and its uses for machine learning

Abstract: Many problems in high-dimensional data analysis can be formulated as a search for structure in large matrices. One important type of structure is sparsity; for example, when a matrix is sparse, with a large number of zero elements, it can be stored in a highly compressed format. Another type of structure is linear dependence; when a matrix is low-rank, it can be expressed as the product of two smaller matrices. It is well known that neither of these structures implies the other (the identity matrix is maximally sparse but full rank, while the all-ones matrix is rank one but fully dense), but can one find more subtle connections by looking beyond the canonical decompositions of linear algebra? In this talk, I will consider when a sparse nonnegative matrix can be recovered from a dense matrix of significantly lower rank. I will describe an algorithm for this problem based on a nonlinear decomposition of sparse matrices, and I will discuss various settings in machine learning where this problem arises in a natural way. Arguably the most popular matrix decompositions are those, such as principal component analysis or nonnegative matrix factorization, that have a simple geometric interpretation. I will show that these nonlinear decompositions also have a simple geometric interpretation, but one that is quite different in character.
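The abstract leaves the form of the nonlinear decomposition unspecified. As one concrete reading, the following is a minimal sketch in Python, assuming the decomposition takes the ReLU form X ≈ max(0, Theta) with Theta dense and of low rank. It alternates between a truncated SVD of a latent matrix Z and an update that keeps Z equal to X on the positive entries of X and nonpositive elsewhere. The function name relu_nmd and the alternating scheme are illustrative assumptions, not necessarily the speaker's exact algorithm.

import numpy as np

def relu_nmd(X, rank, n_iters=200, seed=0):
    """Illustrative fit of X ~= max(0, Theta) with rank(Theta) <= rank,
    for a sparse nonnegative X. This assumes a ReLU reading of the
    talk's nonlinear decomposition; it is a sketch, not the speaker's
    published method."""
    rng = np.random.default_rng(seed)
    pos = X > 0
    # Latent matrix: equals X on its support, nonpositive off it.
    Z = np.where(pos, X, -rng.random(X.shape))
    for _ in range(n_iters):
        # Best rank-r approximation of the latent matrix via SVD.
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        Theta = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Re-impose the constraints: Z = X where X > 0, Z <= 0 elsewhere.
        Z = np.where(pos, X, np.minimum(Theta, 0.0))
    return Theta

# Usage: a dense rank-8 matrix whose ReLU is sparse and nonnegative.
rng = np.random.default_rng(1)
Theta_true = rng.standard_normal((60, 8)) @ rng.standard_normal((8, 50))
X = np.maximum(Theta_true, 0.0)
Theta_hat = relu_nmd(X, rank=8)
print(np.linalg.norm(X - np.maximum(Theta_hat, 0.0)) / np.linalg.norm(X))

In this toy setting the recovered Theta_hat is dense yet has rank at most 8, while X itself (roughly half zeros here) would generally need a much higher rank for a comparable linear approximation; this illustrates the sparse/low-rank correspondence the abstract alludes to.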
