
Deep neural nets have caused a revolution in many classification tasks. A related ongoing revolution, also not yet understood theoretically, concerns their ability to serve as generative models for complicated types of data such as images and text. These models are trained using ideas like variational autoencoders and generative adversarial networks. We take a… (More)

- Holden Lee
- arXiv
- 2015
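The GAN training idea the abstract mentions can be made concrete with a small numerical sketch. This is not code from the paper: the discriminator scores below are made-up numbers standing in for the outputs of a trained network, used only to evaluate the standard minimax objective and the common non-saturating generator loss.

```python
import numpy as np

# Hypothetical discriminator outputs in (0, 1) for real and generated
# samples; in an actual GAN these would come from a trained network D.
d_real = np.array([0.90, 0.80, 0.95])
d_fake = np.array([0.20, 0.10, 0.30])

# Standard GAN value function: E[log D(x)] + E[log(1 - D(G(z)))].
# The discriminator maximizes it, so its loss is the negation.
d_loss = -(np.log(d_real).mean() + np.log(1.0 - d_fake).mean())

# Non-saturating generator loss: maximize E[log D(G(z))] instead of
# minimizing E[log(1 - D(G(z)))], which gives stronger early gradients.
g_loss = -np.log(d_fake).mean()
```

Both losses are positive here because the discriminator is imperfect and the generator is still fooling it only rarely; as training proceeds, the generator drives `d_fake` upward, shrinking `g_loss`.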

An open problem in complexity theory is to find the minimal degree of a polynomial representing the n-bit OR function modulo composite m. This problem is related to understanding the power of circuits with MOD m gates where m is composite. The OR function is of particular interest because it is the simplest function not amenable to bounds from communication… (More)

- Sanjeev Arora, Holden Lee
- 2015
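For context, the trivial baseline the open problem improves on is easy to exhibit: over any modulus m, the degree-n polynomial 1 − ∏(1 − x_i) represents the OR function exactly. A quick sketch verifying this by brute force (the helper name `or_poly` is mine, not from the paper):

```python
from itertools import product

def or_poly(bits, m):
    """Evaluate 1 - prod(1 - x_i) mod m: a degree-n polynomial that
    represents OR exactly over the integers, hence modulo any m.
    The open problem asks how much lower the degree can go
    when m is composite."""
    prod_term = 1
    for x in bits:
        prod_term *= 1 - x
    return (1 - prod_term) % m

# Check against the OR function on all 0/1 inputs for n = 4, m = 6.
n, m = 4, 6
assert all(or_poly(bits, m) == int(any(bits))
           for bits in product((0, 1), repeat=n))
```

On the all-zeros input the product is 1, so the polynomial evaluates to 0; any 1-bit kills the product, giving 1 — matching OR on every input.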

Today we talk about tensor decomposition, a general-purpose tool for learning latent variable models. Then we switch gears and talk about a recent improvement of the topic modeling algorithm we saw in an earlier lecture. Tensor decomposition is the analog of spectral decomposition for tensors. The nice thing about eigenvalues/eigenvectors is that they exist… (More)

- Eva Belmont, Holden Lee, Alexandra Musat, Sarah Trebat-Leder
- 2013
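The matrix case the abstract alludes to can be sketched in a few lines: the spectral theorem decomposes a symmetric matrix into a sum of rank-one terms, and tensor decomposition is the higher-order analog of exactly this. A minimal illustration (the example matrix is my own, chosen only for readability):

```python
import numpy as np

# A symmetric matrix: the spectral theorem guarantees real
# eigenvalues and an orthonormal basis of eigenvectors.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order

# Reconstruct M as a sum of rank-one terms lambda_i * v_i v_i^T --
# the decomposition whose tensor analog the lecture develops.
reconstruction = sum(lam * np.outer(v, v)
                     for lam, v in zip(eigvals, eigvecs.T))

assert np.allclose(M, reconstruction)
```

For order-3 tensors no such eigendecomposition is guaranteed in general, which is why conditions under which a low-rank tensor decomposition exists and is recoverable become the interesting question.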

Folsom, Kent, and Ono used the theory of modular forms modulo ℓ to establish remarkable "self-similarity" properties of the partition function and give an overarching explanation of many partition congruences. We generalize their work to analyze powers p_r of the partition function as well as Andrews's spt-function. By showing that certain generating… (More)
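The congruences in question can be observed numerically. As a concrete anchor (my own sketch, not code from the paper), the following computes p(n) via Euler's pentagonal-number recurrence and checks the classic Ramanujan congruence p(5n + 4) ≡ 0 (mod 5), one of the families the Folsom–Kent–Ono framework explains:

```python
def partitions(limit):
    """Return [p(0), ..., p(limit)] via Euler's pentagonal-number
    recurrence: p(n) = sum over k >= 1 of
    (-1)^(k+1) * (p(n - k(3k-1)/2) + p(n - k(3k+1)/2))."""
    p = [1] + [0] * limit
    for n in range(1, limit + 1):
        total, k = 0, 1
        while k * (3 * k - 1) // 2 <= n:
            sign = -1 if k % 2 == 0 else 1
            for g in (k * (3 * k - 1) // 2, k * (3 * k + 1) // 2):
                if g <= n:
                    total += sign * p[n - g]
            k += 1
        p[n] = total
    return p

p = partitions(54)
# Ramanujan: p(5n + 4) is divisible by 5.
assert all(p[5 * n + 4] % 5 == 0 for n in range(11))
```

For instance p(4) = 5, p(9) = 30, and p(14) = 135 are all multiples of 5; the paper's generalization asks when analogous self-similarity survives for powers of the partition function and for the spt-function.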
