
- Daniel D. Lee, H. Sebastian Seung
- NIPS
- 2000

Non-negative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minimize the conventional least squares error while the other…
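The least-squares variant of these multiplicative updates can be sketched in a few lines; the matrix names follow the standard V ≈ WH convention, and the dimensions and data below are illustrative, not from the paper:

```python
import numpy as np

# Multiplicative update rules for NMF minimizing the squared error
# ||V - W H||^2. The updates never leave the nonnegative orthant and
# monotonically decrease the error.
rng = np.random.default_rng(0)
V = rng.random((20, 30))   # nonnegative data matrix
r = 5                      # factorization rank
W = rng.random((20, r))
H = rng.random((r, 30))

for _ in range(200):
    # Small constant in the denominator guards against division by zero.
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)

err = np.linalg.norm(V - W @ H)
```

Because each factor in the update is a ratio of nonnegative matrices, nonnegativity of W and H is preserved automatically, with no projection step.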

- Jihun Ham, Daniel D. Lee
- ICML
- 2008

In this paper we propose a discriminant learning framework for problems in which data consist of linear subspaces instead of vectors. By treating subspaces as basic elements, we can make learning algorithms adapt naturally to the problems with linear invariant structures. We propose a unifying view on the subspace-based learning method by formulating the…
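Treating subspaces as basic elements means comparing points on a Grassmann manifold rather than vectors; one standard similarity from this line of work is the projection kernel k(X, Y) = ||XᵀY||²_F between orthonormal bases. The sketch below is illustrative, with hypothetical data:

```python
import numpy as np

def orthobasis(M):
    """Orthonormal basis for the column span of M (a point on a Grassmannian)."""
    q, _ = np.linalg.qr(M)
    return q

def proj_kernel(X, Y):
    """Projection kernel between two orthonormal bases: ||X^T Y||_F^2."""
    return np.linalg.norm(X.T @ Y, "fro") ** 2

rng = np.random.default_rng(4)
# Two random 3-dimensional subspaces of R^10.
X = orthobasis(rng.standard_normal((10, 3)))
Y = orthobasis(rng.standard_normal((10, 3)))

# The kernel equals the subspace dimension when the subspaces coincide,
# and is smaller for distinct subspaces.
same = proj_kernel(X, X)
cross = proj_kernel(X, Y)
```

The kernel is invariant to the choice of basis within each subspace, which is exactly the linear invariant structure the abstract refers to.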

- J. Rubin, D. D. Lee, H. Sompolinsky
- Physical Review Letters
- 2001

A theory of temporally asymmetric Hebb rules, which depress or potentiate synapses depending upon whether the postsynaptic cell fires before or after the presynaptic one, is presented. Using the Fokker-Planck formalism, we show that the equilibrium synaptic distribution induced by such rules is highly sensitive to the manner in which bounds on the allowed…
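A minimal simulation conveys the point about bounds: under a temporally asymmetric rule with hard weight limits, the equilibrium weight distribution piles up near the bounds. The rule and all parameters below are illustrative, not the paper's exact model:

```python
import numpy as np

# Toy STDP-like rule: a synapse is potentiated when the presynaptic spike
# precedes the postsynaptic one and depressed otherwise, with hard bounds
# on the allowed weight range and a slight bias toward depression.
rng = np.random.default_rng(3)
n_syn, steps = 1000, 5000
w = np.full(n_syn, 0.5)            # weights start mid-range in [0, 1]
A_plus, A_minus = 0.005, 0.00525   # potentiation / depression magnitudes

for _ in range(steps):
    # Random sign of each pre/post spike-timing difference per synapse.
    pre_before_post = rng.random(n_syn) < 0.5
    w += np.where(pre_before_post, A_plus, -A_minus)
    np.clip(w, 0.0, 1.0, out=w)    # hard bounds shape the equilibrium

mean_weight = w.mean()
```

With hard bounds the weights drift toward the limits, so the stationary distribution depends strongly on how those bounds are imposed, which is the sensitivity the abstract describes.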

- Jihun Ham, Daniel D. Lee, Sebastian Mika, Bernhard Schölkopf
- ICML
- 2004

We interpret several well-known algorithms for dimensionality reduction of manifolds as kernel methods. Isomap, graph Laplacian eigenmap, and locally linear embedding (LLE) all utilize local neighborhood information to construct a global embedding of the manifold. We show how all three algorithms can be described as kernel PCA on specially constructed Gram…
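For the Laplacian eigenmap case, the implicit Gram matrix is the pseudoinverse of the graph Laplacian L, so kernel PCA on L⁺ recovers the eigenmap coordinates (the bottom nonzero eigenvectors of L). A sketch on a toy ring graph, with illustrative sizes:

```python
import numpy as np

# Build a ring graph on n nodes and its combinatorial Laplacian L = D - A.
n = 10
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
L = np.diag(A.sum(1)) - A

# The implicit kernel (Gram) matrix for the Laplacian eigenmap is L^+.
K = np.linalg.pinv(L)

# Kernel PCA on K: its top eigenvectors coincide with the bottom nonzero
# eigenvectors of L, so its top eigenvalue is the reciprocal of the
# smallest nonzero Laplacian eigenvalue.
w_K, v_K = np.linalg.eigh(K)
w_L = np.linalg.eigvalsh(L)

# 2-D embedding from the two leading kernel directions.
emb = v_K[:, -2:] * np.sqrt(w_K[-2:])
```

This is the sense in which the neighborhood-graph methods are all "kernel PCA on specially constructed Gram matrices": each method corresponds to a particular data-dependent kernel.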

- H. Sebastian Seung, Daniel D. Lee, Ben Y. Reis, David W. Tank
- Neuron
- 2000

Studies of the neural correlates of short-term memory in a wide variety of brain areas have found that transient inputs can cause persistent changes in rates of action potential firing, through a mechanism that remains unknown. In a premotor area that is responsible for holding the eyes still during fixation, persistent neural firing encodes the angular…

- Jihun Ham, Daniel D. Lee, Lawrence K. Saul
- AISTATS
- 2005

In this paper, we study a family of semisupervised learning algorithms for “aligning” different data sets that are characterized by the same underlying manifold. The optimizations of these algorithms are based on graphs that provide a discretized approximation to the manifold. Partial alignments of the data sets—obtained from prior knowledge of their…

How can we search for low dimensional structure in high dimensional data? If the data is mainly confined to a low dimensional subspace, then simple linear methods can be used to discover the subspace and estimate its dimensionality. More generally, though, if the data lies on (or near) a low dimensional submanifold, then its structure may be highly…

- Fei Sha, Lawrence K. Saul, Daniel D. Lee
- NIPS
- 2002

We derive multiplicative updates for solving the nonnegative quadratic programming problem in support vector machines (SVMs). The updates have a simple closed form, and we prove that they converge monotonically to the solution of the maximum margin hyperplane. The updates optimize the traditionally proposed objective function for SVMs. They do not involve…
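The closed-form update for the general nonnegative quadratic program, minimize f(v) = ½vᵀAv + bᵀv subject to v ≥ 0, splits A into its elementwise positive and negative parts and rescales each coordinate. A sketch on a toy problem instance (data below is hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6))
A = M @ M.T + np.eye(6)        # symmetric positive definite
b = rng.standard_normal(6)

Ap = np.maximum(A, 0)          # elementwise positive part of A
Am = np.maximum(-A, 0)         # elementwise negative part of A

v = np.ones(6)
for _ in range(500):
    a = Ap @ v
    c = Am @ v
    # Each factor is nonnegative, so v never leaves the feasible region,
    # and the objective decreases monotonically.
    v *= (-b + np.sqrt(b**2 + 4 * a * c)) / (2 * a + 1e-12)

def f(x):
    return 0.5 * x @ A @ x + b @ x
```

As with the NMF updates, no learning rate or projection step is needed; the multiplicative factor itself enforces the constraint.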

- Fei Sha, Yuanqing Lin, Lawrence K. Saul, Daniel D. Lee
- Neural Computation
- 2007

Many problems in neural computation and statistical learning involve optimizations with nonnegativity constraints. In this article, we study convex problems in quadratic programming where the optimization is confined to an axis-aligned region in the nonnegative orthant. For these problems, we derive multiplicative updates that improve the value of the…

- Olivia L. White, Daniel D. Lee, Haim Sompolinsky
- Physical Review Letters
- 2004

We study the ability of linear recurrent networks obeying discrete time dynamics to store long temporal sequences that are retrievable from the instantaneous state of the network. We calculate this temporal memory capacity for both distributed shift register and random orthogonal connectivity matrices. We show that the memory capacity of these networks…
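The shift-register case can be illustrated directly: with delay-line connectivity, the instantaneous state of a linear network x(t+1) = Wx(t) + s(t)u holds the last n inputs exactly. The network below is a toy example, not the paper's exact model:

```python
import numpy as np

n = 8
W = np.diag(np.ones(n - 1), k=-1)  # shift-register (delay-line) connectivity
u = np.zeros(n)
u[0] = 1.0                         # the input feeds the first unit

rng = np.random.default_rng(2)
s = rng.standard_normal(50)        # input sequence to store

# Discrete-time linear dynamics: each step shifts the state down by one
# unit and writes the new input into the first unit.
x = np.zeros(n)
for t in range(len(s)):
    x = W @ x + s[t] * u

# The instantaneous state now stores the n most recent inputs, newest first.
recovered = x
expected = s[::-1][:n]
```

Distributed or random orthogonal connectivities mix the stored values across units instead of keeping one input per unit, which is where the capacity analysis in the paper becomes nontrivial.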