In this paper we propose a discriminant learning framework for problems in which data consist of linear subspaces instead of vectors. By treating subspaces as basic elements, we can make learning algorithms adapt naturally to problems with linear invariant structures. We propose a unifying view on the subspace-based learning method by formulating the …
We interpret several well-known algorithms for dimensionality reduction of manifolds as kernel methods. Isomap, graph Laplacian eigenmap, and locally linear embedding (LLE) all utilize local neighborhood information to construct a global embedding of the manifold. We show how all three algorithms can be described as kernel PCA on specially constructed Gram …
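The common step the abstract alludes to can be sketched as kernel PCA on a precomputed Gram matrix. This is a minimal, hypothetical illustration (not the paper's code); the algorithm-specific construction of the Gram matrix for Isomap, Laplacian eigenmaps, or LLE is omitted, and the function name `kernel_pca` is our own.

```python
import numpy as np

def kernel_pca(K, n_components):
    """Embed n points into n_components dimensions, given an n x n Gram matrix K."""
    n = K.shape[0]
    # Double-center the Gram matrix so the embedded points have zero mean.
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Eigendecompose the centered Gram matrix (eigh returns ascending order).
    vals, vecs = np.linalg.eigh(Kc)
    order = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[order], vecs[:, order]
    # Scale each eigenvector so its coordinates carry variance = eigenvalue.
    return vecs * np.sqrt(np.maximum(vals, 0))
```

With a linear kernel K = XXᵀ this reduces to ordinary PCA, which is one way to check the routine: the embedding then preserves pairwise distances of the centered data.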
A theory of temporally asymmetric Hebb rules, which depress or potentiate synapses depending upon whether the postsynaptic cell fires before or after the presynaptic one, is presented. Using the Fokker-Planck formalism, we show that the equilibrium synaptic distribution induced by such rules is highly sensitive to the manner in which bounds on the allowed …
Studies of the neural correlates of short-term memory in a wide variety of brain areas have found that transient inputs can cause persistent changes in rates of action potential firing, through a mechanism that remains unknown. In a premotor area that is responsible for holding the eyes still during fixation, persistent neural firing encodes the angular …
We derive multiplicative updates for solving the nonnegative quadratic programming problem in support vector machines (SVMs). The updates have a simple closed form, and we prove that they converge monotonically to the solution of the maximum margin hyperplane. The updates optimize the traditionally proposed objective function for SVMs. They do not involve …
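The closed-form updates referred to above (for minimizing f(v) = ½vᵀAv + bᵀv subject to v ≥ 0, with A split into elementwise positive and negative parts A⁺ and A⁻) can be sketched as follows. This is our own illustrative implementation under those assumptions, not the authors' released code, and the small constant in the denominator is an assumption added for numerical safety.

```python
import numpy as np

def nqp_multiplicative(A, b, n_iter=500):
    """Minimize 0.5 v'Av + b'v subject to v >= 0 via multiplicative updates."""
    Ap = np.maximum(A, 0.0)   # elementwise positive part of A
    Am = np.maximum(-A, 0.0)  # elementwise negative part of A
    v = np.ones(len(b))       # any strictly positive initialization works
    for _ in range(n_iter):
        a = Ap @ v
        c = Am @ v
        # Each component is rescaled by a factor that equals 1 exactly at a
        # KKT point; components driven to zero satisfy complementary slackness.
        v = v * (-b + np.sqrt(b * b + 4.0 * a * c)) / (2.0 * a + 1e-12)
    return v
```

Because the updates are multiplicative and the factors are nonnegative, the iterates stay in the feasible region v ≥ 0 without any projection step.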
How can we search for low dimensional structure in high dimensional data? If the data is mainly confined to a low dimensional subspace, then simple linear methods can be used to discover the subspace and estimate its dimensionality. More generally, though, if the data lies on (or near) a low dimensional submanifold, then its structure may be highly …
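The linear case mentioned above can be sketched with PCA via the SVD: the singular values of the centered data reveal how many directions carry variance, and the corresponding right singular vectors give a basis for the subspace. The function name and the variance threshold below are our own illustrative choices.

```python
import numpy as np

def estimate_subspace(X, var_threshold=0.999):
    """Estimate the dimension and an orthonormal basis of the subspace
    containing the rows of X, keeping enough directions to explain
    var_threshold of the total variance."""
    Xc = X - X.mean(axis=0)
    # Singular values measure the variance captured along each direction.
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    ratio = np.cumsum(s ** 2) / np.sum(s ** 2)
    d = int(np.searchsorted(ratio, var_threshold) + 1)
    return d, Vt[:d]
```

For data lying exactly in a low dimensional subspace the trailing singular values vanish (up to rounding), so the estimate is exact; for noisy data the threshold trades dimension against reconstruction error.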
We study the ability of linear recurrent networks obeying discrete time dynamics to store long temporal sequences that are retrievable from the instantaneous state of the network. We calculate this temporal memory capacity for both distributed shift register and random orthogonal connectivity matrices. We show that the memory capacity of these networks …
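The shift-register case is the simplest to demonstrate: with connectivity that copies each unit's activity to the next unit at every step, the input from k steps ago sits in unit k of the instantaneous state. A minimal sketch, with our own naming and a perfect (noise-free) network assumed:

```python
import numpy as np

def run_shift_register(signal, n_units):
    """Drive a linear network x(t+1) = W x(t) + v s(t) with shift-register
    connectivity and return the final state."""
    W = np.zeros((n_units, n_units))
    W[1:, :-1] = np.eye(n_units - 1)   # unit k+1 copies unit k each step
    v = np.zeros(n_units)
    v[0] = 1.0                          # input enters at the first unit
    x = np.zeros(n_units)
    for s in signal:
        x = W @ x + v * s
    return x  # x[k] holds the input from k steps ago
```

Once the sequence is longer than the number of units, the oldest entries are shifted out, which is the finite memory capacity the abstract refers to for this connectivity.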
Inactivation of the beta4 subunit of the calcium channel in the mouse neurological mutant lethargic results in a complex neurological disorder that includes absence epilepsy and ataxia. To determine the role of the calcium-channel beta4-subunit gene CACNB4 on chromosome 2q22-23 in related human disorders, we screened for mutations in small pedigrees with …