While evidence indicates that neural systems may employ sparse approximations to represent sensed stimuli, the mechanisms underlying this ability are not understood. We describe a locally competitive algorithm (LCA) that solves a family of sparse coding principles by minimizing a weighted combination of mean-squared error and a coefficient cost …
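The objective the abstract names is the usual sparse coding trade-off, reconstruction error plus a coefficient penalty. A minimal sketch of LCA-style dynamics, assuming an L1 cost (soft thresholding) and a fixed dictionary `Phi` (both are assumptions; the abstract is truncated before the cost function is specified):

```python
import numpy as np

def lca_sparse_code(x, Phi, lam=0.1, tau=10.0, n_steps=200):
    """Sketch of LCA dynamics: units leak, integrate their driving
    input, and inhibit each other, descending
    0.5 * ||x - Phi @ a||^2 + lam * ||a||_1 (assumed L1 cost)."""
    n = Phi.shape[1]
    u = np.zeros(n)                      # internal membrane-like states
    b = Phi.T @ x                        # feedforward drive
    G = Phi.T @ Phi - np.eye(n)          # lateral inhibition weights
    for _ in range(n_steps):
        # active coefficients via soft threshold of internal state
        a = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)
        u += (b - u - G @ a) / tau       # leaky integrator update
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)
```

With an orthonormal dictionary the inhibition term vanishes and the fixed point is simply the soft-thresholded projection of the input, which makes the competitive role of `G` easy to see.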
In most neural systems, neurons communicate via sequences of action potentials. Contemporary models assume that the action potentials' times of occurrence, rather than their waveforms, convey information. The mathematical tool for describing sequences of events occurring in time and/or space is the theory of point processes. Using this theory, we show that …
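The simplest point-process model of a spike train is the homogeneous Poisson process, in which only event times carry information and inter-spike intervals are i.i.d. exponential. A small sketch (the rate and duration values are illustrative only):

```python
import numpy as np

def poisson_spike_times(rate_hz, duration_s, rng=None):
    """Sample a homogeneous Poisson point process on [0, duration_s):
    inter-spike intervals are i.i.d. exponential with mean 1/rate,
    so only the times of occurrence, not waveforms, are modeled."""
    rng = np.random.default_rng(rng)
    times = []
    t = rng.exponential(1.0 / rate_hz)
    while t < duration_s:
        times.append(t)
        t += rng.exponential(1.0 / rate_hz)
    return np.array(times)
```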
We describe an approach to analyzing single- and multiunit (ensemble) discharge patterns based on information-theoretic distance measures and on empirical theories derived from work in universal signal processing. In this approach, we quantify the difference between response patterns, whether time-varying or not, using information-theoretic distance …
Mutual information enjoys wide use in the computational neuroscience community for analyzing spiking neural systems. Its direct calculation is difficult because estimating the joint stimulus-response distribution requires a prohibitive amount of data. Consequently, several techniques have appeared for bounding mutual information that rely on less data. We …
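To see why direct calculation is data-hungry, consider the plug-in estimate: it first fills in the full joint stimulus-response histogram, so the number of samples needed grows with the product of the two alphabet sizes. A minimal sketch for discrete data (this is the generic plug-in estimator, not any particular bounding technique from the abstract):

```python
import numpy as np

def plugin_mutual_info(stim, resp):
    """Plug-in (direct) estimate of I(S;R) in bits from paired samples.
    Every cell of the joint histogram must be populated to estimate it
    well, which is why direct estimation needs so much data."""
    s_vals, s_idx = np.unique(stim, return_inverse=True)
    r_vals, r_idx = np.unique(resp, return_inverse=True)
    joint = np.zeros((len(s_vals), len(r_vals)))
    for i, j in zip(s_idx, r_idx):
        joint[i, j] += 1
    joint /= joint.sum()                  # empirical joint distribution
    ps = joint.sum(axis=1, keepdims=True)  # stimulus marginal
    pr = joint.sum(axis=0, keepdims=True)  # response marginal
    nz = joint > 0                         # skip zero-probability cells
    return float((joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])).sum())
```

On perfectly correlated binary data this returns 1 bit; on independent data it returns 0, but with small samples the estimator is biased upward, which motivates the bounding techniques the abstract surveys.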
Dyadic Decision Trees
This thesis introduces a new family of classifiers called dyadic decision trees (DDTs) and develops their theoretical properties within the framework of statistical learning theory. First, we show that DDTs achieve optimal rates of convergence for a broad range of classification problems and are adaptive in three important respects: …
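The defining restriction of a DDT is that cells may be split only at dyadic midpoints of the domain. A toy 1-D sketch of that splitting rule, assuming the domain [0, 1) and majority-vote leaves (the thesis's trees additionally prune with a complexity penalty, which this sketch omits):

```python
import numpy as np

def dyadic_tree_predict(X, y, query, lo=0.0, hi=1.0, depth=4):
    """Toy 1-D dyadic decision tree: each cell [lo, hi) may only be
    split at its midpoint (a dyadic split); leaves predict the
    majority label of the training points they contain."""
    mask = (X >= lo) & (X < hi)
    if depth == 0 or mask.sum() == 0 or len(set(y[mask])) == 1:
        if mask.sum() == 0:
            return 0                      # default label for empty cell
        vals, counts = np.unique(y[mask], return_counts=True)
        return int(vals[np.argmax(counts)])
    mid = (lo + hi) / 2.0                 # dyadic midpoint split
    if query < mid:
        return dyadic_tree_predict(X, y, query, lo, mid, depth - 1)
    return dyadic_tree_predict(X, y, query, mid, hi, depth - 1)
```

Restricting splits to dyadic midpoints is what makes the penalized-risk analysis tractable: the family of possible partitions is countable and can be searched efficiently.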