Sparse coding with memristor networks.

@article{Sheridan2017SparseCW,
  title={Sparse coding with memristor networks},
  author={Patrick Sheridan and Fuxi Cai and Chao Du and Wen Ma and Zhengya Zhang and Wei D. Lu},
  journal={Nature Nanotechnology},
  year={2017},
  volume={12},
  number={8},
  pages={784--789}
}
Sparse representation of information provides a powerful means to perform feature extraction on high-dimensional data and is of broad interest for applications in signal processing, computer vision, object recognition and neurobiology. Sparse coding is also believed to be a key mechanism by which biological neural systems can efficiently process a large amount of complex sensory data while consuming very little power. Here, we report the experimental implementation of sparse coding algorithms… 
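The abstract's core operation can be made concrete in a few lines. In a memristor crossbar, a vector-matrix product is computed in a single analog step (input voltages on the rows, Kirchhoff current summation on the columns), and iterative sparse coding then reduces to repeated forward and transpose products plus a soft threshold. The sketch below is an idealized ISTA-style iteration, not the paper's exact on-chip algorithm; `crossbar_vmm` merely models the analog multiply in floating point.

```python
import numpy as np

def crossbar_vmm(G, v):
    # Idealized crossbar: conductance matrix G, row voltages v;
    # Kirchhoff current summation on the columns yields G.T @ v
    # in one analog step.
    return G.T @ v

def ista_sparse_code(x, D, lam=0.05, step=0.1, n_iter=300):
    """Iterative soft-thresholding (ISTA) for min_a 0.5*||x - D a||^2 + lam*||a||_1.
    Each iteration needs one forward product (D a) and one transpose
    product (D^T r) -- exactly the operations a crossbar accelerates."""
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        r = x - crossbar_vmm(D.T, a)   # reconstruction residual, D @ a
        grad = crossbar_vmm(D, r)      # D^T r via the crossbar
        a = a + step * grad
        # soft threshold: drives small coefficients exactly to zero
        a = np.sign(a) * np.maximum(np.abs(a) - step * lam, 0.0)
    return a
```

For the iteration to converge, `step` must be smaller than the reciprocal of the largest eigenvalue of DᵀD; the dictionary columns are assumed to be unit-norm.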
Adaptive sparse coding based on memristive neural network with applications
TLDR
A soft-threshold adaptive sparse coding algorithm named MMN-SLCA, based on memristors, neural networks and sparse coding theory, is proposed, and its potential for large-scale, low-power intelligent information coding and processing is demonstrated.
Feature extraction and analysis using memristor networks
  • Fuxi Cai, Wei D. Lu
  • Computer Science
    2018 IEEE International Symposium on Circuits and Systems (ISCAS)
  • 2018
TLDR
Through hardware implementation of a sparse coding algorithm in a fabricated 32×32 memristor array, lateral inhibition among neurons is obtained and allows the network to settle to a more optimal, sparse solution from many possible solutions.
Fast and Accurate Sparse Coding of Visual Stimuli With a Simple, Ultralow-Energy Spiking Architecture
TLDR
It was shown that connecting the neurons directly to the crossbar results in a more energy-efficient sparse coding architecture and removes the need to prenormalize receptive fields; the architecture's accuracy, throughput, and significantly lower energy usage demonstrate the utility of these innovations.
Spiking Sparse Coding Algorithm with Reduced Inhibitory Feedback Weights
  • Md Munir Hasan, J. Holleman
  • Computer Science
    2020 IEEE 63rd International Midwest Symposium on Circuits and Systems (MWSCAS)
  • 2020
TLDR
It is demonstrated that a sparse coding algorithm using spiking neurons can be designed to have reduced inhibitory feedback connections by modifying SAILNet, and it is experimentally shown that by tuning the value of inhibitory synapse strength, sparsity can be controlled.
K-means Data Clustering with Memristor Networks.
TLDR
Experimental implementation of memristor crossbar hardware allows direct comparison of Euclidean distances without normalizing the weights, enables an unsupervised K-means clustering algorithm through online learning, and produces high classification accuracy on the standard Iris data set.
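The distance comparison mentioned in the TLDR above rests on a standard identity (not necessarily the paper's exact circuit): since ‖x‖² is the same for every centroid, argmin_j ‖x − w_j‖² equals argmax_j (x·w_j − ‖w_j‖²/2), so a crossbar that delivers all dot products x·w_j in one analog step suffices for the assignment. A minimal software sketch:

```python
import numpy as np

def nearest_centroid(x, W):
    """Nearest-centroid assignment via dot products only:
    argmin_j ||x - w_j||^2 == argmax_j (x.w_j - ||w_j||^2 / 2),
    because ||x||^2 is common to every centroid."""
    scores = W @ x - 0.5 * np.sum(W**2, axis=1)
    return int(np.argmax(scores))

def kmeans(X, k, n_iter=20, seed=0):
    # Minimal batch k-means built on the assignment rule above.
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), k, replace=False)]  # init from data points
    for _ in range(n_iter):
        labels = np.array([nearest_centroid(x, W) for x in X])
        for j in range(k):
            if np.any(labels == j):              # skip empty clusters
                W[j] = X[labels == j].mean(axis=0)
    return W, labels
```

In hardware, the `W @ x` step would be the crossbar read, with the −‖w_j‖²/2 bias stored per column; the centroid updates correspond to the online weight learning the TLDR describes.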
Multichannel parallel processing of neural signals in memristor arrays
TLDR
It is suggested that memristor arrays could be a promising multichannel signal processing module for future implantable neural interfaces.
Sparse Coding Using the Locally Competitive Algorithm on the TrueNorth Neurosynaptic System
TLDR
This paper maps the LCA algorithm onto the brain-inspired IBM TrueNorth Neurosynaptic System and discusses data structures and representation, as well as the architecture of functional processing units that perform nonlinear thresholding and vector-matrix multiplication.
Power-efficient neural network with artificial dendrites
TLDR
A memristor-based artificial dendrite enables the neural network to perform high-accuracy computation tasks with reduced power consumption and shows the potential for substantial overall performance improvement.
Synaptic Device Network Architecture with Feature Extraction for Unsupervised Image Classification.
TLDR
A synaptic device network architecture with a feature extraction algorithm inspired by the convolutional neural network is demonstrated and can classify handwritten digits at up to a 90% recognition rate despite using fewer synaptic devices than the architecture without feature extraction.
Sparse neuromorphic computing based on spin-torque diodes
TLDR
The results suggest that STDs have potential to be building blocks for the realization of a biologically plausible neuromorphic computing system.

References

Showing 1-10 of 29 references
Feature Extraction Using Memristor Networks
TLDR
It is shown that, with proper compensation, the memristor crossbar architecture can effectively perform sparse coding with distortion comparable to ideal software implementations at high sparsity, even in the presence of large device-to-device variations in excess of 100%.
Sparse Coding via Thresholding and Local Competition in Neural Circuits
TLDR
A locally competitive algorithm (LCA) is described that solves a family of sparse coding problems by minimizing a weighted combination of mean-squared reconstruction error and a coefficient cost function, producing coefficients with sparsity levels comparable to the most popular centralized sparse coding algorithms while being readily suited for neural implementation.
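The LCA dynamics summarized in this TLDR are simple enough to sketch: each neuron integrates a feedforward drive bᵢ = dᵢᵀx, leaks, and is inhibited by already-active neighbors through the off-diagonal of the Gram matrix DᵀD, while a soft-threshold nonlinearity maps the internal state to the sparse code. A minimal discrete-time sketch (Euler integration; parameter values are illustrative, not taken from the paper):

```python
import numpy as np

def soft_threshold(u, lam):
    # Thresholding activation: a = sign(u) * max(|u| - lam, 0)
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca(x, D, lam=0.1, dt=0.1, n_steps=400):
    """Locally Competitive Algorithm (Rozell et al.): internal states u
    evolve under feedforward drive b = D^T x, a leak term, and lateral
    inhibition (D^T D - I) a, settling to a sparse code a that
    approximately minimizes 0.5*||x - D a||^2 + lam*||a||_1."""
    G = D.T @ D - np.eye(D.shape[1])   # lateral inhibition: Gram minus self
    b = D.T @ x                        # feedforward drive
    u = np.zeros(D.shape[1])
    for _ in range(n_steps):
        a = soft_threshold(u, lam)
        u += dt * (b - u - G @ a)      # leaky integration with inhibition
    return soft_threshold(u, lam)
```

Only active (nonzero) neurons inhibit their neighbors, which is what allows the network to settle on one sparse solution among many; `dt` must be small enough relative to the largest eigenvalue of DᵀD for the Euler iteration to be stable.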
Energy Scaling Advantages of Resistive Memory Crossbar Based Computation and Its Application to Sparse Coding
TLDR
This paper presents a kernel-based architecture for sparse coding that can be applied to a neural sparse coding algorithm to give an O(N) reduction in energy for the entire algorithm when run with finite precision.
Sparse coding and decorrelation in primary visual cortex during natural vision.
Theoretical studies suggest that primary visual cortex (area V1) uses a sparse code to efficiently represent natural scenes. This issue was investigated by recording from V1 neurons in awake behaving…
A Sparse Coding Model with Synaptically Local Plasticity and Spiking Neurons Can Account for the Diverse Shapes of V1 Simple Cell Receptive Fields
TLDR
It is proved, mathematically, that sparseness and decorrelation are the key ingredients that allow for synaptically local plasticity rules to optimize a cooperative, linear generative image model formed by the neural representation.
Pattern classification by memristive crossbar circuits using ex situ and in situ training.
Memristors are memory resistors that promise the efficient implementation of synaptic weights in artificial neural networks. Whereas demonstrations of the synaptic operation of memristors already…
Computer science: Nanoscale connections for brain-like circuits
TLDR
A transistor-free metal-oxide memristor network with low device variability that works as a single-layer perceptron that can learn to recognize imperfect 3 × 3 pixel black-and-white patterns as one of three letters of the alphabet.
Emergence of simple-cell receptive field properties by learning a sparse code for natural images
TLDR
It is shown that a learning algorithm that attempts to find sparse linear codes for natural scenes will develop a complete family of localized, oriented, bandpass receptive fields, similar to those found in the primary visual cortex.
Sparse Representation for Computer Vision and Pattern Recognition
TLDR
This review paper highlights a few representative examples of how the interaction between sparse signal representation and computer vision can enrich both fields, and raises a number of open questions for further study.