# Structured and Deep Similarity Matching via Structured and Deep Hebbian Networks

    @article{Obeid2019StructuredAD,
      title   = {Structured and Deep Similarity Matching via Structured and Deep Hebbian Networks},
      author  = {Dina Obeid and Hugo Ramambason and Cengiz Pehlevan},
      journal = {ArXiv},
      year    = {2019},
      volume  = {abs/1910.04958}
    }

Synaptic plasticity is widely accepted to be the mechanism behind learning in the brain's neural networks. A central question is how synapses, with access to only local information about the network, can still organize collectively and perform circuit-wide learning in an efficient manner. In single-layered and all-to-all connected neural networks, local plasticity has been shown to implement gradient-based learning on a class of cost functions that contain a term that aligns the similarity of…

## 14 Citations

### Contrastive Similarity Matching for Supervised Learning

- Computer Science, Neural Computation
- 2021

A novel biologically plausible solution to the credit assignment problem is proposed, motivated by observations in the ventral visual pathway, yielding trained deep neural networks that exhibit biologically plausible Hebbian and anti-Hebbian plasticity.

### Kernel Similarity Matching with Hebbian Neural Networks

- Computer Science
- 2022

The algorithm proceeds by deriving and then minimizing an upper bound on the sum of squared errors between output and input kernel similarities, leading to online correlation-based learning rules that can be implemented by a one-layer recurrent neural network.

### Reverse Differentiation via Predictive Coding

- Computer Science, AAAI
- 2022

This work generalizes PC and Z-IL by defining them directly on computational graphs, and shows that the result can perform exact reverse differentiation, yielding the first PC algorithm equivalent to BP in how it updates parameters on any neural network.

### Predictive Coding Can Do Exact Backpropagation on Any Neural Network

- Computer Science, ArXiv
- 2021

This is the first biologically plausible algorithm shown to be equivalent to BP in how it updates parameters on any neural network, and is thus a significant breakthrough for interdisciplinary research between neuroscience and deep learning.

### A normative framework for deriving neural networks with multi-compartmental neurons and non-Hebbian plasticity

- Computer Science, ArXiv
- 2023

This article reviews and unifies recent extensions of the similarity matching approach to address more complex objectives, including a broad range of unsupervised and self-supervised learning tasks that can be formulated as generalized eigenvalue problems or nonnegative matrix factorization problems.

### Feature Learning in L2-regularized DNNs: Attraction/Repulsion and Sparsity

- Computer Science, ArXiv
- 2022

A sparsity result for homogeneous DNNs is proved: any local minimum of the L2-regularized loss can be achieved with at most N(N + 1) neurons in each hidden layer (where N is the size of the training set).

### Training Deep Architectures Without End-to-End Backpropagation: A Survey on the Provably Optimal Methods

- Computer Science, IEEE Computational Intelligence Magazine
- 2022

This tutorial paper surveys provably optimal alternatives to end-to-end backpropagation (E2EBP), the de facto standard for training deep architectures, that allow for greater modularity and transparency in deep learning workflows, aligning deep learning with mainstream computer science engineering practice, which heavily exploits modularization for scalability.

### Training Deep Architectures Without End-to-End Backpropagation: A Brief Survey

- Computer Science, ArXiv
- 2021

This tutorial paper surveys training alternatives to end-to-end backpropagation (E2EBP), the de facto standard for training deep architectures, that allow for greater modularity and transparency in deep learning workflows, aligning deep learning with mainstream computer science engineering practice, which heavily exploits modularization for scalability.

### Knowledge Distillation for Improved Accuracy in Spoken Question Answering

- Computer Science, ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2021

This work devises a training strategy to perform knowledge distillation (KD) from spoken documents and their written counterparts, improving the performance of the student model by reducing the misalignment between automatic and manual transcripts.

### Supervised Deep Similarity Matching

- Computer Science, ArXiv
- 2020

A novel biologically plausible solution to the credit assignment problem is proposed, motivated by observations in the ventral visual pathway and in trained deep neural networks, using a supervised deep similarity matching cost function to define a layer-specific learning goal in a deep network.

## References

Showing 1–10 of 40 references.

### A Hebbian/Anti-Hebbian network for online sparse dictionary learning derived from symmetric matrix factorization

- Computer Science, 2014 48th Asilomar Conference on Signals, Systems and Computers
- 2014

An online algorithm is derived which learns Gabor-filter receptive fields from a natural image ensemble, in agreement with physiological experiments, and maps onto a neural network with the same architecture as OF but using only biologically plausible local learning rules.

### An Approximation of the Error Backpropagation Algorithm in a Predictive Coding Network with Local Hebbian Synaptic Plasticity

- Biology, Computer Science, Neural Computation
- 2017

It is shown that a network developed in the predictive coding framework can efficiently perform supervised learning fully autonomously, employing only simple local Hebbian plasticity.

### A Hebbian/Anti-Hebbian network derived from online non-negative matrix factorization can cluster and discover sparse features

- Computer Science, Biology, 2014 48th Asilomar Conference on Signals, Systems and Computers
- 2014

The hypothesis that single-layer neuronal networks perform online symmetric nonnegative matrix factorization (SNMF) of the similarity matrix of the streamed data is explored and an online algorithm is derived which can be implemented by a biologically plausible network with local learning rules.

### Neuroscience-inspired online unsupervised learning algorithms

- Computer Science, ArXiv
- 2019

This work developed a family of biologically plausible artificial neural networks (NNs) for unsupervised learning based on optimizing principled objective functions containing a term that matches the pairwise similarity of outputs to the similarity of inputs, hence the name: similarity-based.

### Unsupervised learning by competing hidden units

- Computer Science, Proceedings of the National Academy of Sciences
- 2019

A learning algorithm is designed that utilizes global inhibition in the hidden layer and is capable of learning early feature detectors in a completely unsupervised way, and which is motivated by Hebb’s idea that change of the synapse strength should be local.

### Biologically Plausible Online Principal Component Analysis Without Recurrent Neural Dynamics

- Computer Science, 2018 52nd Asilomar Conference on Signals, Systems, and Computers
- 2018

This work derives a network for PCA-based dimensionality reduction that avoids fast fixed-point iteration of the recurrent dynamics, using a modification of the similarity matching objective to encourage near-diagonality of a synaptic weight matrix.

### Online Representation Learning with Single and Multi-layer Hebbian Networks for Image Classification

- Computer Science, ICANN
- 2017

A new class of Hebbian-like and local unsupervised learning rules for neural networks that minimise a similarity matching cost function is developed and applied to both single- and multi-layer architectures, suggesting its validity in the design of a new class of compact, online learning networks.

### A Hebbian/Anti-Hebbian Neural Network for Linear Subspace Learning: A Derivation from Multidimensional Scaling of Streaming Data

- Computer Science, Neural Computation
- 2015

A biologically plausible network for subspace learning on streaming data is derived by minimizing a principled cost function, adopting a multidimensional scaling objective for streaming data and relying only on biologically plausible Hebbian and anti-Hebbian local learning rules.

### Towards deep learning with segregated dendrites

- Computer Science, Biology, eLife
- 2017

It is shown that a deep learning algorithm that utilizes multi-compartment neurons might help to understand how the neocortex optimizes cost functions, and the algorithm takes advantage of multilayer architectures to identify useful higher-order representations—the hallmark of deep learning.

### Random synaptic feedback weights support error backpropagation for deep learning

- Computer Science, Nature Communications
- 2016

A surprisingly simple mechanism that assigns blame by multiplying errors by even random synaptic weights is presented, which can transmit teaching signals across multiple layers of neurons and performs as effectively as backpropagation on a variety of tasks.