Preventing Deterioration of Classification Accuracy in Predictive Coding Networks

@article{Kinghorn2022PreventingDO,
  title={Preventing Deterioration of Classification Accuracy in Predictive Coding Networks},
  author={Paul Kinghorn and Beren Millidge and Christopher L. Buckley},
  journal={ArXiv},
  year={2022},
  volume={abs/2208.07114}
}
Predictive Coding Networks (PCNs) aim to learn a generative model of the world. Given observations, this generative model can then be inverted to infer the causes of those observations. However, when training PCNs, a noticeable pathology is often observed: inference accuracy peaks and then declines with further training. This cannot be explained by overfitting, since training and test accuracy decrease together. Here we provide a thorough investigation of this phenomenon and…
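
As a concrete illustration of the setup the abstract describes, here is a minimal NumPy sketch of a supervised PCN in the standard formulation: value nodes are relaxed by gradient descent on the summed squared prediction errors, followed by local Hebbian weight updates. All names, layer sizes, and hyperparameters are illustrative assumptions, not the paper's code. Classification works by inverting the generative model: clamp the observation and let inference recover the label.

```python
import numpy as np

rng = np.random.default_rng(0)

f = np.tanh
f_prime = lambda v: 1.0 - np.tanh(v) ** 2

# Generative direction runs label -> hidden -> observation; predictions flow
# top-down and prediction errors flow bottom-up (sizes are illustrative).
sizes = [10, 64, 784]
W = [rng.normal(0.0, 0.05, (sizes[l + 1], sizes[l])) for l in range(2)]

def errors(x):
    """Layer-wise prediction errors eps[l] = x[l+1] - W[l] f(x[l])."""
    return [x[l + 1] - W[l] @ f(x[l]) for l in range(2)]

def settle(x, clamp_label, n_steps=50, lr_x=0.1):
    """Iterative inference: relax unclamped value nodes to minimise
    E = 0.5 * sum_l ||x[l+1] - W[l] f(x[l])||^2 (observation stays clamped)."""
    for _ in range(n_steps):
        eps = errors(x)
        if not clamp_label:                      # test time: infer the cause
            x[0] += lr_x * f_prime(x[0]) * (W[0].T @ eps[0])
        # Hidden nodes trade the error they receive against the error they cause.
        x[1] += lr_x * (f_prime(x[1]) * (W[1].T @ eps[1]) - eps[0])
    return x, errors(x)

def train_step(image, label, lr_w=0.005):
    """`label` is a one-hot float vector; both ends are clamped during training."""
    x = [label.copy(), rng.normal(0.0, 0.1, sizes[1]), image.copy()]
    x, eps = settle(x, clamp_label=True)
    for l in range(2):                           # local, Hebbian-like updates
        W[l] += lr_w * np.outer(eps[l], f(x[l]))

def classify(image):
    x = [np.zeros(sizes[0]), rng.normal(0.0, 0.1, sizes[1]), image.copy()]
    x, _ = settle(x, clamp_label=False)
    return int(np.argmax(x[0]))                  # inferred cause = class
```

In a setup like this, the pathology the paper studies would appear as `classify()` accuracy rising and then falling even while `train_step()` keeps lowering the energy on both training and test data.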

References

Showing 1-10 of 29 references

Learning on Arbitrary Graph Topologies via Predictive Coding

This paper shows how predictive coding (PC), a theory of information processing in the cortex, can be used to perform inference and learning on arbitrary graph topologies, and demonstrates experimentally that this formulation, called PC graphs, can perform different tasks with the same network simply by stimulating specific neurons, as in the sketch below.
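
To make the "arbitrary topology" point concrete, here is a hedged sketch (my notation and names, not the paper's code): the network is just a weighted adjacency matrix, the same relaxation rule applies whatever the graph looks like, and a task is specified purely by which neurons are clamped (stimulated).

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100
A = (rng.random((N, N)) < 0.1).astype(float)    # arbitrary directed topology
np.fill_diagonal(A, 0.0)                        # no self-prediction
W = A * rng.normal(0.0, 0.1, (N, N))            # weights live only on edges

f = np.tanh
f_prime = lambda v: 1.0 - np.tanh(v) ** 2

def relax(x, clamped, n_steps=100, lr=0.05):
    """Minimise E = 0.5 * sum_i (x_i - sum_j W_ij f(x_j))^2 over the
    unclamped nodes; `clamped` is a boolean mask of stimulated neurons."""
    for _ in range(n_steps):
        eps = x - W @ f(x)                      # every node's prediction error
        grad = eps - f_prime(x) * (W.T @ eps)   # dE/dx
        x = np.where(clamped, x, x - lr * grad)
    return x, x - W @ f(x)

def learn(x, eps, lr_w=0.01):
    """Local Hebbian weight update, restricted to the chosen topology."""
    global W
    W += lr_w * A * np.outer(eps, f(x))
```

Different tasks then differ only in the mask: clamp the image nodes to infer label nodes, or clamp the label nodes and let the image nodes settle to a generated sample.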

Hybrid Predictive Coding: Inferring, Fast and Slow

This work proposes a hybrid predictive coding network that combines iterative and amortized inference in a principled manner, describing both as a dual optimization of a single objective function, and demonstrates that the resulting scheme can be implemented in a biologically plausible neural architecture that approximates Bayesian inference using local Hebbian update rules.
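
A rough sketch of the fast/slow split, under simplifying assumptions of mine (a single latent layer, and amortized weights that simply regress onto the settled latents rather than the paper's exact dual objective):

```python
import numpy as np

rng = np.random.default_rng(2)
d_obs, d_lat = 784, 64
W_gen = rng.normal(0.0, 0.05, (d_obs, d_lat))   # generative (top-down) weights
W_amo = rng.normal(0.0, 0.05, (d_lat, d_obs))   # amortized (bottom-up) weights
f = np.tanh
f_prime = lambda v: 1.0 - np.tanh(v) ** 2

def infer(obs, n_steps=20, lr_x=0.1):
    """Fast: one amortized bottom-up pass proposes the latents.
    Slow: iterative gradient descent on the prediction error refines them."""
    z = f(W_amo @ obs)
    for _ in range(n_steps):
        eps = obs - W_gen @ f(z)
        z += lr_x * f_prime(z) * (W_gen.T @ eps)
    return z, eps

def train_step(obs, lr=0.005):
    global W_gen, W_amo
    z, eps = infer(obs)
    W_gen += lr * np.outer(eps, f(z))            # local generative update
    # Pull the amortized map toward the settled latents, so the next
    # fast pass starts closer to where slow inference would end up.
    u = W_amo @ obs
    W_amo += lr * np.outer((z - f(u)) * (1.0 - f(u) ** 2), obs)
```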

Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs

Predictive coding converges asymptotically (and in practice, rapidly) to exact backprop gradients on arbitrary computation graphs using only local learning rules, raising the potential that standard machine learning algorithms could in principle be directly implemented in neural circuitry.
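
The claim is easy to probe numerically. The toy script below (an illustration of mine, not the paper's experiments) relaxes a two-layer PC network to equilibrium and compares the resulting local, error-derived gradients with hand-derived backprop gradients; in the small output-error regime the cosine similarity comes out near 1.

```python
import numpy as np

rng = np.random.default_rng(3)
f = np.tanh
f_prime = lambda v: 1.0 - np.tanh(v) ** 2

W1 = rng.normal(0.0, 0.5, (8, 4))
W2 = rng.normal(0.0, 0.5, (3, 8))
x0 = rng.normal(size=4)                     # clamped input

# Feedforward pass doubles as the backprop baseline.
h = W1 @ f(x0)
y = W2 @ f(h)
t = y + 0.01 * rng.normal(size=3)           # small output error

# Backprop gradients of L = 0.5 * ||y - t||^2.
d2 = y - t
g_W2 = np.outer(d2, f(h))
g_W1 = np.outer((W2.T @ d2) * f_prime(h), f(x0))

# Predictive coding: clamp the output to t and relax the hidden layer.
x1 = h.copy()
for _ in range(200):
    e1 = x1 - W1 @ f(x0)
    e2 = t - W2 @ f(x1)
    x1 -= 0.05 * (e1 - f_prime(x1) * (W2.T @ e2))

e1 = x1 - W1 @ f(x0)
e2 = t - W2 @ f(x1)
pc_W2 = np.outer(-e2, f(x1))                # dE/dW2, purely local quantities
pc_W1 = np.outer(-e1, f(x0))                # dE/dW1

for g, p in [(g_W2, pc_W2), (g_W1, pc_W1)]:
    cos = (g * p).sum() / (np.linalg.norm(g) * np.linalg.norm(p))
    print(f"cosine similarity to backprop: {cos:.4f}")   # ~1.0
```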

Predictive Coding: a Theoretical and Experimental Review

This work provides a comprehensive review of the core mathematical structure and logic of predictive coding, complementing recent tutorials in the literature, and surveys the close relationships between predictive coding and modern machine learning techniques.

Can the Brain Do Backpropagation? - Exact Implementation of Backpropagation in Predictive Coding Networks

A BL model is proposed that produces exactly the same updates of the neural weights as BP while employing only local plasticity, i.e., all neurons perform only local computations, done simultaneously; this model is then modified into an alternative BL model that works fully autonomously.
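
In outline (my notation, glossing over the precise update schedule the paper specifies): if the predictions are frozen at their feedforward values, the relaxed error nodes reproduce the backprop deltas layer by layer, so the purely local weight update coincides with the backprop gradient:

```latex
\epsilon_L = t - y, \qquad
\epsilon_\ell = \bigl(W_{\ell+1}^{\top}\,\epsilon_{\ell+1}\bigr)\odot f'(h_\ell), \qquad
\Delta W_\ell \;\propto\; \epsilon_\ell\, f(x_{\ell-1})^{\top}.
```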

Hierarchical Models in the Brain

A general model is presented that subsumes many parametric models for continuous data, all of which can be inverted using exactly the same scheme, namely dynamic expectation maximization; the model is formulated as a simple neural network that may provide a useful metaphor for inference and learning in the brain.
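
The class of models has a compact generic form (paraphrased in my notation): each level's states are a noisy nonlinear function of the states of the level above, with observations $y$ at the bottom of the chain:

```latex
y = f^{(1)}\!\bigl(x^{(1)};\,\theta^{(1)}\bigr) + \epsilon^{(1)}, \qquad
x^{(i)} = f^{(i+1)}\!\bigl(x^{(i+1)};\,\theta^{(i+1)}\bigr) + \epsilon^{(i+1)},
\quad i = 1,\dots,n-1 .
```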

The Helmholtz Machine

A way of finessing the combinatorial explosion over configurations of hidden causes by maximizing an easily computed lower bound on the probability of the observations is described; the scheme can be viewed as a form of hierarchical self-supervised learning that may relate to the function of bottom-up and top-down cortical processing pathways.
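
The bound in question is the variational lower bound, written here in modern notation: a recognition distribution $q$ (computed by the bottom-up pathway) weights hidden configurations $\mathbf{h}$, so the intractable sum over all combinations of causes never has to be evaluated exactly:

```latex
\log p(\mathbf{x};\theta)
\;\ge\;
\sum_{\mathbf{h}} q(\mathbf{h}\mid\mathbf{x};\phi)\,
\log \frac{p(\mathbf{x},\mathbf{h};\theta)}{q(\mathbf{h}\mid\mathbf{x};\phi)} .
```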

Learning and inference in the brain

Whatever next? Predictive brains, situated agents, and the future of cognitive science.

  • A. Clark
  • Biology
Behavioral and Brain Sciences
  • 2013
This target article critically examines this "hierarchical prediction machine" approach, concluding that it offers the best clue yet to the shape of a unified science of mind and action.

A tutorial on the free-energy framework for modelling perception and learning

  • R. Bogacz
  • Biology, Computer Science
Journal of Mathematical Psychology
  • 2017