Continuous Online Sequence Learning with an Unsupervised Neural Network Model

@article{Cui2016ContinuousOS,
  title={Continuous Online Sequence Learning with an Unsupervised Neural Network Model},
  author={Yuwei Cui and Chetan Surpur and Subutai Ahmad and Jeff Hawkins},
  journal={Neural Computation},
  year={2016},
  volume={28},
  pages={2474-2504}
}
Abstract: The ability to recognize and predict temporal sequences of sensory inputs is vital for survival in natural environments. Based on many known properties of cortical neurons, hierarchical temporal memory (HTM) sequence memory has recently been proposed as a theoretical framework for sequence learning in the cortex. In this letter, we analyze properties of HTM sequence memory and apply it to sequence learning and prediction problems with streaming data. We show the model is able to…
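As a hypothetical illustration of the continuous online setting studied in the letter (this sketch is not code from the paper), the Python loop below predicts each next element before seeing it, scores the prediction, and only then updates the model, so there is a single pass over the stream and no separate training phase; the model.predict() and model.learn() interface is an assumed placeholder.

from collections import deque

def online_prediction_accuracy(stream, model, window=100):
    # Hypothetical predict-then-learn loop: the model keeps its own internal
    # state, predicts the upcoming element, is scored against the element
    # actually observed, and is then updated on it -- continuous learning
    # with no train/test split.
    hits = deque(maxlen=window)              # sliding window of 0/1 outcomes
    prediction = None
    for element in stream:
        if prediction is not None:
            hits.append(1 if prediction == element else 0)
        model.learn(element)                 # assumed online update method
        prediction = model.predict()         # assumed next-element prediction
        if hits:
            yield sum(hits) / len(hits)      # running prediction accuracy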

Sequence learning, prediction, and replay in networks of spiking neurons

TLDR
A continuous-time implementation of the temporal-memory (TM) component of the HTM algorithm, which is based on a recurrent network of spiking neurons with biophysically interpretable variables and parameters, facilitates the evaluation of the TM hypothesis based on experimentally accessible quantities.

Memorize-Generalize: An online algorithm for learning higher-order sequential structure with cloned Hidden Markov Models

TLDR
It is demonstrated that CHMMs trained with the memorize-generalize algorithm can model long-range structure in bird songs with only a slight degradation in performance compared to expectation-maximization, while still outperforming other representations.

A comparative study of HTM and other neural network models for online sequence learning with streaming data

TLDR
A comparative study of Hierarchical Temporal Memory (HTM), a neurally inspired model, and other feedforward and recurrent artificial neural network models on both artificial and real-world sequence prediction problems shows that HTM and long short-term memory (LSTM) give the best prediction accuracy.

Spatiotemporal Sequence Memory for Prediction using Deep Sparse Coding

TLDR
This work sought to create an artificial model that mimics early, low-level biological predictive behavior in a computer vision system, using spatiotemporal sequence memories learned through deep sparse coding and implemented in a biologically inspired architecture.

A New Hierarchical Temporal Memory Algorithm Based on Activation Intensity

TLDR
A self-adaptive nonlinear learning strategy is proposed in which synaptic connections are dynamically adjusted according to the activation intensity of columns; it achieves higher accuracy and lower time overhead than the conventional HTM and LSTM models.

Hierarchical Temporal Memory Introducing Time Axis in Connection Segments

  • Shinichiro Naito, M. Hagiwara
  • Computer Science
    2018 Joint 10th International Conference on Soft Computing and Intelligent Systems (SCIS) and 19th International Symposium on Advanced Intelligent Systems (ISIS)
  • 2018
TLDR
An improved Hierarchical Temporal Memory that can take long-term dependence into account is proposed, and it is confirmed that the proposed model can capture longer-term dependencies than the conventional model in temporal sequence prediction.

Using High-Order Prior Belief Predictions in Hierarchical Temporal Memory for Streaming Anomaly Detection

TLDR
Experimental results suggest that the framework, when built upon HTM, redefines state-of-the-art performance on a popular streaming anomaly benchmark, and comparative results with and without the framework on several third-party datasets collected from real-world scenarios show a clear performance benefit.

Deviant Learning Algorithm: Learning Sparse Mismatch Representations through Time and Space

TLDR
This paper proposes a novel bio-mimetic computational intelligence algorithm, the Deviant Learning Algorithm, inspired by key ideas and functional properties from recent brain-cognitive discoveries and theories, and shows through numerical experiments guided by theoretical insights how this invention can achieve competitive predictions even with very small problem-specific data.

Learning higher-order sequential structure with cloned HMMs

TLDR
The experiments show that CHMMs can beat n-grams, sequence memoizers, and LSTMs on character-level language modeling tasks and can be a viable alternative to these methods in some tasks that require variable-order sequence modeling and the handling of uncertainty.
...

References

SHOWING 1-10 OF 90 REFERENCES

A Critical Review of Recurrent Neural Networks for Sequence Learning

TLDR
The goal of this survey is to provide a self-contained explication of the state of the art of recurrent neural networks, together with a historical perspective and references to primary research.

Long Short-Term Memory

TLDR
A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
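To make the constant-error-carousel idea concrete, here is a minimal NumPy sketch of one LSTM step in its common modern form (with a forget gate, which the original 1997 formulation did not have); the additive update of the cell state c is what lets error flow across long lags. The stacked-parameter layout and names are illustrative, not taken from the paper.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    # W: (4H, D), U: (4H, H), b: (4H,) hold the input, forget, and output
    # gates plus the candidate cell update, stacked in that order.
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # all pre-activations at once
    i = sigmoid(z[0 * H:1 * H])         # input gate
    f = sigmoid(z[1 * H:2 * H])         # forget gate
    o = sigmoid(z[2 * H:3 * H])         # output gate
    g = np.tanh(z[3 * H:4 * H])         # candidate cell content
    c = f * c_prev + i * g              # additive cell-state update (the carousel)
    h = o * np.tanh(c)                  # gated hidden output
    return h, c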

Matching Recall and Storage in Sequence Learning with Spiking Neural Networks

TLDR
A generic learning rule is derived that is matched to the neural dynamics by minimizing an upper bound on the Kullback–Leibler divergence from the target distribution to the model distribution and is consistent with spike-timing dependent plasticity.

Supervised Learning in Spiking Neural Networks with ReSuMe: Sequence Learning, Classification, and Spike Shifting

TLDR
A model of supervised learning for biologically plausible neurons is presented that enables spiking neurons to reproduce arbitrary template spike patterns in response to given synaptic stimuli, even in the presence of various sources of noise, and shows that the learning rule can also be used for decision-making tasks.

A Fast and Accurate Online Sequential Learning Algorithm for Feedforward Networks

TLDR
The results show that the OS-ELM is faster than the other sequential algorithms and produces better generalization performance on benchmark problems drawn from the regression, classification and time series prediction areas.
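As a hedged sketch of the general idea behind online sequential learning with a fixed random hidden layer, the class below updates only the output weights with a recursive least-squares step as data arrives chunk by chunk; it initializes the inverse-covariance matrix with a large multiple of the identity rather than the batch initialization used by OS-ELM, and all names and values are illustrative.

import numpy as np

class OnlineRandomFeatureReadout:
    # Fixed random hidden layer + recursive least-squares update of the
    # output weights, processed one chunk of data at a time.
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_hidden, n_in))   # fixed random input weights
        self.b = rng.standard_normal(n_hidden)           # fixed random biases
        self.beta = np.zeros((n_hidden, n_out))          # learned output weights
        self.P = np.eye(n_hidden) * 1e3                  # inverse-covariance estimate

    def _hidden(self, X):
        return np.tanh(X @ self.W.T + self.b)            # hidden activations

    def partial_fit(self, X, T):
        H = self._hidden(X)                               # (n_samples, n_hidden)
        S = np.linalg.inv(np.eye(len(X)) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ S @ H @ self.P   # recursive least squares
        self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta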

Sequence to Sequence Learning with Neural Networks

TLDR
This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences markedly improved the LSTM's performance, because doing so introduced many short-term dependencies between the source and the target sentence that made the optimization problem easier.
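The source-reversal trick described above is simple to state in code; a hypothetical preprocessing step that reverses each source sequence while leaving targets untouched, so the start of the source sits next to the start of the target:

def reverse_sources(pairs):
    # pairs: iterable of (source_tokens, target_tokens).
    # Reversing the source places its first words closest to the first words
    # of the target, introducing short-term dependencies that ease optimization.
    return [(list(reversed(src)), tgt) for src, tgt in pairs]

print(reverse_sources([(["je", "suis", "etudiant"], ["i", "am", "a", "student"])]))
# [(['etudiant', 'suis', 'je'], ['i', 'am', 'a', 'student'])]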

A Mathematical Formalization of Hierarchical Temporal Memory’s Spatial Pooler

TLDR
This work brings together all aspects of the spatial pooler (SP), a critical learning component in HTM, under a single unifying framework, and empirical evidence verifies that, given proper parameterizations, the SP may be used for feature learning.
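For readers unfamiliar with the spatial pooler, here is a minimal, hedged sketch of its core loop (overlap computation, global k-winners-take-all inhibition, and Hebbian permanence updates on the winning columns); the parameter names and values are illustrative and this is not the formalization given in the reference.

import numpy as np

class MinimalSpatialPooler:
    # Highly simplified HTM-style spatial pooler: each column keeps a
    # permanence per input bit, and a synapse counts as connected when its
    # permanence exceeds a threshold.
    def __init__(self, n_inputs, n_columns, active_columns=40,
                 connect_thresh=0.5, perm_inc=0.05, perm_dec=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.perm = rng.uniform(0.3, 0.7, size=(n_columns, n_inputs))
        self.k = active_columns
        self.connect_thresh = connect_thresh
        self.perm_inc, self.perm_dec = perm_inc, perm_dec

    def compute(self, input_sdr, learn=True):
        connected = self.perm >= self.connect_thresh
        overlap = (connected & (input_sdr > 0)).sum(axis=1)   # per-column overlap
        active = np.argsort(overlap)[-self.k:]                # k-winners-take-all
        if learn:
            # Reinforce winning columns' synapses on active input bits,
            # weaken those on inactive bits.
            self.perm[active] += np.where(input_sdr > 0, self.perm_inc, -self.perm_dec)
            np.clip(self.perm, 0.0, 1.0, out=self.perm)
        return active                                         # active column indices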

Learned spatiotemporal sequence recognition and prediction in primary visual cortex

TLDR
This work discovered that repeated presentations of a visual sequence over the course of days resulted in evoked response potentiation in mouse V1 that was highly specific to stimulus order and timing.

A Learning Algorithm for Continually Running Fully Recurrent Neural Networks

The exact form of a gradient-following learning algorithm for completely recurrent networks running in continually sampled time is derived and used as the basis for practical algorithms for temporal supervised learning tasks.

Sparse Distributed Memory

TLDR
Pentti Kanerva's Sparse Distributed Memory presents a mathematically elegant theory of human long-term memory; the memory model resembles the cortex of the cerebellum, and the book provides an overall perspective on neural systems.
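A minimal, hedged sketch of Kanerva-style reads and writes: hard addresses are fixed random binary vectors, a write adds the data (in bipolar form) to the counters of every hard location within a Hamming radius of the write address, and a read sums and thresholds the counters of the locations near the read address. The radius and sizes below are illustrative.

import numpy as np

class SparseDistributedMemory:
    def __init__(self, n_locations=1000, dim=256, radius=112, seed=0):
        rng = np.random.default_rng(seed)
        self.addresses = rng.integers(0, 2, size=(n_locations, dim))  # hard locations
        self.counters = np.zeros((n_locations, dim), dtype=int)
        self.radius = radius                        # Hamming activation radius

    def _active(self, address):
        # Hard locations within the Hamming radius of the query address.
        dist = np.count_nonzero(self.addresses != address, axis=1)
        return dist <= self.radius

    def write(self, address, data):
        bipolar = np.where(np.asarray(data) > 0, 1, -1)   # store data as +/-1 increments
        self.counters[self._active(address)] += bipolar

    def read(self, address):
        total = self.counters[self._active(address)].sum(axis=0)
        return (total > 0).astype(int)              # threshold pooled counters at zero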
...