Deep neural networks using a single neuron: folded-in-time architecture using feedback-modulated delay loops

@article{Stelzer2021DeepNN,
  title={Deep neural networks using a single neuron: folded-in-time architecture using feedback-modulated delay loops},
  author={Florian Stelzer and Andr{\'e} R{\"o}hm and Raul Vicente and Ingo Fischer and Serhiy Yanchuk},
  journal={Nature Communications},
  year={2021},
  volume={12}
}
Deep neural networks are among the most widely applied machine learning tools, showing outstanding performance in a broad range of tasks. We present a method for folding a deep neural network of arbitrary size into a single neuron with multiple time-delayed feedback loops. This single-neuron deep neural network comprises only a single nonlinearity and appropriately adjusted modulations of the feedback signals. The network states emerge in time as a temporal unfolding of the neuron’s dynamics. By…
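The folding construction can be illustrated with a short sketch. The following NumPy toy model is a rough illustration of the idea only, not the paper's equations or code; the parameters N, L and n_delays and the randomly drawn modulation signals are hypothetical. It applies one nonlinearity along a single time trace and lets time-modulated, delayed feedback play the role of the weights between consecutive layers:

import numpy as np

# Toy, discrete-time sketch of the folded-in-time idea (hypothetical parameters,
# not the paper's exact model): one neuron, one nonlinearity, several delay loops.
N = 20            # grid points per time segment (nodes per "layer")
L = 3             # number of hidden layers, realized as consecutive time segments
n_delays = 5      # number of modulated delay loops
rng = np.random.default_rng(0)

f = np.tanh                                   # the single nonlinearity
delays = rng.integers(1, N, size=n_delays)    # delay lengths in grid steps
modulation = rng.normal(0.0, 0.5, size=(n_delays, L * N))  # time-varying feedback gains
input_weights = rng.normal(0.0, 0.5, size=N)

def fit_dnn_forward(u):
    """Unfold a scalar input u through L 'layers' of a single neuron's time trace."""
    x = np.zeros(L * N)
    for t in range(L * N):
        a = input_weights[t % N] * u if t < N else 0.0   # data enter the first segment only
        for d, tau in enumerate(delays):
            if t - tau >= 0:                             # delayed feedback, modulated in time
                a += modulation[d, t] * x[t - tau]
        x[t] = f(a)                                      # the same neuron, applied at every step
    return x[-N:]                                        # last segment acts as the output layer state

print(fit_dnn_forward(0.7))

Each segment of length N of the trace acts as one hidden layer, so network depth is traded for time rather than for additional physical neurons.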
7 Citations

Stability of Building Structural Engineering Based on Fractional Differential Equations

The compression rod is an important load-bearing member of building and bridge structures. When the load on the compression rod reaches the critical load, the entire structure will lose its…

Large-scale photonic natural language processing

By exploiting the full three-dimensional structure of the optical network propagating in free space, this work overcomes the interpolation threshold and reaches the over-parametrized regime of machine learning, a condition that allows high-performance natural language processing with a minimal fraction of training points.

An All-In-One Multifunctional Touch Sensor with Carbon-Based Gradient Resistance Elements

Diversiform human–machine interactions demonstrate the high stability, rapid response time, and excellent spatiotemporally dynamic resolution of the AIOM touch sensor, which will promote significant development of interactive sensing interfaces between fingertips and virtual objects.

Blinking coupling enhances network synchronization.

This paper studies the synchronization of a network with linear diffusive coupling that blinks between the variables periodically. The synchronization of the blinking network in the case of…

Photonic neuromorphic technologies in optical communications

The latest neuromorphic computing proposals that specifically apply to photonic hardware are reviewed and new perspectives on addressing signal processing in optical communications are given.

Connecting reservoir computing with statistical forecasting and deep neural networks

This work reports on how aspects of reservoir computing can be applied to classical forecasting methods to accelerate the learning process, and highlights a new approach that makes the hardware implementation of traditional machine learning algorithms practicable in electronic and photonic systems.

Attention-Based CNN and Bi-LSTM Model Based on TF-IDF and GloVe Word Embedding for Sentiment Analysis

A novel attention-based model that combines CNNs with LSTM (named ACL-SA) significantly outperforms state-of-the-art models and uses an integrated bidirectional LSTM to capture long-term dependencies.

References

Showing 1–10 of 60 references

Fashion-MNIST: a Novel Image Dataset for Benchmarking Machine Learning Algorithms

Fashion-MNIST is intended to serve as a direct drop-in replacement for the original MNIST dataset for benchmarking machine learning algorithms, as it shares the same image size, data format and the structure of training and testing splits.

Learning Multiple Layers of Features from Tiny Images

It is shown how to train a multi-layer generative model that learns to extract meaningful features which resemble those found in the human visual cortex, using a novel parallelization algorithm to distribute the work among multiple machines connected on a network.

Language Models are Few-Shot Learners

GPT-3 achieves strong performance on many NLP datasets, including translation, question-answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation, such as unscrambling words, using a novel word in a sentence, or performing 3-digit arithmetic.

Photonic Neural Networks: A Survey

This work proposes a taxonomy of the existing solutions of photonic artificial neural networks (categorized into multilayer perceptrons, convolutional neural networks, spiking neural networks, and reservoir computing) with emphasis on proof-of-concept implementations.

Learning One-Shot Imitation From Humans Without Humans

With Task-Embedded Control Networks, the system can infer control policies by embedding human demonstrations that condition a control policy, achieving one-shot imitation learning with similar results while utilising only simulation data.

Protein structure prediction beyond AlphaFold

G. Wei, Nat. Mach. Intell., 2019
DeepFragLib, a new protein-specific fragment library built using deep neural networks, may have advanced the field to the next stage of protein structure prediction.

Reinforcement learning in artificial and biological systems

This Review describes state-of-the-art work on RL in biological and artificial agents and focuses on points of contact between these disciplines and identifies areas where future research can benefit from information flow between these fields.

Advances in photonic reservoir computing

A novel paradigm that has emerged in analogue neuromorphic optical computing is reviewed: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.

Spatio-temporal phenomena in complex systems with time delays

Real-world systems can be strongly influenced by time delays occurring in self-coupling interactions, due to unavoidable finite signal propagation velocities. When the delays become significantly…
...