The Capacity of Quantum Neural Networks

@article{Wright2020TheCO,
  title={The Capacity of Quantum Neural Networks},
  author={Logan G. Wright and Peter Leonard McMahon},
  journal={2020 Conference on Lasers and Electro-Optics (CLEO)},
  year={2020},
  pages={1-2}
}
  • L. Wright, P. McMahon
  • Published 4 August 2019
  • Physics, Computer Science
Quantum neural networks (QNNs) are a promising application of near-term quantum computers. We present an information theory of QNNs' expressive power, which we apply to an example optical QNN based on a Gaussian Boson Sampler.
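
As a rough illustration of the general idea (a sketch, not the paper's exact estimator), a model's information capacity can be lower-bounded empirically by the largest set of randomly labeled points it can memorize. In the sketch below, `train_and_score` is a hypothetical stand-in for training the QNN on a dataset and returning its training accuracy.

```python
import numpy as np

# Illustrative capacity probe: find the largest number of randomly labeled
# points the model can memorize. `train_and_score(X, y)` is a hypothetical
# callable that trains the model on (X, y) and returns training accuracy.

def capacity_probe(train_and_score, dim, max_n=256, trials=5, seed=0):
    rng = np.random.default_rng(seed)
    last_ok, n = 0, 1
    while n <= max_n:
        accs = []
        for _ in range(trials):
            X = rng.normal(size=(n, dim))       # random inputs
            y = rng.integers(0, 2, size=n)      # random binary labels
            accs.append(train_and_score(X, y))
        if np.mean(accs) < 1.0:                 # memorization starts failing
            return last_ok                      # ~ capacity in patterns
        last_ok, n = n, 2 * n
    return last_ok
```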

Quantum computing models for artificial neural networks

TLDR
An overview of the most recent proposals aimed at bringing together the ongoing revolutions in machine learning and quantum computing, particularly at implementing the key functionalities of artificial neural networks on quantum architectures.

The power of quantum neural networks

TLDR
This work is the first to demonstrate that well-designed quantum neural networks offer an advantage over classical neural networks through a higher effective dimension and faster training ability, which is verified on real quantum hardware.
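
For reference, the effective dimension in that work is defined from the normalized Fisher information matrix of the model. A Monte Carlo sketch of the definition is given below (with γ = 1 for brevity); `fisher_fn(theta)` is a hypothetical user-supplied callable returning the d × d empirical Fisher matrix at parameters theta.

```python
import numpy as np

# Monte Carlo sketch of the effective dimension (gamma = 1), following the
# definition in "The power of quantum neural networks". `fisher_fn` is a
# hypothetical callable: theta (shape (d,)) -> Fisher matrix (shape (d, d)).

def effective_dimension(fisher_fn, d, n, n_samples=200, seed=0):
    rng = np.random.default_rng(seed)
    thetas = rng.uniform(-np.pi, np.pi, size=(n_samples, d))
    fishers = np.stack([fisher_fn(t) for t in thetas])

    # Normalize so the parameter-space average trace equals d ("F-hat").
    fishers *= d / np.mean(np.trace(fishers, axis1=1, axis2=2))

    kappa = n / (2 * np.pi * np.log(n))
    # log sqrt(det(I + kappa * F_hat)) per sample, via numerically stable slogdet
    half_logdets = np.array(
        [0.5 * np.linalg.slogdet(np.eye(d) + kappa * F)[1] for F in fishers]
    )
    m = half_logdets.max()
    log_avg = m + np.log(np.mean(np.exp(half_logdets - m)))
    return 2 * log_avg / np.log(kappa)
```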

QNet: A Scalable and Noise-Resilient Quantum Neural Network Architecture for Noisy Intermediate-Scale Quantum Computers

TLDR
QNet provides a blueprint for building noise-resilient QML models from a collection of small quantum neural networks on near-term noisy quantum devices, showing 43% better accuracy on average over existing models on noisy quantum hardware emulators.

Variational Learning for Quantum Artificial Neural Networks

TLDR
This work presents an original realization of efficient individual quantum nodes based on variational unsampling protocols, investigates different learning strategies involving global and local layerwise cost functions, and assesses their performance in the presence of statistical measurement noise.

Variational learning for quantum artificial neural networks

TLDR
An original realization of efficient individual quantum nodes based on variational unsampling protocols is presented, suggesting a viable approach towards the use of quantum neural networks for pattern classification on near-term quantum hardware.

On the learnability of quantum neural networks

TLDR
This work derives the convergence performance of QNNs under the NISQ setting, identifies classes of computationally hard concepts that can be efficiently learned by QNNs, and proves that any concept class that is efficiently learnable by a restricted quantum statistical query (QSQ) learning model can also be efficiently learned by QNNs.

DeepQMLP: A Scalable Quantum-Classical Hybrid Deep Neural Network Architecture for Classification

  • M. Alam, Swaroop Ghosh
  • Computer Science
    2022 35th International Conference on VLSI Design and 2022 21st International Conference on Embedded Systems (VLSID)
  • 2022
TLDR
It is shown that DeepQMLP performs reasonably well on unseen data and exhibits greater resilience to noise than QNN models that use a deep quantum circuit, with up to 25.3% lower loss and 7.92% higher accuracy during inference under noise than QMLP.

An Empirical Study of Quantum Dynamics as a Ground State Problem with Neural Quantum States

Neural quantum states are variational wave functions parameterised by artificial neural networks, a mathematical model studied for decades in the machine learning community. …
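
As background for this entry: a neural quantum state assigns a variational amplitude ψ(s) to each basis configuration s, with the network weights as variational parameters. A minimal sketch using the standard restricted-Boltzmann-machine ansatz (an illustration, not the paper's specific model):

```python
import numpy as np

# Minimal neural quantum state: an RBM ansatz assigns an (unnormalized)
# amplitude to each spin configuration s in {-1, +1}^n. Illustrative only.

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 12
a = rng.normal(scale=0.1, size=n_visible)              # visible biases
b = rng.normal(scale=0.1, size=n_hidden)               # hidden biases
W = rng.normal(scale=0.1, size=(n_hidden, n_visible))  # couplings

def amplitude(s):
    """psi(s) = exp(a . s) * prod_j 2 cosh(b_j + W_j . s)."""
    return np.exp(a @ s) * np.prod(2 * np.cosh(b + W @ s))

s = rng.choice([-1, 1], size=n_visible)
print(amplitude(s))
```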

Quantum reservoir computing with a single nonlinear oscillator

TLDR
The results show that quantum reservoir computing in a single nonlinear oscillator is an attractive modality for quantum computing on near-term hardware and may impact the interpretation of results across quantum machine learning.
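
The defining feature of reservoir computing, quantum or classical, is that the dynamical system is left fixed and only a linear readout is trained. A minimal classical sketch follows (a toy echo-state network standing in for the paper's quantum oscillator; all names and parameters here are illustrative):

```python
import numpy as np

# Toy reservoir computer: a fixed random nonlinear map provides features,
# and only a linear (ridge-regression) readout is trained. Classical
# stand-in for the paper's quantum oscillator, for illustration only.

rng = np.random.default_rng(0)

def reservoir_features(u, n_nodes=50, leak=0.3):
    """Run inputs u through a fixed random recurrent map; return states."""
    W_in = rng.normal(scale=0.5, size=n_nodes)
    W = rng.normal(scale=1.0 / np.sqrt(n_nodes), size=(n_nodes, n_nodes))
    x = np.zeros(n_nodes)
    states = []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Task: recall u[t-2] from the reservoir state (a short-memory benchmark).
u = rng.uniform(-1, 1, size=500)
X, y = reservoir_features(u)[2:], u[:-2]

# Ridge-regression readout: the only trained part of the model.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```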

Nonlinear input transformations are ubiquitous in quantum reservoir computing

TLDR
It is found that across the majority of schemes the input encoding implements a nonlinear transformation on the input data, which calls into question the necessity and function of further post-input processing.

References

Showing 1-10 of 43 references.

Quantum circuit learning

TLDR
A classical-quantum hybrid algorithm for machine learning on near-term quantum processors, combining a low-depth quantum circuit with a classical computer, paves the way toward applications of near-term quantum devices for quantum machine learning.
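
The hybrid loop alternates quantum estimation of an observable with classical parameter updates; on hardware, gradients can be obtained with the parameter-shift rule. A one-qubit NumPy toy (simulated and illustrative, not the paper's circuits):

```python
import numpy as np

# Hybrid quantum-classical loop in miniature: a classical optimizer updates
# a circuit parameter using gradients from the parameter-shift rule.
# One-qubit toy, simulated in NumPy.

def expval_z(theta):
    """<Z> after RY(theta)|0>; equals cos(theta)."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return state[0] ** 2 - state[1] ** 2

def parameter_shift_grad(theta):
    # Exact for gates generated by a single Pauli operator.
    return 0.5 * (expval_z(theta + np.pi / 2) - expval_z(theta - np.pi / 2))

theta, lr = 0.3, 0.4
for _ in range(50):                   # classical gradient-descent outer loop
    theta -= lr * parameter_shift_grad(theta)
print(theta, expval_z(theta))         # converges toward theta = pi, <Z> = -1
```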

Harnessing disordered quantum dynamics for machine learning

TLDR
A novel platform, quantum reservoir computing, is proposed that exploits natural quantum dynamics, ubiquitous in laboratories nowadays, for machine learning, harnessing nonlinear dynamics including classical chaos.

Gaussian Boson sampling

TLDR
The protocol for Gaussian Boson Sampling with single-mode squeezed states is presented and it is shown that the proposal with the Hafnian matrix function can retain the higher photon number contributions at the input.
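
For context, the hafnian mentioned here counts weighted perfect matchings of a symmetric matrix; in Gaussian Boson Sampling, output probabilities are proportional to squared hafnians of submatrices of the Gaussian state's matrix. Below is a naive exponential-time implementation of the definition, fine only for small matrices and not the sampling algorithm itself:

```python
import numpy as np

def hafnian(A: np.ndarray):
    """Naive hafnian of a symmetric 2m x 2m matrix.

    haf(A) = sum over perfect matchings M of prod_{(i, j) in M} A[i, j].
    Exponential time; suitable for small matrices only.
    """
    n = A.shape[0]
    if n == 0:
        return 1.0
    if n % 2 != 0:
        return 0.0
    # Pair index 0 with each possible partner j, then recurse on the rest.
    total = 0.0
    rest = list(range(1, n))
    for k, j in enumerate(rest):
        remaining = rest[:k] + rest[k + 1:]
        sub = A[np.ix_(remaining, remaining)]
        total += A[0, j] * hafnian(sub)
    return total
```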

Quantum Machine Learning in Feature Hilbert Spaces.

TLDR
This Letter interprets the process of encoding inputs in a quantum state as a nonlinear feature map that maps data to quantum Hilbert space and shows how it opens up a new avenue for the design of quantum machine learning algorithms.
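
The feature-map view implies a kernel k(x, y) = |⟨φ(x)|φ(y)⟩|² that a classical method such as an SVM can consume. A one-qubit angle-encoding sketch (an illustrative toy encoding, which gives k(x, y) = cos²((x − y)/2)):

```python
import numpy as np

# One-qubit angle-encoding feature map |phi(x)> = RY(x)|0> and the induced
# quantum kernel k(x, y) = |<phi(x)|phi(y)>|^2. Illustrative toy encoding.

def feature_state(x):
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(x, y):
    return abs(feature_state(x) @ feature_state(y)) ** 2

# Gram matrix for a small dataset, e.g. for an SVM with a precomputed kernel.
xs = np.linspace(0, np.pi, 5)
K = np.array([[quantum_kernel(p, q) for q in xs] for p in xs])
print(K.round(3))
```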

Bifurcation analysis of oscillating neural network model of pattern recognition in the rabbit olfactory bulb

  • B. Baird, in Advances in Neural Information Processing Systems

Information Theory

Papers read at a Symposium on Information Theory held at the Royal Institution, London, August 29th to September 2nd, 1960. Edited by Colin Cherry. Pp. xi + 476. (London: …

Understanding Machine Learning - From Theory to Algorithms

TLDR
The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way in an advanced undergraduate or beginning graduate course.

The Nature Of Statistical Learning Theory

TLDR
Vapnik's monograph on the foundations of statistical learning theory, covering VC dimension, structural risk minimization, and generalization bounds.

… and C.-K. Siew, in IEEE International Conference on Neural Networks - Conference Proceedings, 2004
