• Corpus ID: 238531441

QTN-VQC: An End-to-End Learning framework for Quantum Neural Networks

@article{Qi2021QTNVQCAE,
  title={QTN-VQC: An End-to-End Learning framework for Quantum Neural Networks},
  author={Jun Qi and Chao-Han Huck Yang and Pin-Yu Chen},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.03861}
}
The advent of noisy intermediate-scale quantum (NISQ) computers raises a crucial challenge to design quantum neural networks for fully quantum learning tasks. To bridge the gap, this work proposes an end-to-end learning framework named QTN-VQC, by introducing a trainable quantum tensor network (QTN) for quantum embedding on a variational quantum circuit (VQC). The architecture of QTN is composed of a parametric tensor-train network for feature extraction and a tensor product encoding for… 
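The abstract's pipeline (a parametric tensor-train network for feature extraction, followed by a tensor product encoding onto qubits) can be sketched numerically. The following is a minimal NumPy illustration under stated assumptions, not the paper's actual code: the function names `tt_features` and `tensor_product_encoding`, the core shapes, and the 8-dimensional input are all illustrative choices; the real QTN is trained jointly with a VQC.

```python
import numpy as np

rng = np.random.default_rng(0)

def tt_features(x, cores):
    # x: input reshaped to one index per TT core, here shape (2, 2, 2);
    # cores[k]: trainable core of shape (left_bond, 2, right_bond).
    # Contract the input tensor with the tensor-train cores.
    out = np.einsum('abc,xay,ybz,zcw->xw', x, cores[0], cores[1], cores[2])
    return out[0]  # drop the size-1 left boundary bond

def tensor_product_encoding(z):
    # Angle-encode each feature on one qubit, then take the tensor product:
    # |psi> = kron_j [cos(z_j / 2), sin(z_j / 2)].
    state = np.ones(1)
    for zj in z:
        state = np.kron(state, np.array([np.cos(zj / 2), np.sin(zj / 2)]))
    return state

# Three TT cores with bond dims (1,2), (2,2), (2,4); the final bond of
# size 4 plays the role of the number of qubits to encode.
cores = [rng.normal(size=(1, 2, 2)),
         rng.normal(size=(2, 2, 2)),
         rng.normal(size=(2, 2, 4))]

x = rng.normal(size=(2, 2, 2))     # an 8-dimensional input as a 2x2x2 tensor
z = tt_features(x, cores)          # 4 compressed features
psi = tensor_product_encoding(z)   # 2**4 = 16 amplitudes, unit norm
print(psi.shape, round(np.linalg.norm(psi), 6))  # → (16,) 1.0
```

The point of the tensor-train step is that the input dimension (8 here, 784 for MNIST-scale data) never has to match the qubit count: the TT cores compress it to one feature per qubit before encoding, and since each encoded qubit has unit norm, the product state is automatically normalized.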

Citations

Federated Quantum Natural Gradient Descent for Quantum Federated Learning

TLDR
An efficient learning algorithm, federated quantum natural gradient descent (FQNGD), is applied in a quantum federated learning (QFL) framework built from variational quantum circuit (VQC)-based quantum neural networks (QNNs).

Theoretical Error Performance Analysis for Variational Quantum Circuit Based Functional Regression

TLDR
This work puts forth an end-to-end quantum neural network, namely, TTN-VQC, which consists of a quantum tensor network based on a tensor-train network (TTN) for dimensionality reduction and a VQC for functional regression.

Optimal quantum kernels for small data classification

TLDR
An algorithm for constructing quantum kernels for support vector machines that adapts quantum gate sequences to the data; on classification problems with a small number of training points, the resulting quantum models significantly outperform optimized classical models with conventional kernels.

When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing

TLDR
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate a quantum-enhanced pre-trained BERT model for text classification, and shows that the proposed BERT-QTC model attains competitive experimental results on the Snips and ATIS spoken language datasets.

Quantum Heterogeneous Distributed Deep Learning Architectures: Models, Discussions, and Applications

TLDR
The model structures studied to date, together with their possibilities and limitations, are discussed, as are current and future areas of applied research and the prospects for new methodologies.

Classical-To-Quantum Transfer Learning for Spoken Command Recognition Based on Quantum Neural Networks

  • Jun Qi, Javier Tejedor
  • Computer Science, Physics
    ICASSP 2022 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2022
TLDR
This work extends transfer learning, as applied in classical machine learning, to an emerging hybrid end-to-end quantum neural network (QNN) for spoken command recognition (SCR); results suggest that hybrid transfer learning can boost the baseline performance on the SCR task.

Variational quantum reinforcement learning via evolutionary optimization

TLDR
A hybrid framework is proposed where the quantum RL agents are equipped with a hybrid tensor network-variational quantum circuit (TN-VQC) architecture to handle inputs of dimensions exceeding the number of qubits, enabling further quantum RL applications on noisy intermediate-scale quantum devices.

Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer

TLDR
This work implements a quantum-circuit based generative model to sample the prior distribution of a Generative Adversarial Network (GAN), and introduces a multi-basis technique which leverages the unique possibility of measuring quantum states in different bases, hence enhancing the expressibility of the prior distributions to be learned.

References

SHOWING 1-10 OF 59 REFERENCES

Quantum embeddings for machine learning

TLDR
This work proposes to train the first part of the circuit with the objective of maximally separating data classes in Hilbert space, a strategy it calls quantum metric learning, which provides a powerful analytic framework for quantum machine learning.

SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <1MB model size

TLDR
This work proposes a small DNN architecture called SqueezeNet, which achieves AlexNet-level accuracy on ImageNet with 50x fewer parameters and can be compressed to less than 0.5MB (510x smaller than AlexNet).

Quantum Machine Learning in Feature Hilbert Spaces.

TLDR
This Letter interprets the process of encoding inputs in a quantum state as a nonlinear feature map that maps data to quantum Hilbert space and shows how it opens up a new avenue for the design of quantum machine learning algorithms.
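The feature-map view summarized above admits a compact numerical sketch. The code below is an illustrative assumption, not the Letter's implementation: it uses a simple product angle encoding as the nonlinear feature map and computes the kernel it induces; the helper names `phi` and `quantum_kernel` are hypothetical.

```python
import numpy as np

def phi(x):
    # Nonlinear feature map into the quantum Hilbert space: each input
    # component becomes one qubit, |phi(x)> = kron_j [cos(x_j/2), sin(x_j/2)].
    state = np.ones(1)
    for xj in x:
        state = np.kron(state, np.array([np.cos(xj / 2), np.sin(xj / 2)]))
    return state

def quantum_kernel(x, y):
    # Kernel induced by the map: squared state overlap |<phi(x)|phi(y)>|^2.
    return float(phi(x) @ phi(y)) ** 2

x = np.array([0.1, 1.2, -0.7])
y = np.array([0.3, 0.9, -0.2])

# For this product encoding the overlap factorizes analytically:
# <phi(x)|phi(y)> = prod_j cos((x_j - y_j) / 2).
analytic = np.prod(np.cos((x - y) / 2)) ** 2
print(round(quantum_kernel(x, y), 6) == round(analytic, 6))  # → True
```

For this particular encoding the kernel has a closed form, so no quantum device is needed; the appeal of the general framework is that richer encoding circuits induce kernels that are not classically tractable.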

Supervised Learning

Expressive power of parametrized quantum circuits

TLDR
It is proved that PQCs with a simple structure already outperform any classical neural network for generative tasks, unless the polynomial hierarchy collapses, and PQCs are employed in an application to Bayesian learning.

Quantum speed-ups in reinforcement learning

TLDR
This work quantizes the agent and the environment and grants them the ability to also interact quantum-mechanically, that is, through a quantum channel for their communication, and demonstrates that this feature enables a speed-up in the agent's learning process.

Inside quantum black boxes

On the face of it, characterizing quantum dynamics in the exponentially large Hilbert space of a many-body system might require prohibitively many experiments. In fact, the locality of physical… 

QuantumNAS: Noise-Adaptive Search for Robust Quantum Circuits

TLDR
Extensively evaluated with 12 quantum machine learning (QML) and variational quantum eigensolver (VQE) benchmarks on 14 quantum computers, QuantumNAS significantly outperforms noise-unaware search, human, random, and existing noise-adaptive qubit mapping baselines.

Highly accurate protein structure prediction with AlphaFold

TLDR
This work validated an entirely redesigned version of the neural network-based model, AlphaFold, in the challenging 14th Critical Assessment of protein Structure Prediction (CASP14), demonstrating accuracy competitive with experiment in a majority of cases and greatly outperforming other methods.

Quantum agents in the Gym: a variational quantum algorithm for deep Q-learning

TLDR
A training method for parametrized quantum circuits (PQCs), based on the deep Q-learning algorithm, that can be used to solve RL tasks with discrete and continuous state spaces; the work also shows when recent separation results between classical and quantum agents for policy-gradient RL can be extended to inferring optimal Q-values in restricted families of environments.
...