• Corpus ID: 238531441

QTN-VQC: An End-to-End Learning Framework for Quantum Neural Networks

@article{Qi2021QTNVQCAE,
  title={QTN-VQC: An End-to-End Learning Framework for Quantum Neural Networks},
  author={Jun Qi and Chao-Han Huck Yang and Pin-Yu Chen},
  journal={arXiv preprint arXiv:2110.03861},
  year={2021}
}
The advent of noisy intermediate-scale quantum (NISQ) computers raises a crucial challenge to design quantum neural networks for fully quantum learning tasks. To bridge the gap, this work proposes an end-to-end learning framework named QTN-VQC, by introducing a trainable quantum tensor network (QTN) for quantum embedding on a variational quantum circuit (VQC). The architecture of QTN is composed of a parametric tensor-train network for feature extraction and a tensor product encoding for… 
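The pipeline described above, a parametric tensor-train network for feature extraction followed by a tensor product encoding onto qubits, can be sketched in miniature. The mode shapes, the one-scalar-feature-per-qubit contraction, and the tanh squashing below are illustrative assumptions for a toy classical simulation, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed shapes: flatten a 784-dim input (e.g. an MNIST image) into a
# 4-way tensor of shape (4, 7, 7, 4), with one TT-core per mode.
shape, ranks, n_qubits = (4, 7, 7, 4), (1, 3, 3, 3, 1), 4

# Trainable TT-cores G_k of shape (r_{k-1}, n_k, r_k); random stand-ins here.
cores = [rng.standard_normal((ranks[k], shape[k], ranks[k + 1])) * 0.1
         for k in range(4)]

def tt_features(x):
    """Contract the input tensor against each TT-core, yielding one scalar
    feature per qubit (a simplifying assumption, not the paper's scheme)."""
    t = x.reshape(shape)
    feats = []
    for k, G in enumerate(cores):
        # average the other modes to project onto core k's physical index
        v = t.mean(axis=tuple(i for i in range(4) if i != k))  # shape (n_k,)
        feats.append(np.einsum('anb,n->ab', G, v).sum())
    return np.array(feats)

def tensor_product_encode(theta):
    """Tensor product encoding: |phi> = kron_i (cos th_i |0> + sin th_i |1>)."""
    state = np.array([1.0])
    for t in theta:
        state = np.kron(state, np.array([np.cos(t), np.sin(t)]))
    return state

x = rng.standard_normal(784)
psi = tensor_product_encode(np.tanh(tt_features(x)) * np.pi / 2)
print(psi.shape)  # (16,): a normalized 4-qubit state, ready for a VQC
```

The resulting 2^n-dimensional state vector is what a downstream VQC would act on; in the end-to-end framework, the TT-cores are trained jointly with the circuit parameters.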

Citations

Theoretical Error Performance Analysis for Variational Quantum Circuit Based Functional Regression
TLDR
This work puts forth an end-to-end quantum neural network, namely, TTN-VQC, which consists of a quantum tensor network based on a tensor-train network (TTN) for dimensionality reduction and a VQC for functional regression.
Classical-to-Quantum Transfer Learning for Spoken Command Recognition Based on Quantum Neural Networks
TLDR
This work investigates an extension of transfer learning, as applied in classical machine learning algorithms, to the emerging hybrid end-to-end quantum neural network (QNN) for spoken command recognition (SCR); the results suggest that hybrid transfer learning boosts the baseline performance on the SCR task.
Variational quantum reinforcement learning via evolutionary optimization
TLDR
A hybrid framework is proposed where the quantum RL agents are equipped with a hybrid tensor network-variational quantum circuit (TN-VQC) architecture to handle inputs of dimensions exceeding the number of qubits, enabling further quantum RL applications on noisy intermediate-scale quantum devices.
Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer
TLDR
This work implements a quantum-circuit based generative model to sample the prior distribution of a Generative Adversarial Network (GAN), and introduces a multi-basis technique which leverages the unique possibility of measuring quantum states in different bases, hence enhancing the expressibility of the prior distributions to be learned.
When BERT Meets Quantum Temporal Convolution Learning for Text Classification in Heterogeneous Computing
TLDR
This work proposes a vertical federated learning architecture based on variational quantum circuits to demonstrate the performance of a quantum-enhanced pre-trained BERT model for text classification, and shows that the proposed BERT-QTC model attains competitive experimental results on the Snips and ATIS spoken language datasets.
Optimal quantum kernels for small data classification
TLDR
An algorithm is presented for constructing quantum kernels for support vector machines that adapts quantum gate sequences to the data; the performance of the resulting quantum models on classification problems with a small number of training points significantly exceeds that of optimized classical models with conventional kernels.
Quantum Distributed Deep Learning Architectures: Models, Discussions, and Applications
TLDR
This paper compares several model structures for QDDL and discusses their possibilities and limitations for leveraging QDDL in some representative application scenarios.
Quantum Heterogeneous Distributed Deep Learning Architectures: Models, Discussions, and Applications
TLDR
The model structures studied so far, along with their possibilities and limitations, are discussed in order to introduce and promote these studies, together with the areas of applied research to date and in the future and the possibilities of new methodologies.

References

Showing 1-10 of 64 references
Hybrid quantum-classical classifier based on tensor network and variational quantum circuit
TLDR
A hybrid model is proposed that combines quantum-inspired tensor networks (TN) and variational quantum circuits (VQC) to perform supervised learning tasks; it allows end-to-end training and shows that a matrix-product-state-based TN with low bond dimensions outperforms PCA as a feature extractor for compressing data for the input of VQCs in binary classification on the MNIST dataset.
Towards quantum machine learning with tensor networks
TLDR
A unified framework is proposed in which classical and quantum computing can benefit from the same theoretical and algorithmic developments, and the same model can be trained classically then transferred to the quantum setting for additional optimization.
Quantum embeddings for machine learning
TLDR
This work proposes to train the first part of the circuit with the objective of maximally separating data classes in Hilbert space, a strategy it calls quantum metric learning, which provides a powerful analytic framework for quantum machine learning.
Quantum agents in the Gym: a variational quantum algorithm for deep Q-learning
TLDR
A training method for parametrized quantum circuits (PQCs) that can be used to solve RL tasks for discrete and continuous state spaces based on the deep Q-learning algorithm and shows when recent separation results between classical and quantum agents for policy gradient RL can be extended to inferring optimal Q-values in restricted families of environments.
Variational Quantum Circuits for Deep Reinforcement Learning
TLDR
This work reshapes classical deep reinforcement learning algorithms like experience replay and target network into a representation of variational quantum circuits, and uses a quantum information encoding scheme to reduce the number of model parameters compared to classical neural networks.
Differentiable Learning of Quantum Circuit Born Machine
TLDR
This work devises an efficient gradient-based learning algorithm for the quantum circuit Born machine by minimizing the kernelized maximum mean discrepancy loss, and simulates generative modeling of the Bars-and-Stripes dataset and Gaussian mixture distributions using deep quantum circuits.
Quantum Algorithms for Deep Convolutional Neural Networks
TLDR
A new quantum tomography algorithm with $\ell_{\infty}$ norm guarantees, and new applications of probabilistic sampling in the context of information processing and numerical simulations for the classification of the MNIST dataset are presented.
Tensor-To-Vector Regression for Multi-Channel Speech Enhancement Based on Tensor-Train Network
TLDR
TTN maintains the DNN's expressive power yet involves a much smaller number of trainable parameters, and it can attain speech enhancement quality comparable to that of a DNN but with far fewer parameters; e.g., a reduction from 27 million to only 5 million parameters is observed in a single-channel scenario.
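The parameter reduction that a tensor-train factorization buys can be seen with a quick count: a dense weight matrix over factored input/output modes costs the product of all mode sizes, while the TT format costs only a sum over small cores. The mode shapes and TT-rank below are toy values chosen for illustration, not the 27M-to-5M configuration from the paper:

```python
# Parameter count: dense weight matrix vs. its tensor-train (TT) factorization.
# Toy shapes (assumptions for illustration only).
in_modes, out_modes, rank = (4, 8, 8, 4), (4, 4, 4, 4), 8

# Equivalent dense matrix has prod(n_k) * prod(m_k) entries.
dense = 1
for n, m in zip(in_modes, out_modes):
    dense *= n * m

# TT format stores one core of shape (r_{k-1}, n_k, m_k, r_k) per mode.
ranks = (1, rank, rank, rank, 1)
tt = sum(ranks[k] * in_modes[k] * out_modes[k] * ranks[k + 1]
         for k in range(4))

print(dense, tt)  # 262144 vs. 4352: a ~60x reduction at TT-rank 8
```

The saving grows with the number of modes, since the dense count is multiplicative in the mode sizes while the TT count is additive.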
Quantum Machine Learning in Feature Hilbert Spaces.
TLDR
This Letter interprets the process of encoding inputs in a quantum state as a nonlinear feature map that maps data to quantum Hilbert space and shows how it opens up a new avenue for the design of quantum machine learning algorithms.
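Viewing the encoding as a feature map immediately induces a kernel, k(x, y) = |<phi(x)|phi(y)>|^2, which can be fed to a classical support vector machine. The simple product angle encoding below is a common textbook instance assumed for illustration; the Letter's specific map may differ:

```python
import numpy as np

def encode(x):
    """Angle-encode each feature into one qubit and take the tensor product
    (an assumed feature map for illustration)."""
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, np.array([np.cos(xi), np.sin(xi)]))
    return state

def quantum_kernel(x, y):
    """Kernel induced by the feature map: k(x, y) = |<phi(x)|phi(y)>|^2."""
    return abs(encode(x) @ encode(y)) ** 2

x = np.array([0.1, 0.7, 1.2])
print(quantum_kernel(x, x))        # ~1.0 for identical inputs
print(quantum_kernel(x, x + 0.5))  # decays as the inputs move apart
```

For this particular encoding the kernel reduces to a product of cosines of feature differences, which makes the nonlinearity of the feature map explicit.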
Decentralizing Feature Extraction with Quantum Convolutional Neural Network for Automatic Speech Recognition
  • C. Yang, Jun Qi, Chin-Hui Lee
  • Computer Science
    ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • 2021
TLDR
An in-depth study of different quantum circuit encoder architectures is conducted to provide insights into designing QCNN-based feature extractors and demonstrates a high correlation between the proposed QCNN features, class activation maps, and the input Mel-spectrogram.
...