Matrix Product State–Based Quantum Classifier

@article{Bhatia2019MatrixPS,
  title={Matrix Product State–Based Quantum Classifier},
  author={Amandeep Singh Bhatia and Mandeep Kaur Saggi and Ajay Kumar and Sushma Jain},
  journal={Neural Computation},
  year={2019},
  volume={31},
  pages={1499-1517}
}
Interest in quantum computing has increased significantly. Tensor network theory has become increasingly popular and is widely used to simulate strongly entangled correlated systems. The matrix product state (MPS) is a well-designed class of tensor network states that plays an important role in processing quantum information. In this letter, we show that an MPS, as a one-dimensional array of tensors, can be used to classify classical and quantum data. We have performed binary classification of the…
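As a rough illustration of the idea, the sketch below (plain NumPy, with assumed shapes, a random untrained MPS, and a common cosine/sine pixel feature map) contracts a one-dimensional chain of tensors, one of which carries a class index, against the local feature vectors of a toy input to produce class scores. Training would adjust the MPS tensors so the correct class score dominates; the letter's actual encoding and optimization may differ.

```python
import numpy as np

# Minimal sketch (assumed shapes, random weights) of how an MPS -- a 1D
# chain of tensors -- can act as a classifier: each input pixel is mapped
# to a local feature vector, and the chain is contracted from left to
# right, with one tensor carrying the class (label) index.

rng = np.random.default_rng(1)
n_sites, d, chi, n_classes = 8, 2, 6, 2   # pixels, feature dim, bond dim, classes

def feature_map(x):
    """Map a pixel value x in [0, 1] to a 2-dim local feature vector."""
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

# MPS weights: ordinary tensors A[bond_left, phys, bond_right];
# the middle tensor additionally carries the label index.
mps = [rng.normal(scale=0.5, size=(chi, d, chi)) for _ in range(n_sites)]
mps[0] = rng.normal(scale=0.5, size=(1, d, chi))
mps[-1] = rng.normal(scale=0.5, size=(chi, d, 1))
label_site = n_sites // 2
mps[label_site] = rng.normal(scale=0.5, size=(chi, d, n_classes, chi))

def classify(pixels):
    """Contract the feature vectors with the MPS and return class scores."""
    env = np.ones((1,))                       # left boundary vector
    scores = None
    for j, x in enumerate(pixels):
        phi = feature_map(x)
        if j == label_site:
            # keep the open label index l
            scores = np.einsum('a,aplb,p->lb', env, mps[j], phi)
        elif scores is None:
            env = np.einsum('a,apb,p->b', env, mps[j], phi)
        else:
            scores = np.einsum('lb,bpc,p->lc', scores, mps[j], phi)
    return scores[:, 0]                        # close right boundary

x = rng.uniform(size=n_sites)                  # a toy 8-pixel "image"
print(classify(x))
```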
Tree-tensor-network classifiers for machine learning: From quantum inspired to quantum assisted
TLDR
It is shown that the use of isometric tensors can significantly aid in the human interpretability of the correlation functions extracted from the decision weights, and may produce models that are less susceptible to adversarial perturbations.
Classical versus Quantum: comparing Tensor Network-based Quantum Circuits on LHC data
TLDR
This study provides a comprehensive comparison between classical TNs and TN-inspired quantum circuits in the context of machine learning on highly complex, simulated LHC data and shows that classical TNs require exponentially large bond dimensions and higher Hilbert-space mapping to perform comparably to their quantum counterparts.
Quantum algorithm for neural network enhanced multi-class parallel classification
Using the properties of quantum superposition, we propose a quantum classification algorithm to efficiently perform multi-class classification tasks, where the training data are loaded into…
VSQL: Variational Shadow Quantum Learning for Classification
TLDR
This paper utilizes the classical shadows of quantum data to extract classical features in a convolution way and then uses a fully connected neural network to complete the classification task, and shows that this method can sharply reduce the number of parameters and thus better facilitate quantum circuit training.
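A loose, purely classical stand-in for this pipeline (not the paper's variational shadow circuits): slide a two-qubit window across a simulated state, record a local Z⊗Z expectation value per window as a feature, and classify with a small fully connected layer. The state, observable, and random dense weights below are assumptions.

```python
import numpy as np

# Hedged sketch in the spirit of VSQL: window-by-window local features
# from a (simulated) quantum state, followed by a dense classifier head.

rng = np.random.default_rng(4)
n_qubits, n_classes = 4, 2

# A random normalized 4-qubit state standing in for encoded input data.
psi = rng.normal(size=2**n_qubits) + 1j * rng.normal(size=2**n_qubits)
psi /= np.linalg.norm(psi)

def local_zz(psi, i):
    """<Z_i Z_{i+1}> computed from the state-vector amplitudes."""
    probs = np.abs(psi) ** 2
    val = 0.0
    for basis, p in enumerate(probs):
        z_i = 1 - 2 * ((basis >> (n_qubits - 1 - i)) & 1)
        z_j = 1 - 2 * ((basis >> (n_qubits - 2 - i)) & 1)
        val += p * z_i * z_j
    return val

# "Convolution-like" feature extraction: one feature per adjacent window.
features = np.array([local_zz(psi, i) for i in range(n_qubits - 1)])

# Fully connected classification head on the shadow features.
W = rng.normal(size=(n_classes, features.size))
b = rng.normal(size=n_classes)
logits = W @ features + b
print("features:", features, "predicted class:", int(np.argmax(logits)))
```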
Deep convolutional tensor network
TLDR
A novel tensor-network-based model for image classification, the deep convolutional tensor network (DCTN), which has parameter sharing, locality, and depth and is based on the entangled plaquette states (EPS) tensor network.
Quantum-Classical Machine learning by Hybrid Tensor Networks
TLDR
This work proposes quantum-classical hybrid tensor networks (HTN), which combine tensor networks with classical neural networks in a unified deep learning framework to overcome the limitations of regular tensor networks in machine learning.
Variational quantum classifiers through the lens of the Hessian
TLDR
The Hessian of variational quantum classifiers (VQCs) is calculated to interpret their curvature information and show the convergence of the loss function, and it is shown how an adaptive Hessian learning rate can influence convergence while training the variational circuits.
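A minimal sketch of the underlying idea, under assumptions (a single parameterized qubit as the "classifier", a squared-error loss, and a finite-difference Hessian rather than the paper's analytic treatment): compute the Hessian of the loss and read curvature from its eigenvalues.

```python
import numpy as np

# Toy variational "classifier": |psi> = Rz(t1) Ry(t0) |0>, loss is the
# squared error between <Z> and a target label. The Hessian is estimated
# by central finite differences; its eigenvalues give the curvature.

def expectation_z(theta):
    t0, t1 = theta
    psi = np.array([np.cos(t0 / 2), np.sin(t0 / 2)], dtype=complex)   # Ry(t0)|0>
    psi = np.array([np.exp(-1j * t1 / 2), np.exp(1j * t1 / 2)]) * psi  # Rz(t1)
    return float(np.real(np.abs(psi[0])**2 - np.abs(psi[1])**2))

def loss(theta, target=-1.0):
    return 0.5 * (expectation_z(theta) - target) ** 2

def hessian(f, theta, eps=1e-4):
    """Central finite-difference Hessian of f at theta."""
    n = len(theta)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            t = np.array(theta, dtype=float)
            def shift(di, dj):
                s = t.copy()
                s[i] += di * eps
                s[j] += dj * eps
                return f(s)
            H[i, j] = (shift(1, 1) - shift(1, -1)
                       - shift(-1, 1) + shift(-1, -1)) / (4 * eps**2)
    return H

theta = np.array([0.3, 0.7])
H = hessian(loss, theta)
print("Hessian:\n", H)
print("eigenvalues (curvature):", np.linalg.eigvalsh(H))
# A larger eigenvalue marks a steeper direction; a Hessian-aware
# (adaptive) learning rate could scale steps by this curvature.
```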
Recent Progress in Quantum Machine Learning
TLDR
The objective of this chapter is to help the reader grasp the key components of the field, understand the essentials of the subject, and compare quantum machine learning computations with their classical counterparts.
Gradient-Free optimization algorithm for single-qubit quantum classifier
TLDR
Simulation results show that a single-qubit quantum classifier trained with the proposed gradient-free optimization algorithm reaches high accuracy faster than one trained with the Adam optimizer and performs well in noisy environments.
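A hedged sketch of a gradient-free single-qubit classifier, with Nelder-Mead standing in for the paper's specific optimizer; the one-dimensional toy dataset and angle encoding are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Toy single-qubit classifier trained without gradients: each sample x is
# encoded as a rotation angle w0*x + w1, the qubit is measured, and P(|0>)
# is thresholded at 0.5. Nelder-Mead is a generic gradient-free optimizer.

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=200)
y = (X > 0.2).astype(float)                     # toy binary labels

def p_zero(x, w):
    angle = w[0] * x + w[1]
    return np.cos(angle / 2) ** 2               # |<0| Ry(angle) |0>|^2

def loss(w):
    return np.mean((p_zero(X, w) - y) ** 2)     # mean squared error

res = minimize(loss, x0=np.array([1.0, 0.0]), method="Nelder-Mead")
pred = (p_zero(X, res.x) > 0.5).astype(float)
print("loss:", res.fun, "accuracy:", np.mean(pred == y))
```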
The Presence and Absence of Barren Plateaus in Tensor-network Based Machine Learning
TLDR
This work rigorously proves that barren plateaus prevail in the training of tensor-network machine learning algorithms with global loss functions, revealing a crucial aspect of tensor-network-based machine learning.
...
...

References

Hierarchical quantum classifiers
TLDR
It is shown how quantum algorithms based on two tensor network structures can be used to classify both classical and quantum data and may enable classification of two-dimensional images and entangled quantum data more efficiently than is possible with classical methods.
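For intuition, a small assumed-shape sketch of a tree-tensor-network classifier on four inputs: local feature vectors are merged pairwise by isometries, and the top tensor carries the class index (random, untrained tensors; not the paper's circuits).

```python
import numpy as np

# Hedged sketch of a tree-tensor-network (TTN) classifier on 4 inputs.
rng = np.random.default_rng(3)
d, chi, n_classes = 2, 4, 2

def feature_map(x):
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def random_isometry(out_dim, in_dim):
    """Isometry V with V @ V.T = I_out (requires out_dim <= in_dim)."""
    q, _ = np.linalg.qr(rng.normal(size=(in_dim, in_dim)))
    return q[:, :out_dim].T          # shape (out_dim, in_dim)

# Layer 1: two isometries mapping a pair of local features (d x d) -> chi.
W1 = [random_isometry(chi, d * d).reshape(chi, d, d) for _ in range(2)]
# Top tensor: maps the two chi-dim branches to class scores.
W_top = rng.normal(size=(n_classes, chi, chi))

def classify(pixels):
    phis = [feature_map(x) for x in pixels]
    v_left = np.einsum('cpq,p,q->c', W1[0], phis[0], phis[1])
    v_right = np.einsum('cpq,p,q->c', W1[1], phis[2], phis[3])
    return np.einsum('lab,a,b->l', W_top, v_left, v_right)

x = rng.uniform(size=4)
print(classify(x))
```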
Neurocomputing approach to matrix product state using quantum dynamics
TLDR
This paper uses the proposed unitary criteria to investigate the dynamics of a matrix product state with quantum weightless neural networks, where the output qubit is extracted and fed back (iterated) to the input.
Entanglement-guided architectures of machine learning by quantum tensor network
TLDR
This work implements simple numerical experiments, related to pattern/image classification, in which the classifiers are represented by many-qubit quantum states written as matrix product states (MPS).
Towards quantum machine learning with tensor networks
TLDR
A unified framework is proposed in which classical and quantum computing can benefit from the same theoretical and algorithmic developments, and the same model can be trained classically then transferred to the quantum setting for additional optimization.
Quantum Machine Learning Matrix Product States
TLDR
A quantum algorithm is presented that returns a classical description of a matrix product state approximating an eigenvector, given black-box access to a unitary matrix, yielding sufficient conditions for the quantum variational algorithm to terminate in polynomial time.
Simulation of Matrix Product State on a Quantum Computer
TLDR
This paper simulates matrix product states on customized IBM (2-qubit, 3-qubit, and 4-qubit) quantum systems and determines the probability distribution among the quantum states.
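A purely classical stand-in for the same computation (the paper targets IBM hardware): contract a small random MPS into a state vector and read off the measurement probability distribution. The sizes and tensors below are assumptions.

```python
import numpy as np

# Contract a 3-qubit MPS left to right into a full state vector, then
# normalize and print the probability of each computational basis state.
rng = np.random.default_rng(5)
n, d, chi = 3, 2, 2
tensors = [rng.normal(size=(1, d, chi)),
           rng.normal(size=(chi, d, chi)),
           rng.normal(size=(chi, d, 1))]

state = tensors[0]
for t in tensors[1:]:
    state = np.einsum('apb,bqc->apqc', state, t)
    state = state.reshape(1, -1, state.shape[-1])
psi = state.reshape(-1)
psi /= np.linalg.norm(psi)

probs = np.abs(psi) ** 2
for basis, p in enumerate(probs):
    print(f"|{basis:0{n}b}>: {p:.3f}")
```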
Classification with Quantum Neural Networks on Near Term Processors
TLDR
This work introduces a quantum neural network (QNN) that can represent labeled data, classical or quantum, and be trained by supervised learning, and shows through classical simulation that parameters can be found that allow the QNN to learn to correctly distinguish the two data sets.
Neural-Network Quantum States, String-Bond States, and Chiral Topological States
TLDR
The results demonstrate the efficiency of neural networks to describe complex quantum wave functions and pave the way towards the use of String-Bond States as a tool in more traditional machine-learning applications.
Supervised Learning with Quantum-Inspired Tensor Networks
TLDR
It is demonstrated how algorithms for optimizing such networks can be adapted to supervised learning tasks by using matrix product states (tensor trains) to parameterize models for classifying images.
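The sketch below illustrates one step of the DMRG-style sweeping update used for such MPS (tensor-train) classifiers: merge two adjacent tensors, take a gradient step on the merged tensor against a toy squared loss, and split it back with a truncated SVD. All shapes, the environments, and the loss are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# One two-site sweeping update for an MPS classifier (hedged sketch).
rng = np.random.default_rng(0)
d, chi, n_labels = 2, 4, 2           # feature dim, bond dim, classes

# Two adjacent MPS tensors A[left, phys, right]; the right one also
# carries the label index l.
A1 = rng.normal(size=(chi, d, chi))
A2 = rng.normal(size=(chi, d, n_labels, chi))

# 1) Merge the two sites into a single bond tensor B.
B = np.einsum('api,iqlb->apqlb', A1, A2)     # (chi, d, d, n_labels, chi)

# 2) Gradient step on B against a toy squared loss for one sample.
#    L_env / R_env are contracted left/right environments, phi1 / phi2
#    the local feature vectors of the current pixel pair.
L_env = rng.normal(size=chi)
R_env = rng.normal(size=chi)
phi1, phi2 = rng.normal(size=d), rng.normal(size=d)
y = np.array([1.0, 0.0])                     # one-hot label

f = np.einsum('a,apqlb,p,q,b->l', L_env, B, phi1, phi2, R_env)
grad = np.einsum('a,p,q,b,l->apqlb', L_env, phi1, phi2, R_env, f - y)
B -= 0.1 * grad

# 3) Split B back into two tensors with an SVD, truncating to chi.
M = B.reshape(chi * d, d * n_labels * chi)
U, S, Vh = np.linalg.svd(M, full_matrices=False)
k = min(chi, len(S))
A1 = U[:, :k].reshape(chi, d, k)
A2 = (np.diag(S[:k]) @ Vh[:k]).reshape(k, d, n_labels, chi)
print(A1.shape, A2.shape)
```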
Efficient representation of quantum many-body states with deep neural networks
TLDR
A proof that, assuming a widely believed computational complexity conjecture, a deep neural network can efficiently represent most physical states, including the ground states of many-body Hamiltonians and states generated by quantum dynamics, while a shallow network representation with a restricted Boltzmann machine cannot efficiently represent some of those states.
...
...