Corpus ID: 250144738

GEO: Enhancing Combinatorial Optimization with Classical and Quantum Generative Models

@inproceedings{Alcazar2021GEOEC,
  title={GEO: Enhancing Combinatorial Optimization with Classical and Quantum Generative Models},
  author={Javier Alcazar and Mohammad Ghazi Vakili and Can Berk Kalayci and Alejandro Perdomo-Ortiz},
  year={2021}
}
We introduce a new framework that leverages machine learning models known as generative models to solve optimization problems. Our Generator-Enhanced Optimization (GEO) strategy is flexible enough to adopt any generative model, from quantum to quantum-inspired or classical, such as Generative Adversarial Networks, Variational Autoencoders, or Quantum Circuit Born Machines, to name a few. Here, we focus on a quantum-inspired version of GEO relying on tensor-network Born machines, and referred to…
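
The loop below is a minimal sketch of this generator-enhanced idea, not the paper's implementation: a stand-in generative model (independent Bernoulli marginals, where the paper would use a tensor-network Born machine) is repeatedly retrained on a softmax-reweighted pool of the best solutions seen so far, then sampled for new candidates. The cost function and all parameter choices here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16
W = rng.normal(size=(n, n))          # toy QUBO-like cost matrix

def cost(x):
    return -x @ W @ x                # illustrative objective to minimize

# Seed pool of random bitstrings.
X = rng.integers(0, 2, size=(64, n))
for _ in range(20):
    c = np.array([cost(x) for x in X])
    # Softmax reweighting: low-cost samples dominate the training signal.
    w = np.exp(-(c - c.min()) / (c.std() + 1e-9))
    w /= w.sum()
    # Stand-in generative model: weighted per-bit Bernoulli marginals.
    p = np.clip(w @ X, 0.05, 0.95)
    # Sample new candidates from the model, merge, keep the best pool.
    X_new = (rng.random((64, n)) < p).astype(int)
    X = np.vstack([X, X_new])
    X = X[np.argsort([cost(x) for x in X])[:64]]

print("best cost found:", cost(X[0]))
```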
3 Citations

Decomposition of Matrix Product States into Shallow Quantum Circuits

TLDR
This work compares a range of novel and previously developed algorithmic protocols for decomposing matrix product states of arbitrary bond dimension into low-depth quantum circuits consisting of stacked linear layers of two-qubit unitaries, and argues that the proposed decomposition protocol could form a useful ingredient within any joint application of TNs and PQCs.
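
As background for the bond-dimension-2 base case of such decompositions (a sketch under my own assumptions, not the paper's protocol): a left-canonical MPS core, reshaped to a 4×2 isometry, can be completed to a two-qubit unitary by appending an orthonormal basis of its orthogonal complement. A real-valued numpy illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# A left-canonical, bond-dimension-2 MPS core, reshaped to a 4x2 matrix M
# with orthonormal columns (an isometry). Build a random one for illustration.
M, _ = np.linalg.qr(rng.normal(size=(4, 2)))

# Complete M's two columns to an orthonormal basis of R^4: the extra two
# columns span the orthogonal complement, read off from the projector's SVD.
P = np.eye(4) - M @ M.T              # projector onto the complement
N = np.linalg.svd(P)[0][:, :2]       # two orthonormal complement vectors
U = np.hstack([M, N])                # 4x4 orthogonal, i.e. a two-qubit unitary

assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(U[:, :2], M)      # first two columns reproduce the isometry
```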

Do Quantum Circuit Born Machines Generalize?

TLDR
This work investigates the QCBM's learning of a cardinality-constrained distribution, observes an increase in generalization performance as circuit depth increases, and demonstrates the QCBM's ability to generalize to high-quality, desired novel samples.

Generalization and Overfitting in Matrix Product State Machine Learning Architectures

TLDR
It is speculated that the generalization properties of MPS depend on the properties of the data: with one-dimensional data (for which the MPS ansatz is the most suitable), MPS is prone to overfitting, while with more complex data that cannot be parameterized by MPS exactly, overfitting may be much less significant.

References

Showing 1–10 of 42 references

Unsupervised Generative Modeling Using Matrix Product States

TLDR
This work proposes a generative model using matrix product states, which is a tensor network originally proposed for describing (particularly one-dimensional) entangled quantum states, and enjoys efficient learning analogous to the density matrix renormalization group method.
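
To make the Born-machine construction in this reference concrete, here is a brute-force numpy sketch (the sizes and random cores are illustrative) of p(x) ∝ |ψ(x)|², with ψ(x) given by a matrix product:

```python
from itertools import product
import numpy as np

rng = np.random.default_rng(1)
n, chi = 6, 3   # number of binary sites, bond dimension

# Random MPS cores: cores[i][s] is the matrix for physical value s in {0, 1};
# the boundary cores carry bond dimension 1 on the outside.
cores = [rng.normal(size=(2, 1 if i == 0 else chi, 1 if i == n - 1 else chi))
         for i in range(n)]

def amplitude(bits):
    """psi(x): contract the matrix product for one bitstring."""
    m = cores[0][bits[0]]
    for i in range(1, n):
        m = m @ cores[i][bits[i]]
    return m[0, 0]

# Born rule: p(x) = |psi(x)|^2 / Z, with Z summed here by brute force.
Z = sum(amplitude(b) ** 2 for b in product([0, 1], repeat=n))
p = {b: amplitude(b) ** 2 / Z for b in product([0, 1], repeat=n)}
assert abs(sum(p.values()) - 1.0) < 1e-9
```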

A generative modeling approach for benchmarking and training shallow quantum circuits

TLDR
A quantum circuit learning algorithm that can be used to assist the characterization of quantum devices and to train shallow circuits for generative tasks is proposed, and it is demonstrated that this approach can learn an optimal preparation of Greenberger-Horne-Zeilinger states.
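
For reference, the GHZ target mentioned here is (|0…0⟩ + |1…1⟩)/√2, preparable by a Hadamard followed by a CNOT chain. A small statevector check, assuming no quantum SDK (helper names are mine):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def apply_cnot(state, control, target, n):
    """Apply CNOT on a 2^n statevector (qubit 0 = most significant bit)
    by flipping the target bit of every index whose control bit is 1."""
    out = np.zeros_like(state)
    for idx in range(2 ** n):
        if (idx >> (n - 1 - control)) & 1:
            out[idx ^ (1 << (n - 1 - target))] = state[idx]
        else:
            out[idx] = state[idx]
    return out

n = 3
state = np.zeros(2 ** n); state[0] = 1.0          # |000>
state = np.kron(H, np.eye(2 ** (n - 1))) @ state  # H on qubit 0
for q in range(n - 1):                            # CNOT chain 0 -> 1 -> 2
    state = apply_cnot(state, q, q + 1, n)

print(state)   # amplitude 1/sqrt(2) on |000> and |111>
```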

Tree Tensor Networks for Generative Modeling

TLDR
It is shown that the TTN is superior to MPSs for generative modeling in capturing pixel correlations in natural images, as well as giving better log-likelihood scores on standard datasets of handwritten digits.

Generation of High-Resolution Handwritten Digits with an Ion-Trap Quantum Computer

TLDR
This work implements a quantum-circuit-based generative model to sample the prior distribution of a Generative Adversarial Network (GAN), and introduces a multi-basis technique which leverages the unique possibility of measuring quantum states in different bases, thereby enhancing the expressibility of the prior distributions to be learned.

Tensor Networks for Probabilistic Sequence Modeling

TLDR
A novel generative algorithm is introduced giving trained u-MPS the ability to efficiently sample from a wide variety of conditional distributions, each one defined by a regular expression, which permits the generation of richly structured text in a manner that has no direct analogue in current generative models.

Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines

TLDR
This work estimates the classical mutual information of the standard MNIST dataset and the quantum Rényi entropy of corresponding matrix product state (MPS) representations, and finds that RBMs with sparse local connections exhibit high learning efficiency, which supports the application of tensor-network states in machine learning problems.
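
The contrast drawn in this reference comes down to how the two model families assign probabilities (notation mine):

```latex
% Boltzmann machine: probabilities from an energy function E(x)
p_{\mathrm{BM}}(x) = \frac{e^{-E(x)}}{\sum_{x'} e^{-E(x')}}

% Born machine: probabilities from a wavefunction \psi(x), e.g. an MPS
p_{\mathrm{Born}}(x) = \frac{|\psi(x)|^2}{\sum_{x'} |\psi(x')|^2}
```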

Variational Neural Annealing

TLDR
It is shown that, by generalizing the target distribution with a parameterized model, an analogous annealing framework based on the variational principle can be used to search for ground-state solutions in the asymptotic limit.
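
The objective behind such variational annealing is the standard variational free energy (notation mine), minimized over model parameters θ while the temperature T is annealed toward zero, so that p_θ concentrates on low-energy (ground-state) configurations:

```latex
F_T(\theta) = \mathbb{E}_{x \sim p_\theta}\!\left[E(x)\right] - T\, S(p_\theta),
\qquad
S(p_\theta) = -\sum_x p_\theta(x) \log p_\theta(x)
```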

Modeling sequences with quantum states: a look under the hood

TLDR
An understanding of the extra information contained in the reduced densities allows the authors to examine the mechanics of this DMRG algorithm and to study the generalization error of the resulting model.

Flow Network based Generative Models for Non-Iterative Diverse Candidate Generation

TLDR
GFlowNet is proposed, based on a view of the generative process as a flow network, making it possible to handle the tricky case where different trajectories can yield the same final state, e.g., there are many ways to sequentially add atoms to generate some molecular graph.
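
The flow-network view amounts to a flow-matching condition on the directed graph of states, which is what handles the many-trajectories-to-one-state case; schematically (my paraphrase, with F an edge flow and R(s) the reward of a terminal state):

```latex
\underbrace{\sum_{s':\, s' \to s} F(s' \to s)}_{\text{flow into } s}
=
\underbrace{\sum_{s'':\, s \to s''} F(s \to s'')}_{\text{flow out of } s}
+ \; R(s)\,\mathbb{1}\!\left[s \text{ terminal}\right]
```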

TensorNetwork for Machine Learning

TLDR
The encoding of image data into matrix product state form is explained in detail, along with how to contract the network in a way that is parallelizable and well suited to automatic gradients for optimization.
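
A sketch of the local feature map commonly used for this kind of image encoding (my assumption of the setup, not taken from the paper's text): each pixel x ∈ [0,1] becomes a unit 2-vector, so an image becomes a product state that can be contracted against an MPS.

```python
import numpy as np

def feature_map(pixels):
    """Map each pixel x in [0, 1] to the local vector
    (cos(pi*x/2), sin(pi*x/2)); the image is then the
    tensor (Kronecker) product of these local vectors."""
    pixels = np.asarray(pixels, dtype=float)
    return np.stack([np.cos(np.pi * pixels / 2),
                     np.sin(np.pi * pixels / 2)], axis=-1)

img = np.array([0.0, 0.25, 1.0])   # a tiny 3-"pixel" example
phi = feature_map(img)             # shape (3, 2): one 2-vector per pixel
# Each local vector has unit norm, so the product state is normalized.
assert np.allclose(np.linalg.norm(phi, axis=-1), 1.0)
```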