Benchmarking treewidth as a practical component of tensor network simulations

@article{Dumitrescu2018BenchmarkingTA,
  title={Benchmarking treewidth as a practical component of tensor network simulations},
  author={Eugene F. Dumitrescu and Allison L. Fisher and Timothy Goodrich and Travis S. Humble and Blair D. Sullivan and Andrew L. Wright},
  journal={PLoS ONE},
  year={2018},
  volume={13}
}
Tensor networks are powerful factorization techniques which reduce resource requirements for numerically simulating principal quantum many-body systems and algorithms. The computational complexity of a tensor network simulation depends on the tensor ranks and the order in which they are contracted. Unfortunately, computing optimal contraction sequences (orderings) in general is known to be a computationally difficult (NP-complete) task. In 2005, Markov and Shi showed that optimal contraction… 
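The sensitivity of cost to ordering is easy to see concretely. The following minimal sketch (illustrative only, not from the paper) contracts the same three-matrix chain in two different orders; NumPy's einsum_path reports estimated operation counts that differ by roughly 50x, because only one order avoids materializing a large intermediate tensor.

```python
import numpy as np

# Same network, two contraction orders.  Contracting (A@B) first builds a
# 100x100 intermediate (~40,000 multiplications); contracting (B@C) first
# builds only a 2x2 intermediate (~800 multiplications).
A = np.random.rand(100, 2)
B = np.random.rand(2, 100)
C = np.random.rand(100, 2)

left_first = ['einsum_path', (0, 1), (0, 1)]    # (A@B) first, then @C
right_first = ['einsum_path', (1, 2), (0, 1)]   # (B@C) first, then A@

for order in (left_first, right_first):
    path, report = np.einsum_path('ij,jk,kl->il', A, B, C, optimize=order)
    print(report)   # the report includes the estimated FLOP count
```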

Citations

Hyper-optimized tensor network contraction
TLDR
This work implements new randomized protocols that find very high-quality contraction paths for arbitrary and large tensor networks, and introduces a hyper-optimization approach in which both the method applied and its algorithmic parameters are tuned during the path finding.
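The paper's protocols are implemented in the authors' cotengra library; as a rough flavor of the randomized-search idea, the sketch below instead uses opt_einsum's RandomGreedy optimizer (assuming the opt-einsum package is installed), which reruns a perturbed greedy heuristic many times and keeps the cheapest path found.

```python
import numpy as np
import opt_einsum as oe   # assumed available: pip install opt-einsum

# A small fully bonded four-index network (hyperedge-like: each index
# appears three times across the operands).
eq = 'ab,bc,cd,da,ac,bd->'
arrays = [np.random.rand(4, 4) for _ in range(6)]

# Randomized greedy: many greedy runs under a perturbed cost function,
# keeping the best contraction path encountered.
opt = oe.RandomGreedy(max_repeats=64)
path, info = oe.contract_path(eq, *arrays, optimize=opt)
print(path)           # the winning pairwise contraction sequence
print(info.opt_cost)  # its estimated scalar-operation count
```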
Algorithms for tensor network contraction ordering
TLDR
The performance of simulated annealing and genetic algorithms, two common discrete optimization techniques, is explored on this ordering problem, and it is found that the algorithms considered consistently outperform a greedy search given equal computational resources.
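As a concrete, much-simplified illustration of the annealing idea (my hypothetical toy, not the paper's implementation), the sketch below anneals over a restricted family of left-to-right contraction orders of a small closed ring network; the paper's algorithms search richer spaces of contraction sequences.

```python
import math, random

# Each tensor is a frozenset of index labels; assumes a closed network in
# which every index appears in exactly two tensors.
dims = {'a': 8, 'b': 2, 'c': 8, 'd': 2, 'e': 8, 'f': 2}
tensors = [frozenset('ab'), frozenset('bc'), frozenset('cd'),
           frozenset('de'), frozenset('ef'), frozenset('fa')]

def order_cost(perm):
    # Fold tensors together left-to-right in the permuted order
    # (a restricted, "linear" family of contraction sequences).
    total, acc = 0, tensors[perm[0]]
    for k in perm[1:]:
        t = tensors[k]
        total += math.prod(dims[i] for i in acc | t)  # step cost
        acc = (acc | t) - (acc & t)                   # contracted indices vanish
    return total

def anneal(steps=5000, T0=100.0, alpha=0.999):
    perm = list(range(len(tensors)))
    best = cur = order_cost(perm)
    T = T0
    for _ in range(steps):
        i, j = random.sample(range(len(tensors)), 2)
        perm[i], perm[j] = perm[j], perm[i]           # propose a swap
        new = order_cost(perm)
        if new <= cur or random.random() < math.exp((cur - new) / T):
            cur = new
            best = min(best, cur)
        else:
            perm[i], perm[j] = perm[j], perm[i]       # reject: undo the swap
        T *= alpha
    return best

print(anneal())
```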
Efficient Contraction of Large Tensor Networks for Weighted Model Counting through Graph Decompositions
TLDR
It is proved that finding an efficient contraction order for a tensor network is equivalent to the well-known problem of finding an optimal carving decomposition, and that memory-optimal contraction orders for planar tensor networks can be found in cubic time.
Contracting Arbitrary Tensor Networks: General Approximate Algorithm and Applications in Graphical Models and Quantum Circuit Simulations.
TLDR
The method is able to simulate large quantum circuits that are out of reach of state-of-the-art simulation methods, and largely outperforms existing algorithms, including mean-field methods and recently proposed neural-network-based methods.
Parameterization of Tensor Network Contraction
TLDR
A conceptually clear and algorithmically useful framework for parameterizing the costs of tensor network contraction is presented, applying to tensor networks with arbitrary bond dimensions, open legs, and hyperedges, and showing how contraction trees relate to existing tree-like objects in the graph theory literature.
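A minimal sketch of the contraction-tree viewpoint, under the simplifying assumption that every index appears in exactly two tensors (no open legs or hyperedges, which the paper's framework does handle): a tree is a nested tuple, and the quality measure tracked here is the size of the largest intermediate tensor the tree creates.

```python
import math

# Leaves name tensors; internal nodes pair subtrees.
dims = {'a': 2, 'b': 8, 'c': 2, 'd': 8, 'e': 2}
leaves = {'T1': frozenset('ab'), 'T2': frozenset('bc'),
          'T3': frozenset('cd'), 'T4': frozenset('de')}

def contract_tree(node):
    """Return (open indices of this subtree, largest intermediate size)."""
    if isinstance(node, str):
        return leaves[node], 0
    li, lw = contract_tree(node[0])
    ri, rw = contract_tree(node[1])
    out = li ^ ri                     # indices seen twice are summed away
    return out, max(lw, rw, math.prod(dims[i] for i in out))

caterpillar = ((('T1', 'T2'), 'T3'), 'T4')
balanced = (('T1', 'T2'), ('T3', 'T4'))
print(contract_tree(caterpillar)[1], contract_tree(balanced)[1])   # 16 vs 4
```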
Parallel Weighted Model Counting with Tensor Networks
TLDR
This work explores the impact of multi-core and GPU use on tensor-network contraction for weighted model counting, compares the resulting weighted model counter on 1914 standard weighted model counting benchmarks, and shows that it significantly improves the virtual best solver.
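The sketch below is not that paper's tool, just a generic illustration (assuming PyTorch is available) that pairwise contraction steps reduce to dense multiplications which offload naturally to a GPU.

```python
import torch

device = 'cuda' if torch.cuda.is_available() else 'cpu'

# Each pairwise contraction step is a dense matrix multiply, the operation
# GPUs are best at; einsum dispatches it to the selected device.
a = torch.rand(64, 256, device=device)
b = torch.rand(256, 256, device=device)
c = torch.rand(256, 64, device=device)

out = torch.einsum('ij,jk,kl->il', a, b, c)
print(out.shape, out.device)
```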
Tropical Tensor Network for Ground States of Spin Glasses.
TLDR
The approach brings together concepts from graphical models, tensor networks, differentiable programming, and quantum circuit simulation, and easily utilizes the computational power of graphics processing units (GPUs).
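The key trick is to contract over a tropical semiring, replacing (+, x) with (min, +) so that the contraction computes a minimum energy rather than a partition function. A minimal sketch for a 1D Ising chain (my toy example, not the paper's GPU implementation):

```python
import numpy as np
from itertools import product

# Tropical (min,+) semiring: addition -> min, multiplication -> +.
def minplus_matmul(A, B):
    # minplus(A, B)[i, j] = min_k (A[i, k] + B[k, j])
    return np.min(A[:, :, None] + B[None, :, :], axis=1)

# 1D Ising chain: E = sum_i J[i] * s[i] * s[i+1],  s in {-1, +1}.
J = np.array([1.0, -2.0, 0.5, 1.5])
spins = np.array([-1.0, 1.0])
# Per-bond "transfer matrix" holding the bond energy for each spin pair.
bonds = [J[i] * np.outer(spins, spins) for i in range(len(J))]

M = bonds[0]
for T in bonds[1:]:
    M = minplus_matmul(M, T)          # tropical contraction of the chain
print('ground-state energy:', M.min())

# Brute-force check over all 2^5 spin configurations.
best = min(sum(J[i] * s[i] * s[i + 1] for i in range(len(J)))
           for s in product((-1, 1), repeat=len(J) + 1))
print('brute force:        ', best)
```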
Carving-width and contraction trees for tensor networks
TLDR
The Ratcatcher of Seymour and Thomas is implemented for determining the carving-width of planar networks, in order to offer experimental evidence that this measure of spatial complexity provides a generally effective heuristic for limiting their total contraction time.
An entanglement perspective on the quantum approximate optimization algorithm
TLDR
The quantum approximate optimization algorithm (QAOA) for solving the paradigmatic Max-Cut problem on different types of graphs is considered, and it is found that there is a volume-law entanglement barrier between the initial and final states.
Calibrating the classical hardness of the quantum approximate optimization algorithm
TLDR
The fidelity of the quantum approximate optimization algorithm is characterized by the expectation value of the cost function it seeks to minimize, and it is found to follow a scaling law $F(\ln\chi/N)$, with $N$ the number of qubits.
...

References

Showing 1-10 of 40 references
qTorch: The quantum tensor contraction handler
TLDR
The results in this work suggest that tensor contraction methods are superior only when simulating Max-Cut/QAOA with graphs of regularity approximately five and below, and that the stochastic contraction method outperforms the line graph based method only when the time to calculate a reasonable tree decomposition is prohibitively expensive.
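The "line graph based method" is the pipeline benchmarked in the main paper: build the line graph of the tensor network, find a low-width tree decomposition of it, and read off a contraction order. A small sketch with networkx (a heuristic treewidth estimate only; the paper benchmarks dedicated treewidth solvers):

```python
import networkx as nx
from networkx.algorithms.approximation import treewidth_min_degree

# Contraction orders for a tensor network correspond to tree decompositions
# of its line graph (Markov & Shi).  Example: a 3-regular, QAOA-style graph.
G = nx.random_regular_graph(3, 10, seed=1)
L = nx.line_graph(G)               # one line-graph vertex per tensor index

width, tree = treewidth_min_degree(L)   # greedy min-degree heuristic
print('heuristic line-graph treewidth:', width)
```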
Faster identification of optimal contraction sequences for tensor networks.
TLDR
A modified search algorithm with enhanced pruning is presented which exhibits a performance increase of several orders of magnitude while still guaranteeing identification of an optimal operation-minimizing contraction sequence for a single tensor network.
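The paper's contribution is aggressive pruning of this search; as a baseline that makes the problem concrete, here is a plain dynamic program over subsets (a hypothetical illustration, exponential and viable only for tiny closed networks) that returns the optimal multiplication count.

```python
import math
from functools import lru_cache, reduce

dims = {'a': 2, 'b': 8, 'c': 2, 'd': 8}
# A closed ring of four matrices; every index appears in exactly two tensors.
tensors = (frozenset('ab'), frozenset('bc'), frozenset('cd'), frozenset('da'))

def open_indices(subset):
    # Indices a subset still exposes = symmetric difference of its members.
    return reduce(lambda x, y: x ^ y, (tensors[i] for i in subset))

@lru_cache(maxsize=None)
def best_cost(subset):
    """Minimal scalar multiplications to fuse `subset` into one tensor."""
    items = tuple(subset)
    if len(items) == 1:
        return 0
    best = math.inf
    # Try every binary split (last element pinned right to avoid duplicates).
    for mask in range(1, 1 << (len(items) - 1)):
        left = frozenset(items[i] for i in range(len(items)) if mask >> i & 1)
        right = subset - left
        li, ri = open_indices(left), open_indices(right)
        step = math.prod(dims[i] for i in li | ri)
        best = min(best, best_cost(left) + best_cost(right) + step)
    return best

print(best_cost(frozenset(range(4))))
```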
Improving the efficiency of variational tensor network algorithms
TLDR
The usefulness of several results relating to the contraction of generic tensor networks is demonstrated for the optimization of a multiscale entanglement renormalization Ansatz for the ground state of a one-dimensional quantum system, where they are shown to substantially reduce the computation time.
Tree tensor network approach to simulating Shor's algorithm
Simulating quantum systems constructively furthers our understanding of qualitative and quantitative features which may be analytically intractable. In this letter, we directly simulate and explore…
Tensor Networks in a Nutshell
TLDR
This tutorial concludes with tensor contractions evaluating combinatorial counting problems, including Penrose's tensor contraction algorithm, which returns the number of edge-colorings of regular planar graphs.
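Penrose's construction places a Levi-Civita tensor at each vertex of a plane cubic graph and contracts along edges; setting aside the embedding-orientation sign conventions needed in general, the simplest instance is the theta graph (two vertices joined by three parallel edges), whose 3! = 6 proper 3-edge-colorings pop out of a single einsum.

```python
import numpy as np

# Levi-Civita tensor: +1 for even permutations of (0,1,2), -1 for odd.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
for i, j, k in [(0, 2, 1), (2, 1, 0), (1, 0, 2)]:
    eps[i, j, k] = -1.0

# Theta graph: one epsilon per vertex, contracted along the three edges,
# counts the proper 3-edge-colorings: 3! = 6.
count = np.einsum('abc,abc->', eps, eps)
print(count)   # 6.0
```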
Simulating Quantum Computation by Contracting Tensor Networks
TLDR
It is proved that a quantum circuit with $T$ gates whose underlying graph has treewidth $d$ can be simulated deterministically in $T^{O(1)}\exp[O(d)]$ time, which, in particular, is polynomial in $T$ if $d = O(\log T)$.
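The reduction is direct: every gate becomes a tensor, every wire segment a summed index, and an amplitude is the full contraction. A toy check (mine, not from the paper) on a 2-qubit Bell circuit:

```python
import numpy as np

# H on qubit 0, then CNOT; compute the amplitude <00|psi> = 1/sqrt(2).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)            # H[out, in]
CNOT = np.eye(4)[:, [0, 1, 3, 2]].reshape(2, 2, 2, 2)   # CNOT[b, c, a, j]
zero = np.array([1.0, 0.0])                             # <0| / |0>

# <00| CNOT (H x I) |00>: each wire segment (i, j, a, b, c) is summed over.
amp = np.einsum('i,j,ai,bcaj,b,c->', zero, zero, H, CNOT, zero, zero)
print(amp)   # ~0.7071
```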
Tensor Network Contractions for #SAT
TLDR
This study increases the theory, expressiveness, and application of tensor-based algorithmic tools and provides alternative insight on these problems, which have a long history in statistical physics and computer science.
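The encoding has the same flavor as for circuits: one index per variable, one 0/1 indicator tensor per clause, with shared variable indices acting as hyperedges; the full contraction is exactly the model count. A tiny hypothetical example:

```python
import numpy as np

# Formula: (x OR y) AND (NOT x OR y) -- it has exactly 2 models.
C1 = np.ones((2, 2)); C1[0, 0] = 0    # x OR y fails only at x=0, y=0
C2 = np.ones((2, 2)); C2[1, 0] = 0    # NOT x OR y fails only at x=1, y=0

# Shared variable indices are summed once across all clause tensors.
models = np.einsum('xy,xy->', C1, C2)
print(models)   # 2.0
```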
NCON: A tensor network contractor for MATLAB
TLDR
This article presents a MATLAB function ncon(), or "Network CONtractor", which accepts as its input a tensor network and a contraction sequence describing how this network may be reduced to a single tensor or number.
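In the ncon() convention, positive index labels are summed over and negative labels survive to the output in the order -1, -2, .... A minimal, hypothetical Python analogue built on einsum (not the MATLAB original) makes the convention concrete:

```python
import numpy as np

def ncon(tensors, labels):
    # Illustrative re-implementation: map integer labels to einsum letters.
    symbols = {}
    def sym(l):
        return symbols.setdefault(l, chr(ord('a') + len(symbols)))
    inputs = [''.join(sym(l) for l in lab) for lab in labels]
    free = sorted({l for lab in labels for l in lab if l < 0}, reverse=True)
    out = ''.join(sym(l) for l in free)   # output ordered -1, -2, ...
    return np.einsum(','.join(inputs) + '->' + out, *tensors)

A, B = np.random.rand(3, 4), np.random.rand(4, 5)
# Contract shared label 1, keep -1 and -2 free: an ordinary matrix product.
print(np.allclose(ncon([A, B], [[-1, 1], [1, -2]]), A @ B))
```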
Holographic quantum error-correcting codes: toy models for the bulk/boundary correspondence
TLDR
That bulk logical operators can be represented on multiple boundary regions mimics the Rindler-wedge reconstruction of bulk operators from boundary operators, realizing explicitly the quantum error-correcting features of AdS/CFT recently proposed in [1].
Characterizing quantum supremacy in near-term devices
A critical question for quantum computing in the near future is whether quantum devices without error correction can perform a well-defined computational task beyond the capabilities of state-of-the-art classical computers, achieving so-called quantum supremacy.
...