Corpus ID: 243985709

A Hierarchy for Replica Quantum Advantage

@article{Chen2021AHF,
  title={A Hierarchy for Replica Quantum Advantage},
  author={Sitan Chen and Jordan S. Cotler and Hsin-Yuan Huang and Jerry Zheng Li},
  journal={ArXiv},
  year={2021},
  volume={abs/2111.05874}
}
Sitan Chen, Jordan Cotler, Hsin-Yuan Huang, and Jerry Li
Affiliations: Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, Berkeley, CA, USA; Simons Institute for the Theory of Computing, Berkeley, CA, USA; Society of Fellows, Harvard University, Cambridge, MA, USA; Black Hole Initiative, Harvard University, Cambridge, MA, USA; Institute for Quantum Information and Matter, Caltech, Pasadena, CA, USA; Department of Computing and Mathematical Sciences…

Citations

The Complexity of NISQ

This work defines and studies the complexity class NISQ, which is intended to encapsulate problems that can be efficiently solved by a classical computer with access to a NISQ device, and considers the power of NISQ for three well-studied problems.

Quantum advantage in learning from experiments

This research presents a probabilistic model of the black hole that combines quantum entanglement and superposition to describe the fabric of space-time.

Learning Quantum Processes and Hamiltonians via the Pauli Transfer Matrix

Learning about physical systems from quantum-enhanced experiments, relying on a quantum memory and quantum processing, can outperform learning from experiments in which only classical memory and classical processing are available.

Unitary property testing lower bounds by polynomials

A generalized polynomial method for unitary property testing problems, leveraging connections with invariant theory, is applied to obtain lower bounds on problems such as determining recurrence times of unitaries, approximating the dimension of a marked subspace, and approximating the entanglement entropy of a marked state.

Lower bounds for learning quantum states with single-copy measurements

In the case of adaptive, single-copy measurements implementable with polynomial-size circuits, this rigorously establishes the optimality of the folklore “Pauli tomography” algorithm in terms of its sample complexity.

Information-theoretic Hardness of Out-of-time-order Correlators

The results provide a theoretical foundation for novel applications of OTOCs in quantum simulations and elucidate a general definition of time-ordered versus out-of-time-order experimental measurement protocols, which can be considered as classes of adaptive quantum learning algorithms.

Tight Bounds for State Tomography with Incoherent Measurements

This work fully resolves the question of whether or not this rate is tight for incoherent measurements, by showing that any protocol using incoherent measurements, even if they are chosen adaptively, requires Ω(d³/ε²) copies, matching the upper bound of [KRT17].

Tight Bounds for Quantum State Certification with Incoherent Measurements

The copy complexity of mixedness testing with incoherent measurements is settled: $\Omega(d^{3/2}/\varepsilon^{2})$ copies are shown to be necessary, and the instance-optimal bounds for state certification with respect to a general $\sigma$, first derived in [7] for non-adaptive measurements, are shown to hold for arbitrary incoherent measurements.

Challenges and opportunities in quantum machine learning

Current methods and applications for quantum machine learning are reviewed, including differences between quantum and classical machine learning, with a focus on quantum neural networks and quantum deep learning.

References


Measuring the Rényi entropy of a two-site Fermi-Hubbard model on a trapped ion quantum computer

This scalable measurement of entanglement on a universal quantum computer will, with more qubits, provide insights into many-body quantum systems that are impossible to simulate on classical computers.

Shadow tomography of quantum states

  • S. Aaronson, Electron. Colloquium Comput. Complex., 2017
Surprisingly, this work gives a procedure that solves the problem of shadow tomography by measuring only O(ε⁻⁵ · log⁴ M · log D) copies, which means, for example, that the authors can learn the behavior of an arbitrary n-qubit state, on *all* accepting/rejecting circuits of some fixed polynomial size, by measuring only n^{O(1)} copies of the state.

Improved Quantum data analysis

A quantum "Threshold Search" algorithm that requires only O((log² m)/ε²) samples of a d-dimensional state ρ, and a Shadow Tomography algorithm that simultaneously achieves the best known dependence on each parameter m, d, ε, yielding the same sample complexity for quantum Hypothesis Selection.

Quantum Computing in the NISQ era and beyond

Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future, and 100-qubit quantum computers will not change the world right away, but they should be regarded as a significant step toward the more powerful quantum technologies of the future.

Local random quantum circuits are approximate polynomial-designs: numerical results

We numerically investigate the statement that local random quantum circuits acting on n qubits, composed of polynomially many nearest-neighbor two-qubit gates, form an approximate unitary polynomial-design.

Quantum Virtual Cooling

We propose a quantum information based scheme to reduce the temperature of quantum many-body systems, and access regimes beyond the current capability of conventional cooling techniques. We show that measurements performed on multiple copies of a quantum system at a given temperature can simulate measurements of the same system at a lower temperature.

Entanglement is Necessary for Optimal Quantum Property Testing

It is shown that with independent measurements, $\Omega(d^{4/3}/\epsilon^{2})$ is necessary, even if the measurements are chosen adaptively, which resolves a question posed in [7].

Exponential Separations Between Learning With and Without Quantum Memory

It is proved that to estimate absolute values of all $n$-qubit Pauli observables, algorithms with $k < n$ qubits of quantum memory require at least $\Omega(2^{(n-k)/3})$ samples, but there is an algorithm using $n$ qubits of quantum memory which only requires $\mathcal{O}(n)$ samples.

Quantum algorithmic measurement

There has been recent promising experimental and theoretical evidence that quantum computational tools might enhance the precision and efficiency of physical experiments. However, a systematic theoretical framework for such experiments has been lacking.

Quantum thermalization through entanglement in an isolated many-body system

Microscopy of an evolving quantum system indicates that the full quantum state remains pure while thermalization occurs on a local scale, with entanglement creating local entropy that validates the use of statistical physics for local observables.