Corpus ID: 222132883

# No quantum speedup over gradient descent for non-smooth convex optimization

@article{Garg2020NoQS,
  title={No quantum speedup over gradient descent for non-smooth convex optimization},
  author={Ankit Garg and Robin Kothari and Praneeth Netrapalli and Suhail Sherif},
  journal={ArXiv},
  year={2020},
  volume={abs/2010.01801}
}
• Ankit Garg, Robin Kothari, Praneeth Netrapalli, Suhail Sherif
• Published 2020
• Mathematics, Computer Science, Physics
• ArXiv
• We study the first-order convex optimization problem, where we have black-box access to a (not necessarily smooth) function $f:\mathbb{R}^n \to \mathbb{R}$ and its (sub)gradient. Our goal is to find an $\epsilon$-approximate minimum of $f$ starting from a point that is distance at most $R$ from the true minimum. If $f$ is $G$-Lipschitz, then the classic gradient descent algorithm solves this problem with $O((GR/\epsilon)^{2})$ queries. Importantly, the number of queries is independent of the…
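The $O((GR/\epsilon)^2)$ query bound in the abstract is the classical rate of the averaged subgradient method. A minimal sketch of that method is below; the function name, step-size choice $\eta = R/(G\sqrt{T})$, and the toy example are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def subgradient_descent(subgrad, x0, G, R, eps):
    """Averaged subgradient method for a G-Lipschitz convex f,
    started within distance R of a minimizer (illustrative sketch)."""
    T = int(np.ceil((G * R / eps) ** 2))   # O((GR/eps)^2) subgradient queries
    eta = R / (G * np.sqrt(T))             # fixed step size (a standard choice)
    x = np.asarray(x0, dtype=float).copy()
    xbar = np.zeros_like(x)
    for _ in range(T):
        xbar += x / T                      # running average of the iterates
        x = x - eta * subgrad(x)           # one (sub)gradient query per step
    return xbar                            # f(xbar) - f* <= G*R/sqrt(T) <= eps

# Hypothetical example: f(x) = |x| is 1-Lipschitz with subgradient sign(x);
# starting at x0 = 1.0 (so R = 1) with target accuracy eps = 0.1:
xbar = subgradient_descent(np.sign, [1.0], G=1.0, R=1.0, eps=0.1)
```

The averaging step matters for non-smooth $f$: the last iterate can oscillate around the minimizer, while the average of the iterates satisfies the stated suboptimality guarantee.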
