# Sublinear Least-Squares Value Iteration via Locality Sensitive Hashing

```bibtex
@article{Shrivastava2021SublinearLV,
  title   = {Sublinear Least-Squares Value Iteration via Locality Sensitive Hashing},
  author  = {Anshumali Shrivastava and Zhao Song and Zhaozhuo Xu},
  journal = {ArXiv},
  year    = {2021},
  volume  = {abs/2105.08285}
}
```

We present the first provable Least-Squares Value Iteration (LSVI) algorithms that achieve runtime complexity sublinear in the number of actions. We formulate the value function estimation procedure in value iteration as an approximate maximum inner product search problem and propose a locality sensitive hashing (LSH) [Indyk and Motwani, STOC '98; Andoni and Razenshteyn, STOC '15; Andoni, Laarhoven, Razenshteyn and Waingarten, SODA '17] type data structure to solve this problem with sublinear time…
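To make the reduction concrete, here is a minimal sketch (all names, dimensions, and data below are illustrative, not from the paper) of how the greedy step of value iteration with linear function approximation becomes a maximum inner product search over per-action feature vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
n_actions, d = 10_000, 16
phi = rng.normal(size=(n_actions, d))   # hypothetical per-action feature vectors
w = rng.normal(size=d)                  # current least-squares weight estimate

# Value iteration's inner loop reduces to a MIPS query:
# a* = argmax_a <phi[a], w>. The linear scan below costs O(n_actions * d);
# the paper's LSH-type data structure answers the same query approximately
# in time sublinear in n_actions.
a_star = int(np.argmax(phi @ w))
print(a_star, float(phi[a_star] @ w))
```

Replacing the scan with an approximate MIPS index is exactly what makes the per-iteration cost sublinear in the number of actions.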

## 7 Citations

### Breaking the Linear Iteration Cost Barrier for Some Well-known Conditional Gradient Methods Using MaxIP Data-structures

- Computer Science, NeurIPS
- 2021

This work provides a formal framework for combining locality sensitive hashing type approximate MaxIP data structures with CGM algorithms, and gives the first algorithms whose cost per iteration is sublinear in the number of parameters for several fundamental optimization methods, e.g., Frank-Wolfe, the Herding algorithm, and policy gradient.

### Sublinear Time Algorithm for Online Weighted Bipartite Matching

- Computer Science, ArXiv
- 2022

This work provides the theoretical foundation for computing the weights approximately and shows that, with the proposed randomized data structures, the weights can be computed in sublinear time while still preserving the competitive ratio of the matching algorithm.

### A Dynamic Fast Gaussian Transform

- Computer Science, ArXiv
- 2022

The main result is an efficient dynamic FGT algorithm supporting the following operations in log(n/ε) time: adding or deleting a source point, and estimating the “kernel density” of a query point with respect to the sources with ε additive accuracy.

### Fast Distance Oracles for Any Symmetric Norm

- Computer Science, ArXiv
- 2022

The main contribution is a fast (1 + ε) distance oracle for any symmetric norm ‖·‖_l, which includes lp norms and Orlicz norms as special cases, as well as other norms used in practice, e.g. top-k norms, max-mixtures and sum-mixtures of lp norms, small-support norms, and the box norm.

### Dynamic Maintenance of Kernel Density Estimation Data Structure: From Practice to Theory

- Computer Science, ArXiv
- 2022

This work focuses on the dynamic maintenance of KDE data structures with robustness to adversarial queries, and provides a theoretical framework of Plasma data structures that supports the dynamic update of the dataset in sublinear time.

### A Sublinear Adversarial Training Algorithm

- Computer Science, ArXiv
- 2022

This paper analyzes the convergence guarantee of the adversarial training procedure on a two-layer neural network with shifted ReLU activation, shows that only o(m) neurons are activated for each input per iteration, and develops an adversarial training algorithm with o(mnd) time cost per iteration by applying a half-space reporting data structure.

### Accelerating Frank-Wolfe Algorithm using Low-Dimensional and Adaptive Data Structures

- Computer Science, ArXiv
- 2022

This paper develops and employs two novel inner product search data structures that improve upon the prior fastest algorithm (NeurIPS 2021), speeding up the Frank-Wolfe family of optimization algorithms.

## References

Showing 1–10 of 95 references

### Asymmetric LSH (ALSH) for Sublinear Time Maximum Inner Product Search (MIPS)

- Computer Science, NIPS
- 2014

We present the first provably sublinear time algorithm for approximate *Maximum Inner Product Search* (MIPS). Our proposal is also the first hashing algorithm for searching with (un-normalized)…
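The asymmetric transform behind this reduction (MIPS to Euclidean near-neighbor search) can be sketched as follows. The parameters `m`, `U` and the data are illustrative choices; a real system would feed the transformed points into an l2 LSH index rather than scan them linearly:

```python
import numpy as np

def alsh_preprocess(X, m=3, U=0.83):
    """P-transform sketch: scale the data so the max norm is at most U < 1,
    then append ||x||^2, ||x||^4, ..., ||x||^(2^m) to each vector."""
    X = X * (U / np.linalg.norm(X, axis=1).max())
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    tails = np.concatenate([norms ** (2 ** (i + 1)) for i in range(m)], axis=1)
    return np.concatenate([X, tails], axis=1)

def alsh_query(q, m=3):
    """Q-transform sketch: normalize the query and pad with 1/2 entries."""
    q = q / np.linalg.norm(q)
    return np.concatenate([q, np.full(m, 0.5)])

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 32))
q = rng.normal(size=32)

P, Q = alsh_preprocess(X), alsh_query(q)
# ||P(x) - Q(q)||^2 = 1 + m/4 + ||x||^(2^(m+1)) - 2<x, q/||q||>, and the
# last norm term vanishes as m grows, so minimizing Euclidean distance
# approximately maximizes the inner product.
nn = int(np.argmin(np.linalg.norm(P - Q, axis=1)))
mips = int(np.argmax(X @ q))
print(nn, mips)  # often equal; the reduction is approximate
```

With the transform in place, any off-the-shelf Euclidean LSH family answers MIPS queries in sublinear time.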

### Linear Bandit Algorithms with Sublinear Time Complexity

- Computer Science, ICML
- 2022

Two linear bandit algorithms with per-step complexity sublinear in the number of arms K are proposed; they deliver more than a 72× speedup over linear-time baselines while retaining similar regret.

### LSH Forest: Practical Algorithms Made Theoretical

- Computer Science, SODA
- 2017

The end result is the first instance of a simple, practical algorithm that provably leverages data-dependent hashing to improve upon data-oblivious LSH, and is provably better than the best LSH algorithm for the Hamming space.

### High-dimensional similarity search and sketching: algorithms and hardness

- Computer Science
- 2017

An algorithm for the ANN problem over the l1 and l2 distances that improves upon the Locality-Sensitive Hashing framework and establishes the equivalence between the existence of short and accurate sketches and good embeddings into lp spaces for 0 < p ≤ 2.

### Oblivious Sketching-based Central Path Method for Solving Linear Programming Problems

- Computer Science, Mathematics
- 2020

This work proposes a sketching-based central path method for solving linear programs whose running time matches the state-of-the-art results of Cohen et al. (2019b) but can use sparse sketching matrices (Nelson & Nguyên, 2013) to speed up the online matrix-vector multiplication.

### Locality-sensitive hashing scheme based on p-stable distributions

- Computer Science, SCG '04
- 2004

A novel Locality-Sensitive Hashing scheme for the Approximate Nearest Neighbor Problem under the lp norm, based on p-stable distributions, that improves the running time of the earlier algorithm and yields the first known provably efficient approximate NN algorithm for the case p < 1.
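A single p-stable hash function for the l2 case (p = 2, exploiting the 2-stability of the Gaussian distribution) can be sketched in a few lines; the bucket width `r` and the dimensions below are illustrative choices, not prescribed values:

```python
import numpy as np

rng = np.random.default_rng(2)
d, r = 64, 4.0                        # dimension and bucket width (tunable)

a = rng.normal(size=d)                # 2-stable (Gaussian) projection direction
b = rng.uniform(0, r)                 # random offset within one bucket

def h(v):
    """One p-stable hash for the l2 norm: project, shift, quantize."""
    return int(np.floor((a @ v + b) / r))

x = rng.normal(size=d)
near = x + 0.01 * rng.normal(size=d)  # a point very close to x
# Nearby points collide with high probability; a full scheme concatenates
# many such functions into one key and uses several independent tables.
print(h(x), h(near))
```

The collision probability decays with the l2 distance between the points, which is exactly the locality-sensitivity property the ANN reduction needs.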

### Norm-Ranging LSH for Maximum Inner Product Search

- Computer Science, NeurIPS
- 2018

It is proved that Norm-Ranging LSH achieves lower query time complexity than Simple-LSH under mild conditions, and that the idea of dataset partitioning can improve another hashing-based MIPS algorithm.

### MONGOOSE: A Learnable LSH Framework for Efficient Neural Network Training

- Computer Science, ICLR
- 2021

MONGOOSE is equipped with a scheduling algorithm to adaptively perform LSH updates with provable guarantees and learnable hash functions to improve query efficiency and is validated on large-scale deep learning models for recommendation systems and language modeling.

### Practical and Optimal LSH for Angular Distance

- Computer Science, NIPS
- 2015

This work shows the existence of a Locality-Sensitive Hashing (LSH) family for the angular distance that yields an approximate Near Neighbor Search algorithm with the asymptotically optimal running time exponent and establishes a fine-grained lower bound for the quality of any LSH family for angular distance.

### Improved Asymmetric Locality Sensitive Hashing (ALSH) for Maximum Inner Product Search (MIPS)

- Computer Science, UAI
- 2015

Theoretical analysis and experimental evaluations show that the new scheme is significantly better than the original scheme for MIPS, and that MIPS can be solved efficiently using signed random projections.
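Signed random projections (SimHash) admit a very short sketch; the dimensions and the number of hyperplanes below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
d, k = 32, 64                          # input dimension, number of hyperplanes

H = rng.normal(size=(k, d))            # random hyperplane normals

def simhash(v):
    """Signed random projections: one bit per hyperplane side."""
    return (H @ v > 0).astype(np.uint8)

u = rng.normal(size=d)
v = u + 0.1 * rng.normal(size=d)       # small angle to u
w = -u                                 # opposite direction, angle pi

# The Hamming distance between sketches estimates the angle:
# E[fraction of mismatched bits] = theta(u, v) / pi.
print(np.mean(simhash(u) != simhash(v)), np.mean(simhash(u) != simhash(w)))
```

Concatenating and bucketing these bits yields an LSH family for angular distance, which asymmetric transforms then repurpose for inner product search.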