Corpus ID: 6231988

Using More Data to Speed-up Training Time

@article{ShalevShwartz2012UsingMD,
  title={Using More Data to Speed-up Training Time},
  author={Shai Shalev-Shwartz and Ohad Shamir and Eran Tromer},
  journal={ArXiv},
  year={2012},
  volume={abs/1106.1216}
}
In many recent applications, data is plentiful. By now, we have a rather clear understanding of how more data can be used to improve the accuracy of learning algorithms. Recently, there has been a growing interest in understanding how more data can be leveraged to reduce the required training runtime. In this paper, we study the runtime of learning as a function of the number of available training examples, and underscore the main high-level techniques. We provide some initial positive results…
29 Citations
  • More data speeds up training time in learning halfspaces over sparse vectors
  • Computational Trade-offs in Statistical Learning
  • Improper Deep Kernels
  • The computational power of optimization in online learning
  • On statistics, computation and scalability
  • Open Problem: The Statistical Query Complexity of Learning Sparse Halfspaces
  • On-Device Machine Learning: An Algorithms and Learning Theory Perspective

References

Showing 1-10 of 29 references
  • Efficient Learning of Linear Perceptrons
  • Computational sample complexity and attribute-efficient learning
  • Computational sample complexity
  • On the generalization ability of on-line learning algorithms
  • Computational Sample Complexity and Attribute-Efficient Learning
  • Toward efficient agnostic learning
  • Newtron: an Efficient Bandit algorithm for Online Multiclass Prediction
  • Learning Kernel-Based Halfspaces with the Zero-One Loss
  • Computational limitations on learning from examples