Machine Learning Theory Seminar

Abstract

1 Deterministic Optimization

For brevity, functions that are differentiable everywhere will be called smooth, and functions that are not differentiable everywhere will be called nonsmooth. First, we extend the Lipschitz continuity definition (Definition 6.4) to higher dimensions.
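As a point of reference, one standard higher-dimensional form of the Lipschitz condition is sketched below in LaTeX. Since Definition 6.4 is not reproduced here, the choice of the Euclidean norm and the scalar-valued codomain are assumptions made only for concreteness.

% Sketch: Lipschitz continuity for f : R^d -> R, assuming the Euclidean norm
\[
|f(x) - f(y)| \;\le\; L \,\|x - y\|_2 \qquad \text{for all } x, y \in \mathbb{R}^d,
\]
where $L \ge 0$ is the Lipschitz constant. In particular, if $f$ is smooth in the sense above and $\|\nabla f(x)\|_2 \le L$ for all $x$, then $f$ is $L$-Lipschitz; this follows from the mean value theorem.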

